A Cold Start Context-Aware Recommender System for Tour Planning Using Artificial Neural Network and Case Based Reasoning
Table 2
Algorithms for training a backpropagation multilayer perceptron neural network.
Training algorithm
Description
GDX
It updates the weight and bias values according to gradient descent with an adaptive learning rate, tracking the current trend in the error surface through a momentum term: $\Delta w_k = m_c\,\Delta w_{k-1} - \eta_k\,\partial E/\partial w_k$, where $m_c$ is the momentum constant and $\eta_k$ is the adaptive learning rate.
RP
It considers only the sign of the partial derivative $\partial E/\partial w_{ij}$ to determine the direction of the weight update and multiplies it by the per-weight step size $\Delta_{ij}$: $\Delta w_{ij} = -\operatorname{sign}\!\left(\partial E/\partial w_{ij}\right)\Delta_{ij}$.
CGF
The vector of weight changes is computed as $\Delta w_k = \alpha_k p_k$, where $p_k$ is the search direction, computed as $p_k = -g_k + \beta_k p_{k-1}$ (with $p_0 = -g_0$), and the parameter $\beta_k$ is computed as the Fletcher–Reeves ratio $\beta_k = \frac{g_k^{T} g_k}{g_{k-1}^{T} g_{k-1}}$.
CGP
The vector of weight changes is computed as $\Delta w_k = \alpha_k p_k$, where $p_k$ is the search direction, computed as $p_k = -g_k + \beta_k p_{k-1}$, and the parameter $\beta_k$ is computed as the Polak–Ribiére ratio $\beta_k = \frac{\Delta g_{k-1}^{T} g_k}{g_{k-1}^{T} g_{k-1}}$, where $\Delta g_{k-1} = g_k - g_{k-1}$.
CGB
The search direction is reset to the negative of the gradient only if there is very little orthogonality left between the present gradient and the past gradient, as expressed by the condition $\left| g_{k-1}^{T} g_k \right| \ge 0.2\,\lVert g_k \rVert^{2}$ (Powell–Beale restarts).
SCG
It denotes the quadratic approximation to the error $E$ in a neighborhood of a point $w$ by $E_{qw}(y) = E(w) + E'(w)^{T} y + \frac{1}{2}\, y^{T} E''(w)\, y$. To minimize $E_{qw}(y)$, the critical points of $E_{qw}(y)$ are found as the solution to the following linear system: $E''(w)\, y = -E'(w)$.
BFG
The vector of weight changes is computed as $\Delta w_k = -H_k^{-1} g_k$, where $H_k$ is the Hessian (second-derivatives) matrix, approximated by the BFGS update as $H_{k+1} = H_k + \frac{y_k y_k^{T}}{y_k^{T} s_k} - \frac{H_k s_k s_k^{T} H_k}{s_k^{T} H_k s_k}$, with $s_k = w_{k+1} - w_k$ and $y_k = g_{k+1} - g_k$.
OSS
It generates a sequence of matrices $G_k$ that represents increasingly accurate approximations to the inverse Hessian ($H^{-1}$). The update expression uses only the first-derivative information of $E$ as follows: $G_{k+1} = G_k + \left(1 + \frac{\Delta g_k^{T} G_k \Delta g_k}{\Delta w_k^{T} \Delta g_k}\right) \frac{\Delta w_k \Delta w_k^{T}}{\Delta w_k^{T} \Delta g_k} - \frac{\Delta w_k \Delta g_k^{T} G_k + G_k \Delta g_k \Delta w_k^{T}}{\Delta w_k^{T} \Delta g_k}$, where $\Delta w_k = w_{k+1} - w_k$, $\Delta g_k = g_{k+1} - g_k$, $g$ is the gradient of $E$, and $f$ is the transfer function. It does not store the complete Hessian matrix and assumes that the previous Hessian was the identity matrix at each iteration.
LM
The vector of weight changes is computed as $\Delta w_k = -\left(J^{T} J + \mu I\right)^{-1} J^{T} e$, where the combination coefficient $\mu$ is always positive, $I$ is the identity matrix, $J$ is the Jacobian matrix (first derivatives), and $e$ is the vector of network errors.
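To make the sign-based RP (resilient backpropagation) rule concrete, the following is a minimal sketch in Python with NumPy. The per-weight step size grows when consecutive gradients agree in sign and shrinks when they disagree; the toy error surface and the factors 1.2 / 0.5 and step bounds are illustrative assumptions, not settings from this paper.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
               step_max=50.0, step_min=1e-6):
    """One RP update: only the sign of dE/dw sets the direction;
    the per-weight step size adapts to sign agreement."""
    agree = grad * prev_grad
    step = np.where(agree > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(agree < 0, np.maximum(step * eta_minus, step_min), step)
    return w - np.sign(grad) * step, step   # dw_ij = -sign(dE/dw_ij) * step_ij

# Toy error surface E(w) = 0.5 * ||w||^2, so the gradient is simply w.
w = np.array([4.0, -3.0])
step = np.full_like(w, 0.1)
prev_grad = np.zeros_like(w)
for _ in range(100):
    grad = w.copy()                       # stand-in for dE/dw
    w, step = rprop_step(w, grad, prev_grad, step)
    prev_grad = grad
print(w)   # oscillates toward the minimum at the origin
```

Because only signs are used, the update is insensitive to the magnitude of the gradient, which is the property the table row describes.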
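The CGF and CGP rows differ only in how $\beta_k$ is computed. A small sketch on a two-parameter quadratic error (an illustrative stand-in for the network error, with exact line search, which is not how the toolbox implementations pick $\alpha_k$) shows both variants reaching the same minimizer:

```python
import numpy as np

# Toy quadratic error E(w) = 0.5 w^T A w - b^T w, gradient g = A w - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

def beta_fletcher_reeves(g_new, g_old):          # CGF
    return (g_new @ g_new) / (g_old @ g_old)

def beta_polak_ribiere(g_new, g_old):            # CGP
    return ((g_new - g_old) @ g_new) / (g_old @ g_old)

def cg_minimize(beta_fn, iters=2):
    w = np.zeros(2)
    g = A @ w - b
    p = -g                                        # p_0 = -g_0
    for _ in range(iters):
        alpha = -(g @ p) / (p @ A @ p)            # exact line search on a quadratic
        w = w + alpha * p
        g_new = A @ w - b
        p = -g_new + beta_fn(g_new, g) * p        # p_k = -g_k + beta_k p_{k-1}
        g = g_new
    return w

w_star = np.linalg.solve(A, b)                    # true minimizer
w_fr = cg_minimize(beta_fletcher_reeves)
w_pr = cg_minimize(beta_polak_ribiere)
```

For an exact quadratic the two $\beta_k$ formulas coincide and conjugate gradient terminates in at most two iterations here; on the non-quadratic error surface of a trained network they generally behave differently.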