Gradient descent method

Gradient descent, also known as the steepest descent method, is one of the most commonly used methods for unconstrained optimization. Gradient descent is an iterative algorithm; each step requires computing the gradient vector of the objective function.

Suppose $f(x)$ is a function on $\mathbb{R}^n$ with continuous first-order partial derivatives. The unconstrained optimization problem to be solved is

$$\min_{x \in \mathbb{R}^n} f(x)$$

where $x^*$ denotes the minimum point of the objective function $f(x)$.

As mentioned, gradient descent is an iterative algorithm. One selects an appropriate initial value $x^{(0)}$ and iterates repeatedly, updating the value of $x$ so as to minimize the objective function, until convergence. Since the negative gradient direction is the direction in which the function value decreases fastest, at each iteration $x$ is updated in the negative gradient direction, thereby decreasing the function value.

Since $f(x)$ has continuous first-order partial derivatives, if the value at iteration $k$ is $x^{(k)}$, then $f(x)$ can be approximated by its first-order Taylor expansion around $x^{(k)}$:

$$f(x) \approx f(x^{(k)}) + g_k^T (x - x^{(k)})$$

Here, $g_k = g(x^{(k)}) = \nabla f(x^{(k)})$ is the gradient of $f(x)$ at $x^{(k)}$.
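As a rough illustration of this approximation, here is a minimal sketch comparing $f(x)$ with its first-order Taylor expansion around a point; the toy quadratic objective and the chosen points are my own assumptions for illustration, not part of the original post:

```python
import numpy as np

# Toy objective f(x) = x1^2 + 2*x2^2; this function and the chosen points
# are illustrative assumptions, not from the original post.
def f(x):
    return x[0]**2 + 2 * x[1]**2

def grad_f(x):
    return np.array([2 * x[0], 4 * x[1]])

x_k = np.array([1.0, 1.0])   # current iterate x^(k)
g_k = grad_f(x_k)            # gradient g_k at x^(k)

# First-order Taylor approximation: f(x) ~ f(x^(k)) + g_k^T (x - x^(k))
x = np.array([0.9, 0.95])
approx = f(x_k) + g_k @ (x - x_k)
print(f"f(x) = {f(x):.4f}, first-order approximation = {approx:.4f}")
```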

The value at iteration $k+1$, $x^{(k+1)}$, is then obtained as

$$x^{(k+1)} \leftarrow x^{(k)} + \lambda_k p_k$$

where $p_k$ is the search direction, taken to be the negative gradient direction $p_k = -\nabla f(x^{(k)})$, and $\lambda_k$ is the step size, determined by a one-dimensional line search, i.e., $\lambda_k$ is chosen such that

$$f(x^{(k)} + \lambda_k p_k) = \min_{\lambda \ge 0} f(x^{(k)} + \lambda p_k)$$
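As a rough illustration of this one-dimensional search, here is a minimal sketch that picks $\lambda_k$ from a fixed grid of candidate step sizes; the grid search and the toy quadratic objective are my own assumptions for illustration, not part of the original post:

```python
import numpy as np

def line_search(f, x_k, p_k, lambdas=np.linspace(0.0, 1.0, 101)):
    """Return the step size lambda_k minimizing f(x_k + lambda * p_k)
    over a fixed grid of candidate values (a crude one-dimensional search)."""
    values = [f(x_k + lam * p_k) for lam in lambdas]
    return lambdas[int(np.argmin(values))]

# Same toy quadratic as in the earlier sketch (an assumed example).
f = lambda x: x[0]**2 + 2 * x[1]**2
grad_f = lambda x: np.array([2 * x[0], 4 * x[1]])

x_k = np.array([1.0, 1.0])
p_k = -grad_f(x_k)                  # search direction: negative gradient
lam_k = line_search(f, x_k, p_k)    # step size from the 1-D search
x_next = x_k + lam_k * p_k          # x^(k+1) = x^(k) + lambda_k * p_k
print("lambda_k =", lam_k, "x^(k+1) =", x_next)
```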


The gradient descent algorithm is as follows:

Input: objective function $f(x)$, gradient function $g(x) = \nabla f(x)$, computation accuracy $\varepsilon$;

Output: the minimum point $x^*$ of $f(x)$.

(1) Take an initial value $x^{(0)} \in \mathbb{R}^n$, and set $k = 0$.

(2) Compute $f(x^{(k)})$.

(3) Compute the gradient $g_k = g(x^{(k)})$. If $\|g_k\| < \varepsilon$, stop the iteration and set $x^* = x^{(k)}$; otherwise, set $p_k = -g_k$ and find $\lambda_k$ such that

$$f(x^{(k)} + \lambda_k p_k) = \min_{\lambda \ge 0} f(x^{(k)} + \lambda p_k)$$

(4) Set $x^{(k+1)} = x^{(k)} + \lambda_k p_k$ and compute $f(x^{(k+1)})$. If $\|f(x^{(k+1)}) - f(x^{(k)})\| < \varepsilon$ or $\|x^{(k+1)} - x^{(k)}\| < \varepsilon$, stop the iteration and set $x^* = x^{(k+1)}$.

(5) Otherwise, set $k = k + 1$ and go to step (3).
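Here is a minimal Python sketch of the full procedure above; the grid-based line search, the helper name gradient_descent, and the quadratic test function are my own illustrative assumptions rather than part of the original post:

```python
import numpy as np

def gradient_descent(f, grad, x0, eps=1e-6, max_iter=1000):
    """Steepest descent following steps (1)-(5) above, using a simple
    grid line search for the step size."""
    x = np.asarray(x0, dtype=float)          # step (1): initial value, k = 0
    for k in range(max_iter):
        g = grad(x)                          # step (3): gradient g_k
        if np.linalg.norm(g) < eps:          # stop when the gradient is small
            break
        p = -g                               # search direction p_k = -g_k
        lambdas = np.linspace(0.0, 1.0, 101) # candidate step sizes
        lam = lambdas[int(np.argmin([f(x + l * p) for l in lambdas]))]
        x_new = x + lam * p                  # step (4): x^(k+1) = x^(k) + lambda_k p_k
        if abs(f(x_new) - f(x)) < eps or np.linalg.norm(x_new - x) < eps:
            x = x_new
            break
        x = x_new                            # step (5): k <- k + 1, go back to step (3)
    return x

# Usage with the toy quadratic from the earlier sketches (assumed example).
f = lambda x: x[0]**2 + 2 * x[1]**2
grad_f = lambda x: np.array([2 * x[0], 4 * x[1]])
print("approximate minimum point:", gradient_descent(f, grad_f, x0=[1.0, 1.0]))
```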

 
