What is the difference between Newton's method and gradient descent method?


1. Newton's method

Newton's Method is an iterative numerical method for finding a root (zero) of a function or for minimizing a function: it repeatedly refines an estimate, moving it closer to the root or minimum at each step. Newton's method is widely used in optimization and numerical-solution problems, including machine learning and deep learning.


The core idea of Newton's method is to use the function's local second-order information (the gradient and the Hessian matrix) to build a quadratic approximation around the current point and jump to that approximation's stationary point. Its basic steps are as follows:

  1. Choose an initial point: pick a starting point x_0.

  2. Compute the gradient and Hessian: evaluate the gradient vector ∇f(x_k) and the Hessian matrix H(x_k) (the matrix of second derivatives) of the objective function at the current point.

  3. Update the iterate: use the current point, its gradient, and its Hessian to compute the next iterate. The update formula is:

     x_{k+1} = x_k − H(x_k)^{-1} ∇f(x_k)

     Repeat steps 2–3 until the gradient norm falls below a tolerance or a maximum iteration count is reached.
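The steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production solver: the quadratic objective f(x) = (x0 − 1)² + 4(x1 + 2)² and all function names here are assumptions chosen for the example.

```python
import numpy as np

# Example objective (assumed for illustration): f(x) = (x0 - 1)^2 + 4*(x1 + 2)^2
def grad(x):
    """Gradient vector of the example objective."""
    return np.array([2.0 * (x[0] - 1.0), 8.0 * (x[1] + 2.0)])

def hessian(x):
    """Hessian matrix (second derivatives) of the example objective."""
    return np.array([[2.0, 0.0],
                     [0.0, 8.0]])

def newtons_method(x0, grad, hessian, tol=1e-8, max_iter=50):
    """Iterate x_{k+1} = x_k - H(x_k)^{-1} grad(x_k) until the gradient is small."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H * step = g rather than explicitly inverting the Hessian.
        x = x - np.linalg.solve(hessian(x), g)
    return x

x_star = newtons_method([5.0, 5.0], grad, hessian)
print(x_star)  # converges to the minimizer (1, -2); one step for a quadratic
```

Because the example objective is quadratic, the local quadratic model is exact and Newton's method lands on the minimizer in a single step; for general smooth functions it converges quadratically only near the solution.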


Origin blog.csdn.net/m0_47256162/article/details/132181634