The steepest descent method, LMS algorithm, RLS algorithm and their comparison

1. The steepest descent algorithm formula:

w(n+1) = w(n) - (μ/2)∇J(n) = w(n) + μ[p - R w(n)]

where R = E[x(n) xᵀ(n)], p = E[x(n) d(n)], and J(n) = E[e²(n)] is the mean-square-error cost. This is the true gradient; because the gradient formula contains mathematical expectations, it is not easy to compute in practice.

The LMS algorithm formula:

e(n) = d(n) - wᵀ(n) x(n)
w(n+1) = w(n) + μ e(n) x(n)

where the instantaneous gradient estimate, ∇̂J(n) = -2 e(n) x(n), replaces the true gradient. A minimal sketch of both update rules follows below.

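For concreteness, here is a minimal NumPy sketch of both update rules on a synthetic system-identification problem. The filter length, step size mu, and the generated signals are arbitrary choices for illustration, not taken from the original post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic system-identification setup: d(n) = w_opt^T x(n) + noise
M = 4                                    # filter length (arbitrary)
w_opt = rng.standard_normal(M)           # unknown "true" weights
N = 5000
x = rng.standard_normal(N)
# Tap-input vectors x(n) = [x(n), x(n-1), ..., x(n-M+1)]
X = np.array([x[n - M + 1:n + 1][::-1] for n in range(M - 1, N)])
d = X @ w_opt + 0.01 * rng.standard_normal(len(X))

# Second-order statistics used by the *true* gradient (steepest descent)
R = X.T @ X / len(X)                     # estimate of E[x(n) x(n)^T]
p = X.T @ d / len(X)                     # estimate of E[x(n) d(n)]

mu = 0.05                                # step size (arbitrary)

# Steepest descent: w(n+1) = w(n) + mu * (p - R w(n)), needs R and p
w_sd = np.zeros(M)
for _ in range(200):
    w_sd = w_sd + mu * (p - R @ w_sd)

# LMS: w(n+1) = w(n) + mu * e(n) * x(n), uses only the instantaneous gradient
w_lms = np.zeros(M)
for xn, dn in zip(X, d):
    e = dn - w_lms @ xn                  # a-priori error e(n)
    w_lms = w_lms + mu * e * xn

print("true weights    :", np.round(w_opt, 3))
print("steepest descent:", np.round(w_sd, 3))
print("LMS             :", np.round(w_lms, 3))
```

Both runs should end up near the same weight vector; the difference is only in what information each update rule is allowed to use.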
2. The search direction of the steepest descent algorithm is the negative direction of the true gradient, so every update step decreases the value of the objective function (hence "steepest descent"). The search direction of the LMS algorithm is the negative direction of the instantaneous gradient, so an individual update step is not guaranteed to decrease the objective function, but the overall trend is a decreasing objective value. A small numerical check is sketched below.

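One rough way to see this numerically (again only a sketch, with arbitrary dimensions, step size, and synthetic data) is to evaluate the whole-data MSE cost J(w) after every step of each algorithm and check whether it ever increases.

```python
import numpy as np

rng = np.random.default_rng(1)

# Small 2-tap example (arbitrary sizes) to compare the two search directions
M, N, mu = 2, 2000, 0.02
w_opt = np.array([1.0, -0.5])
X = rng.standard_normal((N, M))
d = X @ w_opt + 0.05 * rng.standard_normal(N)
R, p = X.T @ X / N, X.T @ d / N

def J(w):
    # Mean-squared-error cost evaluated over the whole data set
    return np.mean((d - X @ w) ** 2)

# Steepest descent: follows the negative true gradient of J
w_sd = np.zeros(M)
J_sd = [J(w_sd)]
for _ in range(100):
    w_sd = w_sd + mu * (p - R @ w_sd)
    J_sd.append(J(w_sd))

# LMS: follows the negative instantaneous gradient, one sample at a time
w_lms = np.zeros(M)
J_lms = [J(w_lms)]
for xn, dn in zip(X[:100], d[:100]):
    w_lms = w_lms + mu * (dn - w_lms @ xn) * xn
    J_lms.append(J(w_lms))

# Steepest descent: J never increases from one step to the next
print("SD monotone decrease:", all(b <= a + 1e-12 for a, b in zip(J_sd, J_sd[1:])))
# LMS: individual steps may raise J, but the start-to-end trend is downward
print("LMS steps that increased J:", sum(b > a for a, b in zip(J_lms, J_lms[1:])))
print("LMS J start/end:", round(J_lms[0], 4), round(J_lms[-1], 4))
```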
3. From the angle of algorithm requirements:
the steepest descent method requires the gradient vectors (search directions) at different times to be linearly independent;
the LMS algorithm requires the input signal vectors at different times to be linearly independent.

4. In the weight-vector update equation, the RLS algorithm has an extra iteratively updated matrix P(n) compared with the LMS algorithm. This matrix can be regarded as an approximate inverse of the second-derivative (Hessian) matrix of the cost function, so RLS converges faster than LMS, but the price is a computational load much larger than that of LMS. A sketch of the RLS recursion is given below.

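For reference, here is a minimal sketch of the standard exponentially weighted RLS recursion; the forgetting factor, initialisation constant, and synthetic data are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# System-identification setup (arbitrary sizes, same style as the sketches above)
M, N = 4, 500
w_opt = rng.standard_normal(M)
x = rng.standard_normal(N)
X = np.array([x[n - M + 1:n + 1][::-1] for n in range(M - 1, N)])
d = X @ w_opt + 0.01 * rng.standard_normal(len(X))

lam = 0.99                  # forgetting factor (arbitrary)
delta = 1e-2                # regularisation constant for P(0)

w = np.zeros(M)
P = np.eye(M) / delta       # P(0): initial guess of the inverse correlation matrix

for xn, dn in zip(X, d):
    # Gain vector k(n) = P(n-1) x(n) / (lam + x(n)^T P(n-1) x(n))
    Px = P @ xn
    k = Px / (lam + xn @ Px)
    # A-priori error and weight update: w(n) = w(n-1) + k(n) e(n)
    e = dn - w @ xn
    w = w + k * e
    # Update of P(n); it tracks the inverse of the input correlation matrix,
    # which is what gives RLS its fast, Newton-like convergence
    P = (P - np.outer(k, Px)) / lam

print("true weights:", np.round(w_opt, 3))
print("RLS estimate:", np.round(w, 3))
```

Each RLS step costs O(M^2) operations because of the P(n) update, versus O(M) for LMS, which is the computational price mentioned in point 4.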
Origin: blog.csdn.net/qq_42005540/article/details/108410296