
What is the LMS algorithm?

The LMS (least mean square) algorithm is an adaptive algorithm that minimizes the mean square error between the desired output and the actual filter output.

The LMS algorithm was proposed in 1960 by B. Widrow and M. E. Hoff of Stanford University in the United States during their research on adaptive theory. Because it is easy to implement, it quickly came into wide use and became a standard algorithm for adaptive filtering.

In filter optimization design, a cost function or performance index is minimized to measure the quality of the filter. The most commonly used index is the mean square error; this way of measuring filter quality is called the mean square error criterion.

The LMS algorithm is based on the minimum mean square error criterion and the shape of the mean square error surface: at each step, the weight vector is updated along the direction of steepest descent on the error surface, that is, iteratively along the negative gradient of the objective function.

Since the mean square error surface has a single unique minimum, as long as the convergence step size is chosen appropriately, the algorithm will eventually converge to the minimum point of the error surface, or into a small neighborhood of it, no matter where the initial weight vector starts.
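The steepest-descent idea above can be sketched in a few lines. Everything here (the data, the step size, the number of iterations) is an illustrative assumption, not part of the original text; the point is that descending the negative gradient of the mean square error reaches the unique minimum from a deliberately distant starting point.

```python
# Minimal sketch: steepest descent on the mean square error (MSE) surface.
# The data, true weights, and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))        # input vectors, one per row
w_true = np.array([0.5, -1.0])           # "unknown" filter we want to recover
d = X @ w_true                           # desired output (noise-free for clarity)

def mse_gradient(w):
    # Gradient of J(w) = mean((d - X w)^2), estimated over the whole data set
    e = d - X @ w
    return -2.0 * X.T @ e / len(d)

w = np.array([5.0, 5.0])                 # deliberately distant initial weights
mu = 0.1                                 # step size (must be small enough)
for _ in range(500):
    w = w - mu * mse_gradient(w)         # move along the negative gradient

print(np.allclose(w, w_true, atol=1e-3))  # converged to the unique minimum
```

Because the MSE surface is a quadratic bowl with one minimum, the choice of starting point only affects how long convergence takes, not where it ends up.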


What are the steps of the LMS algorithm?

LMS algorithm steps:

1. Set variables and parameters: X(n) is the input vector (training sample), W(n) is the weight vector, e(n) is the error, d(n) is the desired output, y(n) is the actual output, η is the learning rate, and n is the iteration count.
2. Initialization: assign each component of W(0) a small random non-zero value, and let n = 0.
3. For an input sample X(n) and its corresponding desired output d(n), compute:
   e(n) = d(n) − X^T(n)W(n)
   W(n+1) = W(n) + ηX(n)e(n)
4. Check whether the stopping condition is met. If it is, the algorithm ends; otherwise, increase n by 1 and return to step 3.
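The steps above can be sketched as a short system-identification example. The training data, tap count, and learning rate are assumptions made for illustration; the update lines follow the error and weight-update formulas from the steps.

```python
# A runnable sketch of the LMS steps above; the data, number of taps,
# and learning rate η are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_taps = 4
w_true = rng.standard_normal(n_taps)      # unknown system to identify
X = rng.standard_normal((2000, n_taps))   # training samples X(n)
d = X @ w_true                            # desired outputs d(n)

eta = 0.05                                # learning rate η
w = np.zeros(n_taps)                      # step 2: initialize W(0) (zeros for simplicity)
for x_n, d_n in zip(X, d):                # steps 3-4: iterate over samples
    e_n = d_n - x_n @ w                   # e(n) = d(n) - X^T(n)W(n)
    w = w + eta * x_n * e_n               # W(n+1) = W(n) + ηX(n)e(n)

print(np.allclose(w, w_true, atol=1e-3))  # weights have converged to w_true
```

Note that each update uses only the current sample, which is what makes LMS cheap enough for real-time adaptive filtering.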

What is the LMS algorithm and what is its full name?

The least mean square (LMS) algorithm, proposed by Widrow and Hoff in 1959, played a major role in the development of adaptive techniques. Because the LMS algorithm is simple and easy to implement, it is still widely used today.

Considerable research has been done on the performance of the LMS algorithm and its improved variants, and it remains an important research topic today. Further work studies the algorithm's performance under non-stationary and correlated inputs.

When the eigenvalues of the input correlation matrix are widely dispersed, the convergence of the LMS algorithm deteriorates.
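The eigenvalue spread mentioned here is easy to compute. The two input processes below (white noise versus a strongly cross-correlated pair) are assumptions chosen for illustration; the sketch shows that correlated inputs produce a much larger ratio λ_max/λ_min of the correlation matrix, which is the regime where LMS converges slowly.

```python
# Illustrative sketch: eigenvalue spread of the input correlation matrix.
# A large ratio lambda_max/lambda_min slows LMS convergence; the two input
# processes below are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(1)

def eigenvalue_spread(X):
    R = X.T @ X / len(X)                  # estimated correlation matrix R
    lam = np.linalg.eigvalsh(R)           # eigenvalues in ascending order
    return lam[-1] / lam[0]               # lambda_max / lambda_min

white = rng.standard_normal((5000, 2))            # uncorrelated inputs
mix = np.array([[1.0, 0.95], [0.95, 1.0]])        # strong cross-correlation
correlated = white @ np.linalg.cholesky(mix).T    # correlated inputs

print(eigenvalue_spread(white) < eigenvalue_spread(correlated))  # True
```

For the white input the spread is close to 1 (all error-surface axes shrink at the same rate); for the correlated input it is large, so one direction of the error surface converges far more slowly than the other.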


Reprinted from: blog.csdn.net/aifamao2/article/details/127362567