1. What is matrix decomposition?
Matrix decomposition is commonly divided into two kinds: eigenvalue decomposition and singular value decomposition (SVD).
Eigenvalue decomposition: restricted to square matrices. It is based on Av = λv, where v is an eigenvector of A; when this equation holds, λ is the eigenvalue corresponding to the eigenvector v, and each eigenvalue is paired with its own eigenvector.
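As a minimal sketch, the defining equation Av = λv can be checked with NumPy; the matrix values here are made up purely for illustration:

```python
import numpy as np

# Eigenvalue decomposition only applies to square matrices.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of eigvecs are the eigenvectors v; eigvals holds the matching λ.
eigvals, eigvecs = np.linalg.eig(A)

# Check A v = λ v for each eigenpair.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```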
Singular value decomposition: applies to any matrix, and is used in personalized recommendation, PCA dimensionality reduction, and NLP. Its formula is A = UΣVᵀ, where U is an m × m matrix, Σ is an m × n matrix whose entries are all 0 except those on the main diagonal (each element on the main diagonal is called a singular value), and V is an n × n matrix.
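A short sketch of A = UΣVᵀ using NumPy's built-in SVD; the example matrix is arbitrary:

```python
import numpy as np

# SVD works for any matrix, including non-square ones.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])          # shape (2, 3)

# full_matrices=False returns the compact (thin) decomposition.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Sigma = np.diag(s)                       # singular values on the diagonal, zeros elsewhere

# Reconstruct A = U Σ Vᵀ.
assert np.allclose(A, U @ Sigma @ Vt)
```

Note that `np.linalg.svd` returns Vᵀ (not V) and the singular values `s` as a 1-D array in descending order.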
How do we solve for the three matrices U, Σ, and V?
If we multiply the transpose of A with A, we get an n × n square matrix AᵀA. Since it is square, we can apply eigenvalue decomposition; the resulting eigenvalues and eigenvectors satisfy:

(AᵀA)vᵢ = λᵢvᵢ
This gives us the n eigenvalues of AᵀA and their n corresponding eigenvectors v. Stacking all the eigenvectors as columns of an n × n matrix yields the matrix V in our SVD formula. Each column vector of V is generally called a right singular vector of A.
Similarly, if we multiply A with its transpose, we get an m × m square matrix AAᵀ. Since it is square, we can apply eigenvalue decomposition; the resulting eigenvalues and eigenvectors satisfy:

(AAᵀ)uᵢ = λᵢuᵢ
This gives us the m eigenvalues of AAᵀ and their m corresponding eigenvectors u. Stacking all the eigenvectors as columns of an m × m matrix yields the matrix U in our SVD formula. Each column vector of U is generally called a left singular vector of A.
With U and V found, only the singular value matrix Σ remains to be determined. Since Σ is zero everywhere except for the singular values on its main diagonal, we only need to find each singular value σ.
We notice that:

A = UΣVᵀ ⟹ AV = UΣ ⟹ Avᵢ = σᵢuᵢ ⟹ σᵢ = Avᵢ / uᵢ
So we can find each singular value in this way, and hence the singular value matrix Σ.
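The procedure above, eigendecomposing AᵀA to get V and then using Avᵢ = σᵢuᵢ to recover each σᵢ and uᵢ, can be sketched with NumPy as follows; the example matrix is made up, and since uᵢ is a unit vector, σᵢ is computed here as the norm ‖Avᵢ‖:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])               # shape (3, 2), m > n

# Step 1: eigendecompose AᵀA (n × n) to get the right singular vectors V.
lams, V = np.linalg.eigh(A.T @ A)        # eigh: for symmetric matrices, real results
order = np.argsort(lams)[::-1]           # sort eigenvalues in descending order
lams, V = lams[order], V[:, order]

# Step 2: A vᵢ = σᵢ uᵢ, so σᵢ = ‖A vᵢ‖ and uᵢ = A vᵢ / σᵢ.
sigmas = np.linalg.norm(A @ V, axis=0)
U = (A @ V) / sigmas                     # left singular vectors (thin U, m × n)
Sigma = np.diag(sigmas)

# The decomposition reproduces A.
assert np.allclose(A, U @ Sigma @ V.T)
```

Deriving uᵢ from Avᵢ (rather than eigendecomposing AAᵀ separately) keeps the signs of U and V consistent with each other.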
One point above was left unjustified: we claimed that the eigenvectors of AᵀA make up the matrix V in our SVD, and that the eigenvectors of AAᵀ make up the matrix U. On what grounds? This is actually easy to prove; we take the V matrix as an example.
The proof uses A = UΣVᵀ:

AᵀA = (UΣVᵀ)ᵀ(UΣVᵀ) = VΣᵀUᵀUΣVᵀ = VΣ²Vᵀ

Since UᵀU = I, this shows that the eigenvectors of AᵀA are indeed the columns of the V matrix in our SVD. A similar argument shows that the eigenvectors of AAᵀ make up the U matrix.
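The identity AᵀA = VΣ²Vᵀ can also be verified numerically; this is only a spot check on one arbitrary matrix, not a substitute for the proof:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
Sigma2 = np.diag(s**2)                   # Σ² has the squared singular values

# AᵀA = V Σᵀ Uᵀ U Σ Vᵀ = V Σ² Vᵀ  (UᵀU = I because U has orthonormal columns)
assert np.allclose(A.T @ A, Vt.T @ Sigma2 @ Vt)
```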
Furthermore, we can see that the eigenvalues of AᵀA are the squares of the singular values of A; that is, the eigenvalues and singular values satisfy:

σᵢ = √λᵢ
This means we can also obtain the singular values by taking the square roots of the eigenvalues, instead of computing them through σᵢ = Avᵢ / uᵢ.
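A quick numerical check of the relation σᵢ = √λᵢ, again on an arbitrary example matrix:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# Singular values from NumPy's SVD (descending order) ...
sigmas = np.linalg.svd(A, compute_uv=False)

# ... equal the square roots of the eigenvalues of AᵀA.
lams = np.linalg.eigvalsh(A.T @ A)[::-1]   # eigvalsh returns ascending; reverse it
assert np.allclose(sigmas, np.sqrt(lams))
```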
SVD algorithms are commonly used in recommendation, dimensionality reduction, and denoising, and they can be parallelized.