Foreword
The previous section mainly discussed the relationship between a system of equations and a matrix; this one looks at the relationship between one matrix and another.
Eigenvalues and Eigenvectors of a Square Matrix
Suppose $A$ is a square matrix of order $n$. If, for a number $\lambda$, there exists a non-zero column vector $\vec{\alpha}$ such that $A\vec{\alpha} = \lambda\vec{\alpha}$, then:
- $\lambda$ is called an eigenvalue of the matrix $A$
- $\vec{\alpha}$ is called an eigenvector corresponding to the eigenvalue $\lambda$
- Since $\vec{\alpha}$ is a non-zero column vector, treating $\lambda$ as an unknown, the system $(A - \lambda E)\vec{\alpha} = 0$ must have a non-zero solution
- Because a non-zero solution exists $\Longrightarrow |A - \lambda E| = 0$
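This condition is easy to verify numerically. A minimal NumPy sketch (the 2×2 matrix below is just an illustrative choice): for each eigenpair, $A\vec{\alpha} = \lambda\vec{\alpha}$ holds and $|A - \lambda E|$ vanishes.

```python
import numpy as np

# A small example matrix (chosen here only for illustration)
A = np.array([[4.0, -2.0],
              [1.0, 1.0]])

# NumPy computes eigenvalues and eigenvectors directly;
# the eigenvectors are the COLUMNS of the returned matrix.
eigvals, eigvecs = np.linalg.eig(A)

E = np.eye(2)
for lam, alpha in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ alpha, lam * alpha)            # A·α = λ·α
    assert abs(np.linalg.det(A - lam * E)) < 1e-9         # |A − λE| = 0
```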
Solve the characteristic equation
Write the characteristic matrix $A - \lambda E$; for the example matrix $A = \begin{pmatrix} 4 & -2\\ 1 & 1 \end{pmatrix}$:

$$\begin{pmatrix} 4 & -2\\ 1 & 1 \end{pmatrix} - \begin{pmatrix} \lambda & 0\\ 0 & \lambda \end{pmatrix} = \begin{pmatrix} 4-\lambda & -2\\ 1 & 1-\lambda \end{pmatrix}$$
$$\begin{vmatrix} 4-\lambda & -2\\ 1 & 1-\lambda \end{vmatrix} = -\begin{vmatrix} 1 & 1-\lambda\\ 4-\lambda & -2 \end{vmatrix} = -\begin{vmatrix} 1 & 1-\lambda\\ 0 & -2-(1-\lambda)(4-\lambda) \end{vmatrix} = 0$$

(Swapping the two rows flips the sign; subtracting $(4-\lambda)$ times the first row from the second clears the lower-left entry.)
Expanding the determinant gives the characteristic equation:

$$\lambda^2 - 5\lambda + 6 = 0 \Longrightarrow \lambda_1 = 2,\ \lambda_2 = 3$$
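The roots can be cross-checked numerically; a small NumPy sketch (`np.roots` solves the polynomial, `np.linalg.eigvals` works on $A$ directly):

```python
import numpy as np

# Characteristic polynomial λ² − 5λ + 6 as the coefficient list [1, -5, 6]
roots = np.roots([1, -5, 6])
print(sorted(roots.real))  # the two eigenvalues, up to floating-point rounding

# Cross-check against NumPy's eigenvalue routine on A itself
A = np.array([[4.0, -2.0],
              [1.0, 1.0]])
print(sorted(np.linalg.eigvals(A).real))
```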
Solve the eigenvectors corresponding to the eigenvalues
- Substitute $\lambda_1 = 2$ and $\lambda_2 = 3$ into $(A - \lambda E)\vec{\alpha} = 0$ and solve for the non-zero solutions $\vec{\alpha}$
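One way to carry out this substitution numerically is to compute the null space of $A - \lambda E$; a sketch using NumPy's SVD (the normalization of the eigenvectors is arbitrary):

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0, 1.0]])
E = np.eye(2)

# For each eigenvalue, the eigenvectors are the non-zero solutions of
# (A − λE)·α = 0, i.e. the null space of (A − λE); the right-singular
# vector for the (near-)zero singular value spans it.
for lam in (2.0, 3.0):
    M = A - lam * E
    _, s, Vt = np.linalg.svd(M)
    alpha = Vt[-1]
    assert np.allclose(M @ alpha, 0.0)        # α solves (A − λE)·α = 0
    assert np.allclose(A @ alpha, lam * alpha)
    print(lam, alpha)
```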
Basic properties
- Finding eigenvalues and eigenvectors is analogous to finding a suitable "coordinate system" for a set of "coordinates"
- An eigenvalue $\lambda$ is used to eliminate one dimension of the "coordinates", yielding a "coordinate system" in which the corresponding eigenvector serves as that dimension's axis
- If $\lambda$ is a root of multiplicity $N$, the eigenvector "coordinate system" obtained contains at most $N$ dimensions (the geometric multiplicity never exceeds the algebraic multiplicity)
- The determinant of a square matrix equals the product of all its eigenvalues
- The sum of the main-diagonal elements (the trace) of a square matrix equals the sum of all its eigenvalues
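Both properties can be spot-checked with NumPy on the example matrix used above:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0, 1.0]])
eigvals = np.linalg.eigvals(A)

# Determinant = product of eigenvalues; trace = sum of eigenvalues
assert np.isclose(np.linalg.det(A), np.prod(eigvals))
assert np.isclose(np.trace(A), np.sum(eigvals))
```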
Similar matrices
The definition of similar matrices can be understood from the perspective of coordinate-system transformations
- Think of A and B as two transformations
- Then $A = P^{-1}BP$ specifically means:
  - A is the <transformation> expressed in the P coordinate system
  - Observed in the standard coordinate system, the same <transformation> is B
For example: a stretch transformation observed in the standard coordinate system is B, and the same stretch observed in the P coordinate system is A
If A and B are similar, they are in essence the same transformation, merely observed from different coordinate systems
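This reading of $A = P^{-1}BP$ can be sketched with NumPy; the stretch $B$ and the basis matrix $P$ below are assumed example values. Transforming in $P$ coordinates with $A$ and then converting to standard coordinates agrees with converting first and applying $B$.

```python
import numpy as np

# B: a stretch (×2 along x) as seen in the standard basis (example values)
B = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# P: its columns are the basis vectors of the "P coordinate system" (example values)
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A = P⁻¹ B P is the same transformation expressed in P coordinates
A = np.linalg.inv(P) @ B @ P

# Take a vector given by its P-coordinates c; its standard coordinates are P @ c.
c = np.array([3.0, -1.0])
# Transform in P coordinates with A, then convert to standard coordinates ...
left = P @ (A @ c)
# ... which equals converting first and transforming with B in the standard basis.
right = B @ (P @ c)
assert np.allclose(left, right)
```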
Properties of Similar Matrices
If A and B are similar, written $A \backsim B$ (equivalently $B \backsim A$), then:
- Similar matrices have the same determinant value
- Similar matrices have the same eigenvalues
- Similar matrices have the same rank
- Similar matrices have the same trace
- Similar matrices are either both invertible or both non-invertible
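These shared quantities can be spot-checked by constructing a similar pair explicitly; the matrices $B$ and invertible $P$ below are arbitrary example choices:

```python
import numpy as np

# Pick any invertible P and any B; then A = P⁻¹ B P is similar to B.
B = np.array([[4.0, -2.0],
              [1.0, 1.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])
A = np.linalg.inv(P) @ B @ P

assert np.isclose(np.linalg.det(A), np.linalg.det(B))        # same determinant
assert np.allclose(sorted(np.linalg.eigvals(A).real),
                   sorted(np.linalg.eigvals(B).real))        # same eigenvalues
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)  # same rank
assert np.isclose(np.trace(A), np.trace(B))                  # same trace
```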
Main reference
" 11.3 Solve eigenvalues and eigenvectors (basic solution method) "
" 11.4 Properties of eigenvalues and eigenvectors "
" 11.5 Eigenvalues and traces of matrices "
" 1.6 Algebraic multiplicity and geometric multiplicity of characteristic roots "
" 11.7 Similarity What is the matrix saying ?"
" Proof: The sum of the eigenvalues is equal to the trace, and the product of the eigenvalues is equal to the determinant "