Linear Algebra (4): Eigenvalues & Similar Matrices

Foreword

The previous article mainly discussed the relationship between systems of equations and matrices; this one turns to the relationship between matrices themselves.

Eigenvalues and Eigenvectors of a Square Matrix

Suppose A is a square matrix of order n. If there exist a number $\lambda$ and a non-zero column vector $\vec{\alpha}$ such that $A\vec{\alpha} = \lambda\vec{\alpha}$, then:

  • $\lambda$ is called an eigenvalue of the matrix A

  • $\vec{\alpha}$ is called an eigenvector of A corresponding to the eigenvalue $\lambda$


  • Since $\vec{\alpha}$ is a non-zero column vector, $A\vec{\alpha} = \lambda\vec{\alpha}$ can be rewritten as the homogeneous system $(A - \lambda E)\vec{\alpha} = 0$
  • Treating $\lambda$ as an unknown, this system must have a non-zero solution $\vec{\alpha}$
  • A homogeneous system has a non-zero solution only when its coefficient determinant vanishes, so $|A - \lambda E| = 0$ (the characteristic equation)

Solve the characteristic equation

Write the characteristic matrix $A - \lambda E$ of the n-order matrix A. For the example matrix $A = \begin{pmatrix} 4 & -2\\ 1 & 1 \end{pmatrix}$:

$$\begin{pmatrix} 4 & -2\\ 1 & 1 \end{pmatrix} - \begin{pmatrix} \lambda & 0\\ 0 & \lambda \end{pmatrix} = \begin{pmatrix} 4-\lambda & -2\\ 1 & 1-\lambda \end{pmatrix}$$

$$\begin{vmatrix} 4-\lambda & -2\\ 1 & 1-\lambda \end{vmatrix} = -\begin{vmatrix} 1 & 1-\lambda\\ 4-\lambda & -2 \end{vmatrix} = -\begin{vmatrix} 1 & 1-\lambda\\ 0 & -2-(1-\lambda)(4-\lambda) \end{vmatrix} = 0$$

Expanding gives the characteristic equation

$$\lambda^2 - 5\lambda + 6 = 0 \Longrightarrow \lambda_1 = 2,\ \lambda_2 = 3$$
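As a sanity check on the hand computation above, here is a small sketch (an addition, assuming SymPy is available; it is not part of the original article) that builds the characteristic matrix symbolically and solves the characteristic equation:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[4, -2],
               [1,  1]])

# Characteristic matrix A - lambda*E and its determinant
char_matrix = A - lam * sp.eye(2)
char_poly = sp.expand(char_matrix.det())

# Roots of the characteristic equation |A - lambda*E| = 0
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
print(char_poly)     # lambda**2 - 5*lambda + 6
print(eigenvalues)   # [2, 3]
```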
Solve the eigenvectors corresponding to the eigenvalues

  • Substitute $\lambda_1 = 2$ and $\lambda_2 = 3$ into $(A - \lambda E)\vec{\alpha} = 0$ and solve for a non-zero solution
  • For $\lambda_1 = 2$: $(A - 2E)\vec{\alpha} = \begin{pmatrix} 2 & -2\\ 1 & -1 \end{pmatrix}\vec{\alpha} = 0$, giving $\vec{\alpha}_1 = \begin{pmatrix} 1\\ 1 \end{pmatrix}$ (up to a non-zero scalar)
  • For $\lambda_2 = 3$: $(A - 3E)\vec{\alpha} = \begin{pmatrix} 1 & -2\\ 1 & -2 \end{pmatrix}\vec{\alpha} = 0$, giving $\vec{\alpha}_2 = \begin{pmatrix} 2\\ 1 \end{pmatrix}$ (up to a non-zero scalar)
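A minimal NumPy sketch (an addition, assuming NumPy is available; not from the original article) that confirms the hand-computed eigenvalues and eigenvectors. Note that `np.linalg.eig` normalizes its eigenvectors to unit length, so its columns are scalar multiples of $(1, 1)^T$ and $(2, 1)^T$:

```python
import numpy as np

A = np.array([[4, -2],
              [1,  1]], dtype=float)

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [3. 2.]  (order may differ)

# Verify the definition A @ alpha = lambda * alpha for the hand-computed vectors
for lam, alpha in [(2, np.array([1, 1])), (3, np.array([2, 1]))]:
    assert np.allclose(A @ alpha, lam * alpha)
```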
Basic properties
  • Finding eigenvalues and eigenvectors resembles the problem of finding, for a given "coordinate", the coordinate system it belongs to
  • The eigenvalue $\lambda$ makes $A - \lambda E$ singular, collapsing one dimension of the "coordinates"; the corresponding eigenvector spans the resulting "coordinate system" (the eigenspace) for that dimension
  • If $\lambda$ is a root of multiplicity N, the eigenvector "coordinate system" it yields has at most N dimensions (the geometric multiplicity never exceeds the algebraic multiplicity; see the sketch below)
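To illustrate the caveat about repeated roots, here is a small NumPy sketch (an illustrative addition, not from the original article) contrasting two matrices that both have $\lambda = 2$ as a double root: a diagonal one whose eigenspace is two-dimensional, and an upper-triangular one whose eigenspace is only one-dimensional.

```python
import numpy as np

# lambda = 2 is a double root of the characteristic equation for both matrices
D = np.array([[2, 0],
              [0, 2]], dtype=float)   # two independent eigenvectors
J = np.array([[2, 1],
              [0, 2]], dtype=float)   # only one independent eigenvector

for M in (D, J):
    # Dimension of the eigenspace for lambda = 2 is n - rank(M - 2E)
    eigenspace_dim = 2 - np.linalg.matrix_rank(M - 2 * np.eye(2))
    print(np.linalg.eigvals(M), "eigenspace dimension:", eigenspace_dim)
# D: [2. 2.] eigenspace dimension: 2
# J: [2. 2.] eigenspace dimension: 1
```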


  • The determinant of a square matrix = the product of all its eigenvalues
  • The sum of the main diagonal elements (the trace) of a square matrix = the sum of all its eigenvalues
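A quick numerical check of both properties on the example matrix A used above, as a NumPy sketch (an addition, not from the original article):

```python
import numpy as np

A = np.array([[4, -2],
              [1,  1]], dtype=float)
eigenvalues = np.linalg.eigvals(A)   # [3. 2.]  (order may differ)

# Determinant = product of eigenvalues; trace = sum of eigenvalues
assert np.isclose(np.linalg.det(A), np.prod(eigenvalues))   # 6 == 2 * 3
assert np.isclose(np.trace(A), np.sum(eigenvalues))         # 5 == 2 + 3
```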

A proof that the sum of the eigenvalues equals the trace and that the product of the eigenvalues equals the determinant is linked in the references below.

Similar Matrices

Definition: square matrices A and B of the same order are called similar, written $A \backsim B$, if there exists an invertible matrix P such that $A = P^{-1}BP$.
The definition of similar matrices can be understood from the perspective of a change of coordinate system:

  1. Think of A and B as two transformations
  2. Then $A = P^{-1}BP$ specifically means:
    • A is the transformation as observed in the P coordinate system
    • observed in the standard coordinate system, the same transformation is B

For example: in the standard coordinate system there is a scaling transformation B; observed in the P coordinate system, the same scaling transformation is A.

If A and B are similar, they are in essence the same transformation, merely observed from different coordinate systems.
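To make the coordinate-system reading concrete, here is a NumPy sketch (the specific B and P are made-up examples, not from the original article): B is a scaling in the standard basis, the columns of P are the basis vectors of the P coordinate system, and $A = P^{-1}BP$ is the same transformation expressed in P coordinates.

```python
import numpy as np

# B and P are made-up example matrices (not from the article)
# B: scaling transformation as seen in the standard coordinate system
B = np.array([[2, 0],
              [0, 3]], dtype=float)

# Columns of P are the basis vectors of the "P coordinate system" (invertible)
P = np.array([[1, 1],
              [0, 1]], dtype=float)

# A: the same transformation, expressed in P coordinates
A = np.linalg.inv(P) @ B @ P

# Transforming a vector v given in P coordinates with A is the same as
# converting it to standard coordinates, applying B, and converting back.
v = np.array([1.0, 2.0])
assert np.allclose(A @ v, np.linalg.inv(P) @ (B @ (P @ v)))
```

The assertion holds by construction; the point is only that the two readings of $A = P^{-1}BP$ (a formula, and a change of viewpoint) coincide.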

Properties of Similar Matrices

If A and B are similar, i.e. $A \backsim B$ (and equivalently $B \backsim A$), then:

  1. Similar matrices have the same determinant
  2. Similar matrices have the same eigenvalues
  3. Similar matrices have the same rank
  4. Similar matrices have the same trace
  5. Similar matrices are either both invertible or both non-invertible
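A short NumPy sketch (again an addition with made-up B and P, not from the original article) that checks these properties numerically for $A = P^{-1}BP$:

```python
import numpy as np

# Made-up example matrices: B is arbitrary, P is any invertible matrix
B = np.array([[4, -2],
              [1,  1]], dtype=float)
P = np.array([[2, 1],
              [1, 1]], dtype=float)
A = np.linalg.inv(P) @ B @ P          # A is similar to B

assert np.isclose(np.linalg.det(A), np.linalg.det(B))         # same determinant
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))              # same eigenvalues
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)    # same rank
assert np.isclose(np.trace(A), np.trace(B))                    # same trace
```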

Main references

" 11.3 Solve eigenvalues ​​and eigenvectors (basic solution method) "
" 11.4 Properties of eigenvalues ​​and eigenvectors "
" 11.5 Eigenvalues ​​and traces of matrices "
" 1.6 Algebraic multiplicity and geometric multiplicity of characteristic roots "
" 11.7 Similarity What is the matrix saying ?"
" Proof: The sum of the eigenvalues ​​​​is equal to the trace, and the product of the eigenvalues ​​​​is equal to the determinant "


Origin blog.csdn.net/y3over/article/details/132313928