This is the final battle

Invariant factors, elementary divisors, determinant divisors, Smith normal form

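The original figure here is lost. As a reminder (standard definitions, not recovered from the figure), for a $\lambda$-matrix $A(\lambda)$ of rank $r$:

$$
D_k(\lambda) = \gcd\{\text{all } k\times k \text{ minors of } A(\lambda)\},\qquad
d_k(\lambda) = \frac{D_k(\lambda)}{D_{k-1}(\lambda)}\ (D_0 = 1),
$$

the invariant factors satisfy $d_1 \mid d_2 \mid \dots \mid d_r$; their prime-power factors $(\lambda-\lambda_i)^{e_{ij}}$ are the elementary divisors; and the Smith normal form is $\mathrm{diag}(d_1(\lambda),\dots,d_r(\lambda),0,\dots,0)$.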

Unitary matrices, H-matrices, etc.

A matrix with $A^HA = AA^H = I$ is a unitary matrix; a matrix with $A^H = A$ is an H-matrix (Hermitian matrix).

Properties of positive definite H matrix

Suppose $A$ is a positive definite H-matrix. Then:

  1. There exists an invertible matrix $Q$ such that $A = Q^HQ$.
  2. There exists an invertible matrix $P$ such that $P^HAP = I$.
  3. All eigenvalues of $A$ are greater than 0.
  4. If $Q$ is unitary, then $Q^{-1}AQ\ (= Q^HAQ)$ is also a positive definite H-matrix. (A numerical check of properties 1–3 follows this list.)
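A minimal numpy sketch (not from the original notes) checking properties 1–3 on a randomly built positive definite H-matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))  # invertible w.p. 1
A = Q.conj().T @ Q                          # property 1: A = Q^H Q

print(np.allclose(A, A.conj().T))           # A is Hermitian: A^H = A
print(np.all(np.linalg.eigvalsh(A) > 0))    # property 3: all eigenvalues > 0

P = np.linalg.inv(Q)                        # property 2: P^H A P = I with P = Q^{-1}
print(np.allclose(P.conj().T @ A @ P, np.eye(3)))
```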

Schmidt orthogonalization

Given a set of vectors $[\alpha_1,\alpha_2,\dots,\alpha_n]$:

$$
\begin{aligned}
\beta_1 &= \alpha_1\\
\beta_2 &= \alpha_2 - \frac{(\alpha_2,\beta_1)}{(\beta_1,\beta_1)}\beta_1\\
&\;\;\vdots\\
\beta_i &= \alpha_i - \frac{(\alpha_i,\beta_1)}{(\beta_1,\beta_1)}\beta_1 - \frac{(\alpha_i,\beta_2)}{(\beta_2,\beta_2)}\beta_2 - \dots - \frac{(\alpha_i,\beta_{i-1})}{(\beta_{i-1},\beta_{i-1})}\beta_{i-1}
\end{aligned}
$$
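A short Python sketch of this procedure (a hypothetical `gram_schmidt` helper; `np.vdot(b, a)` computes $b^Ha$):

```python
import numpy as np

def gram_schmidt(vectors):
    """Schmidt orthogonalization: subtract from each alpha_i its projections
    onto the previously computed beta_1 .. beta_{i-1}."""
    betas = []
    for alpha in vectors:
        beta = alpha.astype(complex)
        for b in betas:
            beta = beta - (np.vdot(b, alpha) / np.vdot(b, b)) * b
        betas.append(beta)
    return betas

alphas = [np.array([1.0, 1.0, 0.0]),
          np.array([1.0, 0.0, 1.0]),
          np.array([0.0, 1.0, 1.0])]
betas = gram_schmidt(alphas)
print(abs(np.vdot(betas[0], betas[1])) < 1e-12)  # True: orthogonal
```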

Projection transformation

Example: In $R^3$, given vectors $\alpha=(1,0,0)$ and $\beta=(2,0,3)$, what is the orthogonal projection of the vector $x=(x_1,x_2,x_3)\in R^3$ onto the subspace $span\{\alpha,\beta\}$?

First, orthonormalize $\alpha$ and $\beta$ into $\eta_1=[1,0,0]^T$ and $\eta_2=[0,0,1]^T$, and let $U=[\eta_1,\eta_2]$. The projection operator is $P=UU^H$, so the orthogonal projection of $x$ onto $span\{\alpha,\beta\}$ is:

$$
\begin{aligned}
Px &= UU^Hx\\
&= \left[\begin{matrix} 1 & 0\\ 0 & 0\\ 0 & 1 \end{matrix}\right]
   \left[\begin{matrix} 1 & 0 & 0\\ 0 & 0 & 1 \end{matrix}\right]
   [x_1,x_2,x_3]^T\\
&= \left[\begin{matrix} 1 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 1 \end{matrix}\right][x_1,x_2,x_3]^T\\
&= (x_1, 0, x_3)
\end{aligned}
$$
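The same computation in numpy ($x$ here is an arbitrary test vector):

```python
import numpy as np

alpha = np.array([1.0, 0.0, 0.0])
beta = np.array([2.0, 0.0, 3.0])

eta1 = alpha / np.linalg.norm(alpha)               # orthonormalize alpha, beta
eta2_raw = beta - np.dot(eta1, beta) * eta1
eta2 = eta2_raw / np.linalg.norm(eta2_raw)

U = np.column_stack([eta1, eta2])
P = U @ U.T                                        # projector U U^H (real case)
x = np.array([5.0, 7.0, 9.0])
print(P @ x)                                       # [5. 0. 9.] = (x1, 0, x3)
```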

Matrix factorization

singular value decomposition

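The reference figure here is lost; as a stand-in, recall the statement of the decomposition: any $A \in C^{m\times n}$ factors as $A = U\Sigma V^H$ with $U, V$ unitary and $\Sigma$ diagonal with nonnegative singular values. A quick numpy check:

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])   # arbitrary 3x2 example
U, s, Vh = np.linalg.svd(A, full_matrices=False)     # thin SVD: A = U diag(s) V^H
print(s)                                             # singular values, descending
print(np.allclose(A, U @ np.diag(s) @ Vh))           # True
```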

spectral decomposition

Let $A$ be a normal matrix, that is, $A^HA = AA^H$. With an orthonormal basis of eigenvectors $[\alpha_1,\dots,\alpha_n]$:
$$
\begin{aligned}
A &= [\alpha_1,\alpha_2,\dots,\alpha_n]
\left[\begin{matrix} \lambda_1 & & &\\ & \lambda_2 & &\\ & & \ddots &\\ & & & \lambda_n \end{matrix}\right]
\left[\begin{matrix} \alpha_1^H\\ \alpha_2^H\\ \vdots\\ \alpha_n^H \end{matrix}\right]\\
&= \lambda_1\alpha_1\alpha_1^H + \lambda_2\alpha_2\alpha_2^H + \dots + \lambda_n\alpha_n\alpha_n^H\\
&= \sum_{i=1}^r \lambda_i \sum_{j=1}^{n_i} \alpha_{ij}\alpha_{ij}^H\\
&= \sum_i \lambda_i G_i
\end{aligned}
$$

where the last two lines group repeated eigenvalues: $\lambda_1,\dots,\lambda_r$ are the distinct eigenvalues, $n_i$ their multiplicities, and $G_i = \sum_{j=1}^{n_i}\alpha_{ij}\alpha_{ij}^H$ is the orthogonal projector onto the eigenspace of $\lambda_i$.
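A numerical illustration for a Hermitian (hence normal) matrix, building the rank-one projectors explicitly:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, V = np.linalg.eigh(A)          # columns of V: orthonormal eigenvectors

recon = np.zeros_like(A)
for lam, v in zip(eigvals, V.T):
    G = np.outer(v, v.conj())           # rank-one projector alpha alpha^H
    recon += lam * G                    # A = sum_i lambda_i G_i
print(np.allclose(A, recon))            # True
```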

Orthogonal triangular decomposition (UR decomposition)

Stopping here, I’m sleepy.
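Before moving on, a minimal numpy sketch of what this heading refers to: a full-column-rank $A$ factors as $A = UR$ with $U$ having orthonormal columns and $R$ upper triangular (numpy calls this `qr`):

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])
U, R = np.linalg.qr(A)                     # U: orthonormal columns, R: upper triangular
print(np.allclose(A, U @ R))               # True
print(np.allclose(U.T @ U, np.eye(2)))     # U^H U = I
```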

norm

vector norm

  1. Non-negativity: when $\alpha \neq 0$, $||\alpha|| > 0$; when $\alpha = 0$, $||\alpha|| = 0$.
  2. Homogeneity: $||k\alpha|| = |k|\,||\alpha||$, where $k$ is any complex number.
  3. Triangle inequality: for any $\alpha, \beta$, $||\alpha+\beta|| \leq ||\alpha|| + ||\beta||$.

2-norm: $||\alpha||_2 = \left(\sum_{i=1}^n |a_i|^2\right)^{\frac{1}{2}} = (\alpha^H\alpha)^{\frac{1}{2}}$
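A quick numerical check of the formula:

```python
import numpy as np

alpha = np.array([3.0, 4.0])
print(np.linalg.norm(alpha, 2))          # 5.0
print(np.sqrt(alpha.conj() @ alpha))     # same value: (alpha^H alpha)^(1/2)
```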

matrix norm

  1. Non-negativity: when $A \neq 0$, $||A|| > 0$; when $A = 0$, $||A|| = 0$.
  2. Homogeneity: $||kA|| = |k|\,||A||$, where $k$ is any complex number.
  3. Triangle inequality: for any $A, B \in C^{m\times n}$, $||A+B|| \leq ||A|| + ||B||$.
  4. Consistency of matrix multiplication: for any $A, B$, $||AB|| \leq ||A||\,||B||$.

When doing the questions: points 1, 2, and 3 are all easy to prove. The multiplicative consistency in point 4 requires a little transformation, as in the sketch below.
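As an illustration of the kind of transformation needed (this replaces a lost figure, so the original worked example may have differed), here is the Cauchy–Schwarz argument for the Frobenius norm $||A||_F = (\sum_{i,j}|a_{ij}|^2)^{1/2}$:

$$
||AB||_F^2 = \sum_{i,j}\Big|\sum_k a_{ik}b_{kj}\Big|^2
\leq \sum_{i,j}\Big(\sum_k |a_{ik}|^2\Big)\Big(\sum_k |b_{kj}|^2\Big)
= ||A||_F^2\,||B||_F^2.
$$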

matrix function

Method 1 for finding matrix functions: find the Jordan form $J$ and the similarity transformation matrix $P$, so that $A = PJP^{-1}$ and $f(A) = Pf(J)P^{-1}$.

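A minimal sketch of method 1 in the diagonalizable case, where $J$ is diagonal (the example matrix and the choice $f = \exp$ are arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])    # diagonalizable, eigenvalues 1 and 3
eigvals, P = np.linalg.eig(A)             # A = P J P^{-1}, J = diag(eigvals)
fJ = np.diag(np.exp(eigvals))             # apply f to the diagonal of J
fA = P @ fJ @ np.linalg.inv(P)            # f(A) = P f(J) P^{-1}
print(fA)
```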

Method 2 for finding matrix functions: use the minimal polynomial $m(\lambda)$, replacing $f$ by a polynomial of degree less than $\deg m(\lambda)$ that agrees with $f$ (and its required derivatives) on the spectrum of $A$.

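A sketch of method 2 for the same matrix: its minimal polynomial is $m(\lambda) = (\lambda-1)(\lambda-3)$, so $e^A = aA + bI$, where $a\lambda + b$ interpolates $e^\lambda$ at $\lambda = 1, 3$:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # minimal polynomial (λ-1)(λ-3)
# Solve a·λ + b = exp(λ) at λ = 1 and λ = 3.
a, b = np.linalg.solve(np.array([[1.0, 1.0],
                                 [3.0, 1.0]]), np.exp([1.0, 3.0]))
fA = a * A + b * np.eye(2)
print(fA)                                # matches the method-1 result
```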

Trigonometric and exponential matrix functions

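The lost figure presumably listed the standard series definitions; for reference:

$$
e^A = \sum_{k=0}^{\infty}\frac{A^k}{k!},\qquad
\sin A = \sum_{k=0}^{\infty}\frac{(-1)^k A^{2k+1}}{(2k+1)!},\qquad
\cos A = \sum_{k=0}^{\infty}\frac{(-1)^k A^{2k}}{(2k)!},
$$

with the Euler-style identity $e^{iA} = \cos A + i\sin A$.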

function matrix

Just know how to differentiate and integrate them, element by element.
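For instance (an illustrative example, not from the notes), both operations act elementwise:

$$
A(t) = \left[\begin{matrix} t & t^2\\ \sin t & e^t \end{matrix}\right]
\;\Longrightarrow\;
\frac{d}{dt}A(t) = \left[\begin{matrix} 1 & 2t\\ \cos t & e^t \end{matrix}\right],\qquad
\int_0^1 A(t)\,dt = \left[\begin{matrix} \frac{1}{2} & \frac{1}{3}\\ 1-\cos 1 & e-1 \end{matrix}\right].
$$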


Origin blog.csdn.net/onlyyoujojo/article/details/134887631