This article continues the introduction to matrices begun in "[Geometry Series] Matrix (1): Matrix Multiplication and the Inverse Matrix".
Transpose
The matrix transpose is relatively simple: it interchanges the rows and columns of a matrix. The superscript $T$ denotes the transpose of a matrix.
$$A^T=(b_{ij})$$
where $b_{ij} = a_{ji}$.
For example, for:
$$A=\begin{bmatrix}
1 & 2 & 3\\
4 & 5 & 6
\end{bmatrix}$$
Its transpose is:
$$A^T=\begin{bmatrix}
1 & 4\\
2 & 5\\
3 & 6
\end{bmatrix}$$
Transposition satisfies the following formulas:
$$(AB)^T=B^TA^T$$
$$(A+B)^T=A^T+B^T$$
$$(cA)^T=cA^T$$
$$(A^T)^T=A$$
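These identities are easy to check numerically. Below is a minimal sketch with NumPy; the matrices $A$, $B$, $C$ and the scalar $c$ are arbitrary examples chosen for illustration:

```python
import numpy as np

# Quick numerical check of the transpose identities above.
A = np.array([[1, 2, 3],
              [4, 5, 6]])          # 2x3
B = np.array([[1, 0],
              [2, 1],
              [0, 3]])             # 3x2, so the product A @ B exists
C = np.array([[0, 1, 0],
              [1, 0, 1]])          # 2x3, same shape as A, so A + C exists
c = 2.0

assert np.array_equal((A @ B).T, B.T @ A.T)   # (AB)^T = B^T A^T
assert np.array_equal((A + C).T, A.T + C.T)   # (A+B)^T = A^T + B^T
assert np.array_equal((c * A).T, c * A.T)     # (cA)^T = c A^T
assert np.array_equal(A.T.T, A)               # (A^T)^T = A
```

Note that in $(AB)^T = B^T A^T$ the order of the factors is reversed; the shapes above make this visible, since $A^T B^T$ would not even be a valid product here.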
Several Special Matrices
Orthogonal matrix
If $AA^T = I$, then $A$ is called an orthogonal matrix.
Obviously, an orthogonal matrix is invertible, and its inverse is $A^{-1} = A^T$.
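A standard example of an orthogonal matrix is a 2D rotation matrix (which also appears later in this article). The sketch below, using an arbitrary angle, checks both the defining property and the fact that the inverse equals the transpose:

```python
import numpy as np

# A 2D rotation matrix: a classic example of an orthogonal matrix.
theta = np.pi / 3                  # arbitrary angle for illustration
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A A^T = I, so the inverse is simply the transpose.
assert np.allclose(A @ A.T, np.eye(2))
assert np.allclose(np.linalg.inv(A), A.T)
```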
Symmetric matrix
If $A = A^T$, then $A$ is called a symmetric matrix.
Skew-symmetric matrix
If $A = -A^T$, then $A$ is called a skew-symmetric matrix.
Determinant
Geometric meaning
The concept of the determinant was first used to solve systems of equations; one of the better-known results is Cramer's rule. But since the concept was proposed, the theory of determinants has developed far beyond merely solving equations.
From the previous article, our view of matrices is no longer confined to their original use in solving systems of equations; we began to see a matrix as a linear transformation, a geometric transformation. This understanding helps us better grasp the geometric meaning of the determinant.
"The Essence of Linear Algebra - 05 - The determinant" is a very good video, and the reader is advised to watch it. It visualizes the geometric meaning of the determinant very vividly, so I will not repeat that here. Below is a brief summary with a few additions.
For a $2 \times 2$ matrix $A$, left-multiplication by $A$ maps real vectors from $R^2$ to $R^2$. We can imagine it acting on a unit square $S$, transforming it into another shape $S'$. The area of the transformed shape $S'$ equals the absolute value of the determinant.
If the determinant is less than 0, the orientation of $S$ is flipped.
If the determinant equals 0, then $S'$ collapses to a line or a point (in this case the transformation is irreversible, because after such a dimensionality reduction no matrix can map the line or point back to the original square).
Extended to $3 \times 3$ real matrices, the absolute value of the determinant equals the volume of the shape $V'$ obtained by transforming a unit cube $V$.
Why?
The determinant actually reflects the "stretch factor" of the geometric transformation that a linear transformation represents. If a transformation $A$ maps a shape $S$ of area 3 to a shape $S'$ of area 12, we can be sure that the determinant of $A$ (a square matrix) is 4. If a transformation $B$ maps a shape $S$ of area 3 to a shape $S'$ of area 0 (a line segment), then the determinant of $B$ is 0.
Correspondingly, the determinant has a very important multiplicative property.
For any $n \times n$ matrices $A$ and $B$:
$$\det(AB)=(\det A)(\det B)$$
where $\det(\cdot)$ denotes the determinant of a matrix.
This multiplicative property is in fact an expression of the stretch-factor interpretation: if transformation $B$ scales area by a factor of 2, and transformation $A$ scales area by a factor of 3, then $AB$ scales area by a factor of 6.
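The stretch-factor reading of $\det(AB)=(\det A)(\det B)$ can be sketched numerically. In this illustrative example, $A$ stretches the $x$-axis by 3 and $B$ stretches the $y$-axis by 2, so the composed map scales area by 6:

```python
import numpy as np

# Two example transformations with easy-to-read stretch factors.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])   # stretches x by 3  -> det(A) = 3
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # stretches y by 2  -> det(B) = 2

# det(AB) = det(A) det(B): the composed map scales area by 3 * 2 = 6.
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(A @ B), 6.0)
```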
Definition
Now that we have an intuitive feel for the determinant, let us look at its definition.
The determinant of an $n \times n$ matrix is a mapping from the space $R^{n \times n}$ to $R$:
$$R^{n \times n} \rightarrow R$$
Before giving the precise definition of the determinant, we first define the submatrix (minor) $A_{ij}$: if $A$ is an $n \times n$ matrix, then $A_{ij}$ is the $(n-1) \times (n-1)$ matrix obtained by deleting row $i$ and column $j$ of $A$.
For example, for
$$A=\begin{bmatrix}
1 & 0 & 3\\
2 & 1 & 2\\
0 & 5 & 1
\end{bmatrix}$$
Its submatrix $A_{21}$ is:
$$A_{21}=\begin{bmatrix}
0 & 3\\
5 & 1
\end{bmatrix}$$
With the submatrix defined, here is the recursive definition of the determinant:
$$\left\{\begin{matrix}
\det\left [ a \right ]=a
\\
\det A=\sum_{\nu =1}^{n}(-1)^{\nu+1}a_{\nu 1}\det A_{\nu 1}
\end{matrix}\right.$$
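The recursive definition (expansion along the first column) translates directly into code. Below is a minimal sketch; the function name `det` and the list-of-lists representation are choices for illustration, and NumPy is used only to cross-check the result:

```python
import numpy as np

def det(A):
    """Recursive determinant via expansion along the first column,
    following the definition above (A_{v1} deletes row v and column 1)."""
    n = len(A)
    if n == 1:                       # base case: det [a] = a
        return A[0][0]
    total = 0
    for v in range(n):               # v runs 0..n-1 here (1..n in the math)
        # Submatrix A_{v1}: drop row v and the first column.
        minor = [row[1:] for i, row in enumerate(A) if i != v]
        # (-1)^(v+1) with 1-based v is (-1)^v with 0-based v.
        total += (-1) ** v * A[v][0] * det(minor)
    return total

A = [[1, 0, 3],
     [2, 1, 2],
     [0, 5, 1]]
print(det(A))                                  # prints 21
assert np.isclose(det(A), np.linalg.det(A))    # agrees with NumPy
```

This direct recursion costs $O(n!)$ operations and is only meant to mirror the definition; practical libraries compute determinants via LU decomposition instead.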
Properties
Here are some important properties of the determinant:
1. $\det(AB) = (\det A)(\det B)$.
2. A square matrix $A$ is invertible $\Leftrightarrow \det A \neq 0$.
3. If $A$ is invertible, then $\det(A^{-1}) = (\det A)^{-1}$.
4. $\det A = \det(A^T)$.
Eigenvectors and eigenvalues
Geometric meaning
I recommend watching the video "The Essence of Linear Algebra - 10 - Eigenvectors and eigenvalues", which provides a good visualization of these two concepts.
Simply put, after the geometric transformation $A$ is applied ($Ax$), most vectors change their original direction. But an eigenvector does not change direction; only its length changes. And the factor by which an eigenvector is stretched is the corresponding eigenvalue.
A matrix may have many eigenvectors, or it may have none at all (over the real numbers).
A special case is:
$$A=\begin{bmatrix}
\cos\theta &-\sin\theta \\
\sin\theta & \cos\theta
\end{bmatrix}$$
It is a rotation matrix, representing a rotation of two-dimensional space. In this case, every vector in the plane is rotated, i.e., changes direction. Therefore, this matrix has no real eigenvectors.
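This can be checked numerically: for a non-trivial rotation angle, the eigenvalues that come out are complex, which is another way of saying no real eigenvector exists. A small sketch (the angle is an arbitrary example):

```python
import numpy as np

theta = np.pi / 4                  # any angle that is not a multiple of pi
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The eigenvalues of a non-trivial 2D rotation are cos(theta) +/- i sin(theta):
# both have a nonzero imaginary part, so no real eigenvector exists.
eigenvalues, _ = np.linalg.eig(A)
assert np.all(np.abs(eigenvalues.imag) > 0)
```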
Definition
If $Ax = \lambda x$, where $x \neq 0$ and $\lambda$ is a scalar, then $x$ is called an eigenvector of $A$ and $\lambda$ is the corresponding eigenvalue.
Computation
Eigenvalues (and from them, eigenvectors) can be computed from the characteristic equation:
$$\det(A-\lambda I)=0$$
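As a worked sketch, take the symmetric example matrix below (chosen for illustration). Solving $\det(A-\lambda I)=(2-\lambda)^2-1=0$ by hand gives $\lambda=1$ and $\lambda=3$, which NumPy's eigensolver confirms:

```python
import numpy as np

# Example matrix: det(A - lambda I) = (2-lambda)^2 - 1 = 0 -> lambda = 1, 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
assert np.allclose(sorted(eigenvalues), [1.0, 3.0])

# Each column of `eigenvectors` satisfies the definition A x = lambda x.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```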
References
- *Algebra*, Michael Artin
- *Rotation Transforms for Computer Graphics*, John Vince
- Wikipedia: Determinant
- The Essence of Linear Algebra - 05 - The determinant
- The Essence of Linear Algebra - 10 - Eigenvectors and eigenvalues