Linear Algebra - Matrix Multiplication (Continued)

As mentioned before, matrix multiplication can be regarded as a change of basis for a vector, and the chosen basis can directly change the dimension of the vector. For example, a 3 * 2 matrix maps a two-dimensional vector into three-dimensional space, so a matrix can be regarded as a means of manipulating space.
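A minimal sketch of this dimension change, with made-up numbers and NumPy (not something the original post uses), looks like this:

```python
import numpy as np

# A 3 * 2 matrix: it takes a 2-dimensional input and returns a 3-dimensional output.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])

x = np.array([4.0, 5.0])   # a vector in R^2

y = A @ x                  # y lives in R^3
print(y.shape)             # (3,)
print(y)                   # [ 4.  5. 23.]
```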

[Figure: a matrix changing the dimension of a vector]

To understand the specific nature of this transformation, we have to analyze the matrix itself. Following the earlier point of view, each column of the matrix is a basis vector, so multiplying the matrix by a vector on the right produces a linear combination of its column vectors. Likewise, multiplying by a vector on the left can be read as a linear combination of the matrix's row vectors. The columns of the matrix span its column space and the rows span its row space. Under the linear transformation, every x with T(x) = 0, i.e. Ax = 0, belongs to the null space of A, written N(A).
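To make the column-combination view and the null space concrete, here is a small sketch (the matrix and vector are arbitrary examples, and scipy.linalg.null_space is one convenient way to get a null-space basis):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, 0.5, -2.0])

# A @ x is exactly the linear combination x[0]*col0 + x[1]*col1 + x[2]*col2.
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
print(np.allclose(A @ x, by_columns))   # True

# The null space N(A): all vectors v with A @ v = 0.
N = null_space(A)                       # columns form an orthonormal basis of N(A)
print(np.allclose(A @ N, 0))            # True
```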

 

We use the rank to describe the dimension of the spaces associated with a matrix. Although an m * n matrix takes n-dimensional vectors as input and produces m-dimensional vectors as output, the dimensions of its column space and row space are not necessarily m and n. Take the column space as an example: when the column vectors are linearly dependent, the dimension of the space they span drops. The column rank is the maximum number of linearly independent columns, and the row rank is the maximum number of linearly independent rows.
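A quick numerical illustration of the rank (the matrix below is a made-up example with a dependent column):

```python
import numpy as np

# The third column is the sum of the first two, so the columns are linearly dependent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

print(np.linalg.matrix_rank(A))     # 2: only two linearly independent columns
print(np.linalg.matrix_rank(A.T))   # 2: the row rank comes out the same
```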

 

For an m * n matrix A (linear transformation T), the inputs range over the whole n-dimensional space, while T(x) is zero exactly on the null space N(A) and otherwise lands in the column space. Going further, we can split the n-dimensional input space into two subspaces: the row space of the matrix and the subspace perpendicular to the row space. Everything perpendicular to the row space is mapped to 0, so that orthogonal complement is exactly the null space. Every remaining input vector is a linear combination of a row-space part and a null-space part, and the row-space part is what gets mapped into the column space.
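As a sketch of that orthogonality claim (the matrix is arbitrary, chosen with a dependent row so the null space is non-trivial):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # a dependent row, so N(A) is non-trivial
              [1.0, 0.0, 1.0]])

N = null_space(A)              # basis vectors v of the null space, A @ v = 0

# A @ N being zero says every row of A is perpendicular to every null-space vector,
# i.e. the row space is orthogonal to N(A).
print(np.allclose(A @ N, 0))   # True
```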

[Figure: the four fundamental subspaces of a matrix]

Rank-Nullity Theorem

In linear algebra, the rank-nullity theorem gives the relationship between the rank (Rank) and the nullity (the dimension of the null space) of a linear transformation or a matrix.

dim N(A) + rank(A) = n

Here rank(A) is the dimension of the column space, dim N(A) is the dimension of the null space, and n is the number of columns, i.e. the dimension of the input space. Assume the n-dimensional input space has a basis of n linearly independent vectors. Let {a1, a2, ...} be the basis vectors with T(c1a1 + c2a2 + ...) = 0 for any coefficients {c1, c2, ...} (they span the null space), and let {b1, b2, ...} be the remaining basis vectors, so that T(C1b1) + T(C2b2) + ... = T(C1b1 + C2b2 + ...). To show that the dimension spanned by the b's is preserved by the transformation, we need to show that T(C1b1 + C2b2 + ...) != 0 unless all the Ci are 0.

Suppose T(C1b1 + C2b2 + ...) = 0. Then C1b1 + C2b2 + ... lies in the null space, so it equals some c1a1 + c2a2 + .... But because all the basis vectors are linearly independent, the only way C1b1 + C2b2 + ... - c1a1 - c2a2 - ... can be 0 is for every coefficient to be 0. So unless C1 = C2 = ... = 0, we have T(C1b1) + T(C2b2) + ... != 0, which means {T(b1), T(b2), ...} are linearly independent and span a space whose dimension equals the number of b's. The number of a's (the nullity) plus the number of b's (the rank) is n, and the theorem is proved. A closely related fact, discussed next, is that the row rank of a matrix equals its column rank.
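A quick numerical check of the theorem (the matrix is a made-up example; null_space from SciPy supplies a basis of N(A)):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 2.0]])   # 3 * 4 matrix, so n = 4

rank = np.linalg.matrix_rank(A)
nullity = null_space(A).shape[1]       # number of basis vectors of N(A)

print(rank, nullity, rank + nullity)   # 2 2 4  ->  dim N(A) + rank(A) = n
```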

 

Row rank and column rank

The fact that the row rank equals the column rank is a property of every matrix. Suppose the column rank of an m * n matrix is r and its row rank is c. Collect r linearly independent columns into an m * r matrix; then the m * n matrix can be written as the product of an m * r matrix and an r * n matrix (this is possible because every column of the original matrix is a linear combination of those r columns). Viewed from the row side, every row of the product is a linear combination of the rows of the r * n factor, so c <= r. Applying the same argument to the transpose gives r <= c, hence r = c: the row rank equals the column rank. This also shows that every rank-r matrix can be decomposed into the product of r column vectors and r row vectors, i.e. a sum of r rank-one matrices.
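A small sketch of such a rank factorization (the matrix and the choice of which columns are independent are assumptions for this example; least squares recovers the r * n factor):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])            # rank 2: the third column is col0 + col1

r = np.linalg.matrix_rank(A)               # 2
C = A[:, :r]                               # m * r: two linearly independent columns
R, *_ = np.linalg.lstsq(C, A, rcond=None)  # r * n: writes every column of A in terms of C

print(np.allclose(A, C @ R))               # True: A factors as (m * r) @ (r * n)
print(np.linalg.matrix_rank(C), np.linalg.matrix_rank(R))   # 2 2
```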


Source: www.cnblogs.com/matrixmlpforever/p/10960602.html