[Postgraduate Entrance Exam Mathematics] Linear Algebra, Chapter 3 - Vectors | (2) Properties of linear dependence and linear representation of vector groups; equivalence of vector groups; maximal linearly independent groups and rank


Introduction

Following the previous article, let's continue with the properties of linear dependence and linear representation of vector groups.


2. Linear dependence and linear representation of vector groups

2.3 Properties of linear dependence and linear representation of vector groups

Property 1 —— The vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly dependent if and only if at least one vector in the group can be linearly represented by the remaining vectors.

Proof. Necessity: Suppose the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly dependent. Then there exist constants $k_1,k_2,\dots,k_n$, not all zero, such that $k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n=0$. Without loss of generality assume $k_1 \ne 0$. Then $\alpha_1=-\frac{k_2}{k_1}\alpha_2-\dots-\frac{k_n}{k_1}\alpha_n$, that is, $\alpha_1$ can be linearly represented by the remaining vectors.

Sufficiency: Suppose some $\alpha_k$ can be linearly represented by the remaining vectors, i.e. there exist constants $l_1,l_2,\dots,l_{k-1},l_{k+1},\dots,l_n$ (with $l_k$ missing) such that $\alpha_k=l_1\alpha_1+\dots+l_{k-1}\alpha_{k-1}+l_{k+1}\alpha_{k+1}+\dots+l_n\alpha_n$. Then $l_1\alpha_1+\dots+l_{k-1}\alpha_{k-1}+(-1)\alpha_k+l_{k+1}\alpha_{k+1}+\dots+l_n\alpha_n=0$. Since the coefficient $-1$ is nonzero, the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly dependent.

1. A single vector is linearly dependent if and only if it is the zero vector.
2. Two vectors are linearly dependent if and only if they are proportional.
3. A vector group that contains the zero vector must be linearly dependent.
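
A small numerical illustration of Property 1 may help. The sketch below (using numpy, with made-up vectors that are not from the article) builds a group in which one vector is a combination of the others, then checks that the group is linearly dependent by comparing the rank of the matrix of column vectors with the number of vectors.

```python
import numpy as np

# Hypothetical vectors: a3 is deliberately built as 2*a1 + a2,
# so by Property 1 the group {a1, a2, a3} is linearly dependent.
a1 = np.array([1.0, 0.0, 1.0])
a2 = np.array([0.0, 1.0, 1.0])
a3 = 2 * a1 + a2                     # a3 is representable by the rest

A = np.column_stack([a1, a2, a3])    # vectors as the columns of a matrix
rank = np.linalg.matrix_rank(A)

# rank < number of vectors  <=>  the group is linearly dependent
print(rank, A.shape[1], rank < A.shape[1])   # 2 3 True
```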

Property 2 —— Suppose the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly independent. Then:
(1) If $\alpha_1,\alpha_2,\dots,\alpha_n,b$ is linearly dependent, then the vector $b$ can be linearly represented by $\alpha_1,\alpha_2,\dots,\alpha_n$, and the representation is unique.
(2) $\alpha_1,\alpha_2,\dots,\alpha_n,b$ is linearly independent if and only if $b$ cannot be linearly represented by $\alpha_1,\alpha_2,\dots,\alpha_n$.
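
Property 2 (1) can also be checked numerically. In the sketch below (numpy, with hypothetical vectors), the columns of $A$ are linearly independent, so the coefficients expressing $b$ are the unique solution of the linear system $Ax = b$.

```python
import numpy as np

# Hypothetical data: the columns a1, a2, a3 of A are linearly independent
# in R^3, so any b in R^3 makes {a1, a2, a3, b} dependent and b has a
# unique representation b = x1*a1 + x2*a2 + x3*a3 (Property 2 (1)).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 3.0]])      # columns: a1, a2, a3
b = np.array([3.0, 2.0, 6.0])

x = np.linalg.solve(A, b)            # unique because det(A) != 0
print(x)                             # [-1.  0.  2.]
print(np.allclose(A @ x, b))         # True: b = -1*a1 + 0*a2 + 2*a3
```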

Property 3 —— If a vector group is linearly independent, then every partial group of it is linearly independent.

Property 4 —— If some partial group of a vector group is linearly dependent, then the whole vector group is linearly dependent.

Property 5 —— Suppose $\alpha_1,\alpha_2,\dots,\alpha_n$ are $n$ vectors of dimension $n$. Then $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly independent if and only if $|\alpha_1,\alpha_2,\dots,\alpha_n| \ne 0$, that is, the determinant of the matrix formed by these vectors is nonzero.
Proof: $\alpha_1,\alpha_2,\dots,\alpha_n$ being linearly independent means that the corresponding homogeneous linear system has only the zero solution, which holds exactly when the coefficient determinant is nonzero.
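
A minimal sketch of the determinant test in Property 5, using numpy and three made-up 3-dimensional vectors:

```python
import numpy as np

# Property 5 (sketch): n vectors of dimension n are linearly independent
# exactly when the determinant of the matrix they form is nonzero.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])      # columns are three 3-dimensional vectors

d = np.linalg.det(A)
print(d, not np.isclose(d, 0))       # approximately 3, nonzero -> linearly independent
```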

Property 6 —— Suppose $\alpha_1,\alpha_2,\dots,\alpha_n$ are $n$ vectors of dimension $m$. If $m < n$, then the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ must be linearly dependent.

(1) The number of vectors in the group corresponds to the number of unknowns of the homogeneous linear system. More vectors means more unknowns, which makes free variables, and hence nonzero solutions, more likely; that is, adding vectors to a group makes linear dependence more likely.
(2) The dimension of the vectors corresponds to the number of equations in the homogeneous linear system. A higher dimension means more equations, which makes it more likely that only the zero solution exists; that is, increasing the dimension of the vectors makes linear independence more likely.
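
The counting argument behind Property 6 can be observed directly: any four 3-dimensional vectors give a homogeneous system with more unknowns than equations. A quick numpy check with random, made-up data:

```python
import numpy as np

# Property 6 (sketch): four 3-dimensional vectors can never be linearly
# independent, because the homogeneous system k1*a1 + ... + k4*a4 = 0 has
# 4 unknowns but only 3 equations, so free variables always appear.
A = np.random.rand(3, 4)             # any four vectors in R^3, as columns

rank = np.linalg.matrix_rank(A)
print(rank, A.shape[1], rank < A.shape[1])   # rank <= 3 < 4, so always dependent
```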

Property 7 —— Suppose the vector group $\alpha_1',\alpha_2',\dots,\alpha_n'$ is obtained from the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ by extension (that is, by appending extra components to each vector). If the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly independent, then the vector group $\alpha_1',\alpha_2',\dots,\alpha_n'$ is linearly independent. The converse does not hold.

Counterexample (for the converse): take the original vector group $\alpha_1=(1,0)^T,\ \alpha_2=(0,1)^T,\ \alpha_3=(0,0)^T$; after extension, $\alpha_1'=(1,0,0)^T,\ \alpha_2'=(0,1,0)^T,\ \alpha_3'=(0,0,1)^T$. The determinant of the extended vectors is $1 \ne 0$, so the vector group $\alpha_1',\alpha_2',\alpha_3'$ is linearly independent, but the original group contains the zero vector and is therefore linearly dependent.

Property 8 —— Suppose the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ consists of pairwise orthogonal nonzero vectors. Then $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly independent. The converse does not hold: a linearly independent group need not be orthogonal.
Proof: Suppose $k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n=0$. Taking the inner product with $\alpha_1$ gives $(\alpha_1,\ k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n)=(\alpha_1,0)=k_1(\alpha_1,\alpha_1)+k_2(\alpha_1,\alpha_2)+\dots+k_n(\alpha_1,\alpha_n)=0$. Since the vectors are pairwise orthogonal, the cross terms vanish and $k_1(\alpha_1,\alpha_1)=0$; since $\alpha_1$ is nonzero, $(\alpha_1,\alpha_1)\ne 0$, so $k_1=0$.
The equation then reduces to $k_2\alpha_2+\dots+k_n\alpha_n=0$. Taking the inner product with $\alpha_2$ gives $(\alpha_2,\ k_2\alpha_2+\dots+k_n\alpha_n)=(\alpha_2,0)=k_2(\alpha_2,\alpha_2)+\dots+k_n(\alpha_2,\alpha_n)=0$; by orthogonality $k_2(\alpha_2,\alpha_2)=0$, and since $\alpha_2$ is nonzero, $k_2=0$. Continuing in the same way, $k_3=\dots=k_n=0$, so the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is linearly independent.
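
The proof above is just repeated use of inner products. The sketch below (numpy, with a hypothetical pairwise orthogonal set) shows that all off-diagonal inner products vanish and that the group has full rank.

```python
import numpy as np

# Hypothetical pairwise orthogonal nonzero vectors (Property 8).
a1 = np.array([1.0,  1.0, 0.0])
a2 = np.array([1.0, -1.0, 0.0])
a3 = np.array([0.0,  0.0, 2.0])
A = np.column_stack([a1, a2, a3])

# The Gram matrix A^T A collects all inner products (a_i, a_j);
# pairwise orthogonality makes it diagonal with nonzero diagonal entries.
print(A.T @ A)
print(np.linalg.matrix_rank(A) == 3)  # True: the group is linearly independent
```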


3. Equivalence of vector groups, maximal linearly independent group and rank of a vector group

3.1 Basic concepts

Linear representation of vector groups —— If every vector in one vector group can be linearly represented by another vector group, we say that the first vector group can be linearly represented by the second.

Equivalence of vector groups —— If two vector groups (of vectors of the same dimension) can be linearly represented by each other, the two vector groups are said to be equivalent.

For two equivalent vector groups, the number of vectors is not necessarily the same.

Maximal linearly independent group and rank of a vector group —— Let $\alpha_1,\alpha_2,\dots,\alpha_n$ be a vector group. If it satisfies:
(1) the group $\alpha_1,\alpha_2,\dots,\alpha_n$ contains $r$ linearly independent vectors;
(2) any $r+1$ vectors of the group (if such exist) are linearly dependent,
then any such set of $r$ linearly independent vectors is called a maximal linearly independent group of $\alpha_1,\alpha_2,\dots,\alpha_n$, and the number of vectors in a maximal linearly independent group is called the rank of the vector group.

Does this look familiar? It is analogous to the rank of a matrix (see the earlier article on matrix rank).

A vector that can be linearly represented by the other vectors is a redundant vector of the group; finding a maximal linearly independent group is essentially the process of removing the redundant vectors (see the sketch after these notes).

A maximal linearly independent group of a vector group is not necessarily unique.

The vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is its own maximal linearly independent group if and only if the rank of the vector group is $n$.
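
Removing redundant vectors can be mechanized. In the sketch below (sympy, with made-up vectors), the pivot columns reported by rref() give one choice of maximal linearly independent group, and their count is the rank of the vector group.

```python
import sympy as sp

# Made-up vectors; a2 = 2*a1 is redundant.
a1 = sp.Matrix([1, 0, 1])
a2 = sp.Matrix([2, 0, 2])
a3 = sp.Matrix([0, 1, 1])

A = sp.Matrix.hstack(a1, a2, a3)     # vectors as columns
_, pivots = A.rref()                 # reduce by elementary row operations

print(pivots)                        # (0, 2): a1 and a3 form a maximal independent group
print(len(pivots))                   # 2: the rank of the vector group
```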

Let the vector groups be $A: \alpha_1,\alpha_2,\dots,\alpha_n$ and $B: \alpha_1,\alpha_2,\dots,\alpha_n,b$. Then there are two possibilities for the ranks of $A$ and $B$:
(1) The rank of $A$ equals the rank of $B$, which holds if and only if $b$ can be linearly represented by $\alpha_1,\alpha_2,\dots,\alpha_n$.
(2) The rank of $A$ is exactly one less than the rank of $B$, which holds if and only if $b$ cannot be linearly represented by $\alpha_1,\alpha_2,\dots,\alpha_n$.
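
Both cases can be observed directly by comparing ranks. A small numpy sketch with illustrative vectors:

```python
import numpy as np

# Appending b to the group either keeps the rank unchanged (b is linearly
# representable by a1, a2) or raises the rank by exactly 1 (it is not).
a1 = np.array([1.0, 0.0, 0.0])
a2 = np.array([0.0, 1.0, 0.0])
A = np.column_stack([a1, a2])

b_in  = np.array([3.0, -2.0, 0.0])   # lies in span{a1, a2}
b_out = np.array([0.0,  0.0, 1.0])   # does not

for b in (b_in, b_out):
    B = np.column_stack([A, b])
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))
# 2 2 -> equal ranks: b can be linearly represented
# 2 3 -> rank grows by 1: b cannot be linearly represented
```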

Let $A=[\alpha_1,\alpha_2,\dots,\alpha_n]$. If the matrix $A$ is transformed into $B=[\beta_1,\beta_2,\dots,\beta_n]$ by a finite number of elementary column transformations, then the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ and the vector group $\beta_1,\beta_2,\dots,\beta_n$ are equivalent.

This parallels the definition of matrix equivalence: two matrices are equivalent when one can be obtained from the other by elementary transformations.
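
The sketch below (numpy, made-up data) applies a few elementary column operations to $A$ and then checks equivalence of the two column groups using the rank criterion $r(A)=r(B)=r(A,B)$; note that this criterion is a standard fact brought in from outside this article.

```python
import numpy as np

# B is obtained from A by elementary column operations, so the column group
# of B should be equivalent to the column group of A. A standard test for
# equivalence (not stated in the article) is r(A) == r(B) == r([A, B]).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

B = A.copy()
B[:, [0, 1]] = B[:, [1, 0]]          # swap the first two columns
B[:, 2] += 3 * B[:, 0]               # add 3 times the first column to the third
B[:, 1] *= 5                         # multiply the second column by a nonzero constant

ranks = (np.linalg.matrix_rank(A),
         np.linalg.matrix_rank(B),
         np.linalg.matrix_rank(np.hstack([A, B])))
print(ranks)                         # (3, 3, 3): the two column groups are equivalent
```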


Written at the end

The properties of the rank of a vector group and the remaining content will be covered later.

So far we have encountered matrices, vectors, and linear systems of equations. The relationship between the rank of a vector group and the solutions of a linear system reminded me of the relationship between matrices and linear systems, and a matrix is itself built from a system of equations, which is fascinating. I plan to sort out the connections among these three in the next article rather than waiting until we study linear systems of equations.


Origin blog.csdn.net/Douglassssssss/article/details/132459687