[Mathematics for postgraduate entrance examination] Linear Algebra Chapter 3 - Vectors | 1) Basic concepts, linear dependence and linear representation of vector groups


Introduction

Vectors are a key and difficult topic in linear algebra. A vector is itself a matrix (with a single row or column), and a matrix can be viewed as being composed of vectors, so vector groups and matrices are very closely related.


1. The concept and operations of vectors

1.1 Basic concepts

Vector – a quantity that has both magnitude (length) and direction is called a vector. $(a_1,a_2,\dots,a_n)^T$ and $(a_1,a_2,\dots,a_n)$ are called an $n$-dimensional column vector and an $n$-dimensional row vector, respectively, where $a_i$ is the $i$-th component of the vector. Unless otherwise stated, the vectors referred to below are column vectors.

Modulus of a vector – let $\alpha=(a_1,a_2,\dots,a_n)^T$. Then $\sqrt{a_1^2+a_2^2+\dots+a_n^2}$ is called the modulus (or length) of $\alpha$, denoted $|\alpha|$.

Unitization of a vector – let $\alpha=(a_1,a_2,\dots,a_n)^T$ be a non-zero vector. The vector with the same direction as $\alpha$ and length 1 is called the unit vector corresponding to $\alpha$; writing $\alpha^0=\frac{1}{|\alpha|}\alpha$, we call $\alpha^0$ the normalized (unit) vector of $\alpha$.

Three arithmetic operations on vectors – addition, subtraction, and scalar multiplication (multiplication by a number).

Inner product of vectors – let $\alpha=(a_1,a_2,\dots,a_n)^T$ and $\beta=(b_1,b_2,\dots,b_n)^T$. Then $a_1b_1+a_2b_2+\dots+a_nb_n$ is called the inner product of $\alpha$ and $\beta$, denoted $(\alpha,\beta)$.
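As a quick numerical check of the modulus, unitization, and inner product defined above, here is a minimal sketch using NumPy (the sample vectors are chosen only for illustration):

```python
import numpy as np

alpha = np.array([1.0, 2.0, 2.0])     # a sample 3-dimensional vector
beta  = np.array([2.0, -1.0, 0.0])

modulus = np.linalg.norm(alpha)        # |alpha| = sqrt(1^2 + 2^2 + 2^2) = 3
alpha0  = alpha / modulus              # normalized vector alpha^0, has length 1
inner   = np.dot(alpha, beta)          # (alpha, beta) = 1*2 + 2*(-1) + 2*0 = 0

print(modulus)                         # 3.0
print(np.linalg.norm(alpha0))          # 1.0
print(inner)                           # 0.0
```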

1.2 Properties of Vector Operations

(1) Properties of the three arithmetic operations
[Figure: properties of the three arithmetic operations]
(2) Properties of the inner product

  1. $(\alpha,\beta)=(\beta,\alpha)=\alpha^T\beta=\beta^T\alpha.$
  2. $(\alpha,\alpha)=\alpha^T\alpha=|\alpha|^2$, and $(\alpha,\alpha)=0$ if and only if $\alpha=0$.
  3. $(\alpha,\ k_1\beta_1+k_2\beta_2+\dots+k_n\beta_n)=k_1(\alpha,\beta_1)+k_2(\alpha,\beta_2)+\dots+k_n(\alpha,\beta_n).$
  4. If $(\alpha,\beta)=0$, i.e. $a_1b_1+a_2b_2+\dots+a_nb_n=0$, then $\alpha$ and $\beta$ are said to be orthogonal, denoted $\alpha\perp\beta$. In particular, the zero vector is orthogonal to every vector.
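Properties 1 and 3 are easy to verify numerically; a minimal sketch (the vectors and coefficients below are made up for the check):

```python
import numpy as np

alpha = np.array([1.0, 2.0, 2.0])
beta1 = np.array([2.0, -1.0, 0.0])
beta2 = np.array([0.0, 1.0, 3.0])
k1, k2 = 2.0, -3.0

# Property 1: symmetry of the inner product
print(np.dot(alpha, beta1) == np.dot(beta1, alpha))            # True

# Property 3: linearity in the second argument
lhs = np.dot(alpha, k1 * beta1 + k2 * beta2)
rhs = k1 * np.dot(alpha, beta1) + k2 * np.dot(alpha, beta2)
print(np.isclose(lhs, rhs))                                    # True
```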

2. Linear dependence and linear representation of vector groups

2.1 Theoretical background

For the homogeneous linear system

$$\begin{cases} a_{11}x_1+a_{12}x_2+\dots+a_{1n}x_n=0 \\ a_{21}x_1+a_{22}x_2+\dots+a_{2n}x_n=0 \\ \qquad\qquad\vdots \\ a_{m1}x_1+a_{m2}x_2+\dots+a_{mn}x_n=0 \end{cases} \qquad (\mathrm{I})$$

and the inhomogeneous linear system

$$\begin{cases} a_{11}x_1+a_{12}x_2+\dots+a_{1n}x_n=b_1 \\ a_{21}x_1+a_{22}x_2+\dots+a_{2n}x_n=b_2 \\ \qquad\qquad\vdots \\ a_{m1}x_1+a_{m2}x_2+\dots+a_{mn}x_n=b_m \end{cases} \qquad (\mathrm{II})$$

Let $\alpha_1=(a_{11},a_{21},\dots,a_{m1})^T$, $\alpha_2=(a_{12},a_{22},\dots,a_{m2})^T$, $\dots$, $\alpha_n=(a_{1n},a_{2n},\dots,a_{mn})^T$, and $b=(b_1,b_2,\dots,b_m)^T$. Then systems (I) and (II) can be written in the vector forms
$$x_1\alpha_1+x_2\alpha_2+\dots+x_n\alpha_n=0 \qquad (\mathrm{I})$$
$$x_1\alpha_1+x_2\alpha_2+\dots+x_n\alpha_n=b \qquad (\mathrm{II})$$
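In matrix language, $x_1\alpha_1+x_2\alpha_2+\dots+x_n\alpha_n$ is exactly $Ax$, where $A=(\alpha_1,\alpha_2,\dots,\alpha_n)$ is the coefficient matrix. A minimal NumPy sketch with a made-up $2\times 3$ system:

```python
import numpy as np

# Hypothetical coefficient matrix A; its columns are alpha_1, alpha_2, alpha_3
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])
b = np.array([5.0, 2.0])
x = np.array([1.0, 2.0, 0.0])    # a candidate solution

# x_1*alpha_1 + x_2*alpha_2 + x_3*alpha_3 is exactly A @ x
print(A @ x)                      # [5. 2.]
print(np.allclose(A @ x, b))      # True, so this x solves system (II)
```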

1. Let $\alpha_1,\alpha_2,\dots,\alpha_n$ be a group of vectors. An expression of the form $k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n$ is called a linear combination of the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$.
2. Let $\alpha_1,\alpha_2,\dots,\alpha_n$ be a group of vectors and $b$ a vector. If there exists a set of numbers $k_1,k_2,\dots,k_n$ such that $b=k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n$, then the vector $b$ is said to be linearly represented by the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$.
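For instance (a made-up example): with $\alpha_1=(1,1)^T$ and $\alpha_2=(1,2)^T$, the vector $b=(3,5)^T$ satisfies $b=1\cdot\alpha_1+2\cdot\alpha_2$, so $b$ is a linear combination of $\alpha_1,\alpha_2$ and can be linearly represented by them with coefficients $k_1=1,\ k_2=2$.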

2.2 Basic concepts of linear dependence and linear representation

(1) Linear dependence

For the homogeneous linear system $x_1\alpha_1+x_2\alpha_2+\dots+x_n\alpha_n=0 \ (*)$:

(1) If system (*) has only the zero solution, the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is said to be linearly independent.

(2) If system (*) has a non-zero solution, that is, there exists a set of numbers $k_1,k_2,\dots,k_n$, not all zero, such that $k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n=0$, then the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$ is said to be linearly dependent.
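In computation, whether (*) has only the zero solution comes down to whether the rank of $A=(\alpha_1,\dots,\alpha_n)$ equals the number of vectors $n$ (the standard rank criterion, used here only for the sketch; the sample vectors are made up):

```python
import numpy as np

def is_linearly_independent(vectors):
    """A group of vectors is linearly independent iff the matrix having them
    as columns has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

# Example: the standard basis vectors of R^3 are linearly independent
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # True

# Example: (2,4,6)^T = 2*(1,2,3)^T, so this group is linearly dependent
print(is_linearly_independent([[1, 2, 3], [2, 4, 6], [0, 1, 1]]))   # False
```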

(2) Linear representation

For the non-homogeneous linear system $x_1\alpha_1+x_2\alpha_2+\dots+x_n\alpha_n=b \ (**)$:

(1) If system (**) has a solution, i.e. there exist numbers $k_1,k_2,\dots,k_n$ such that $b=k_1\alpha_1+k_2\alpha_2+\dots+k_n\alpha_n$, then the vector $b$ can be linearly represented by the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$.

(2) If system (**) has no solution, then the vector $b$ cannot be linearly represented by the vector group $\alpha_1,\alpha_2,\dots,\alpha_n$.
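Whether (**) has a solution can be checked with the usual rank criterion: it is solvable exactly when the coefficient matrix and the augmented matrix have the same rank. A minimal sketch reusing the made-up vectors from the example above:

```python
import numpy as np

alpha1 = np.array([1.0, 1.0])
alpha2 = np.array([1.0, 2.0])
b      = np.array([3.0, 5.0])

A  = np.column_stack([alpha1, alpha2])       # coefficient matrix
Ab = np.column_stack([alpha1, alpha2, b])    # augmented matrix (A | b)

# b is linearly representable  <=>  rank(A) == rank(A | b)
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(Ab))    # True

# Here A is square and invertible, so the coefficients follow directly
k = np.linalg.solve(A, b)
print(k)                                     # [1. 2.]  ->  b = 1*alpha1 + 2*alpha2
```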

2.3 Properties of linear dependence and linear representation of vector groups

There is a lot of content in this piece, and I will put it in the next article.

Origin blog.csdn.net/Douglassssssss/article/details/132437895