n-dimensional vectors in linear algebra

    Vector spaces are a central object of study in linear algebra and have a wide range of applications.

1 Operations on n-dimensional vectors

    A vector is a quantity that has both magnitude and direction.

    Recall that m*n numbers aij (i=1,2,...,m; j=1,2,...,n) arranged in a rectangular table with m rows and n columns form an m×n matrix; an n-dimensional vector can be viewed as a matrix with a single row or a single column.

    Two vectors are equal if they have the same magnitude and the same direction.

    An ordered array (a1, a2, ..., an) of n numbers a1, a2, ..., an is called an n-dimensional vector, and the n numbers are called the n components of the vector. A vector whose components are all real numbers is called a real vector, and a vector with complex numbers among its components is called a complex vector. Column vectors are denoted by letters such as α, β, and row vectors by αT, βT.

    Two n-dimensional vectors are equal if and only if they have the same dimension and their corresponding components are equal.

    A vector whose components are all 0 is called the zero vector, denoted 0.

    Linear operations on vectors are defined componentwise, just like the corresponding operations on matrices: for α=(a1, a2, ..., an), β=(b1, b2, ..., bn) and k∈R, α+β=(a1+b1, a2+b2, ..., an+bn) and kα=(ka1, ka2, ..., kan).
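    As a quick illustration (not part of the original text), the componentwise operations can be sketched with NumPy, where 1-D arrays play the role of n-dimensional vectors:

```python
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])
beta = np.array([4.0, 5.0, 6.0])

# vector addition: componentwise
s = alpha + beta        # (a1+b1, a2+b2, a3+b3)

# scalar multiplication: each component is multiplied by k
k = 2.0
m = k * alpha           # (k*a1, k*a2, k*a3)
```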

    We can study an n-ary system of linear equations in terms of n-dimensional vectors: writing the columns of the coefficient matrix as vectors α1, α2, ..., αn and the constant column as β, the system becomes x1α1 + x2α2 + ... + xnαn = β.
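    A small sketch of this viewpoint (the particular 2×2 system is chosen here only for illustration): solving the system gives the coefficients that express β as a combination of the columns.

```python
import numpy as np

# columns alpha_1, alpha_2 of the coefficient matrix
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
beta = np.array([5.0, 10.0])

# x solves x1*alpha1 + x2*alpha2 = beta
x = np.linalg.solve(A, beta)

# beta is recovered as a linear combination of the columns
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
```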

    All n-dimensional vectors over the real number field constitute the n-dimensional vector space, denoted R^n.

2 Linear dependence of vector groups

    If α1, α2, ..., αm, β∈R^n and there is a set of real numbers k1, k2, ..., km satisfying β = k1α1 + k2α2 + ... + kmαm, then β is said to be a linear combination of the vector group α1, α2, ..., αm, and k1, k2, ..., km are called the combination coefficients.

    If there exist real numbers k1, k2, ..., km, not all 0, such that k1α1 + k2α2 + ... + kmαm = 0, then α1, α2, ..., αm are said to be linearly dependent; otherwise they are linearly independent.

    When m = n, a necessary and sufficient condition for α1, α2, ..., αn to be linearly dependent is that |A| = 0, where A = (α1, α2, ..., αn).

    If the rank of the matrix A = (α1, α2, ..., αm) is less than m, then α1, α2, ..., αm are linearly dependent; otherwise they are linearly independent.
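    The rank criterion is easy to check numerically; a sketch (the three sample vectors are chosen here so that one is a multiple of another):

```python
import numpy as np

a1 = np.array([1.0, 2.0, 3.0])
a2 = np.array([2.0, 4.0, 6.0])   # a2 = 2*a1, so the group is dependent
a3 = np.array([0.0, 1.0, 0.0])

# put the vectors as columns of A and compare rank with the number of vectors
A = np.column_stack([a1, a2, a3])
r = np.linalg.matrix_rank(A)
dependent = r < A.shape[1]       # rank < m  =>  linearly dependent
```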

    A necessary and sufficient condition for α1, α2, ..., αm (m >= 2) to be linearly dependent is that at least one of the vectors can be linearly represented by the remaining m-1 vectors.

    Suppose two groups of n-dimensional vectors A: α1, α2, ..., αs and B: β1, β2, ..., βt are given. If every vector in B can be linearly represented by the vectors in A, then B is said to be linearly representable by A.

    If A can be linearly represented by B, and B can be linearly represented by A, then A and B are said to be equivalent, denoted A~B.

    If a partial group B: αi1, αi2, ..., αir of the vector group A: α1, α2, ..., αs is linearly independent, and adding any other vector α of A makes αi1, αi2, ..., αir, α linearly dependent, then B is called a maximal linearly independent group of A.

    The number of vectors in a maximal linearly independent group of the vector group A is called the rank of the vector group, denoted r(A).

    If the vector group A can be linearly represented by the vector group B, then r(A)<=r(B)

    r(A+B) <= r(A) + r(B), r(AB) <= min{r(A), r(B)}

    The rank of the row vector group of a matrix A is called the row rank of A, and the rank of its column vector group is called the column rank of A.

    Elementary row transformations of a matrix do not change the linear relations among its column vectors, and elementary column transformations do not change the linear relations among its row vectors.

3 Vector spaces

    Let V be a non-empty subset of the n-dimensional vector space R^n. If

  • V is closed under vector addition, that is, for any α, β∈V, α+β∈V

  • V is closed under scalar multiplication, that is, for any α∈V and any k∈R, kα∈V

    then the set V is called a vector space (a subspace of R^n)

    Suppose α1, α2, ..., αr are vectors in the vector space V, satisfying

  • α1, α2, ..., αr are linearly independent

  • Any vector in V can be linearly represented by α1, α2, ..., αr

    Then α1, α2, ..., αr are called a basis of the vector space V, r is called the dimension of the vector space V, and V is called an r-dimensional vector space

    Let A: α1, α2, ..., αn and B: β1, β2, ..., βn be two bases of R^n. If (β1, β2, ..., βn) = (α1, α2, ..., αn)C, then the matrix C is called the transition matrix from the basis α to the basis β.
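    Since (β1, ..., βn) = (α1, ..., αn)C, the transition matrix can be computed as C = A^{-1}B when the bases are stored as columns; a sketch with two bases of R^2 chosen for illustration:

```python
import numpy as np

# basis alpha as columns of A, basis beta as columns of B
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
B = np.array([[2.0, 2.0],
              [0.0, 1.0]])

# solve A @ C = B for the transition matrix C (equivalent to C = A^{-1} B)
C = np.linalg.solve(A, B)
```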

    Let α=(a1,a2,...,an)^T and β=(b1,b2,...,bn)^T be two vectors in R^n. The inner product of α and β, written (α,β), is the real number (α,β) = a1b1 + a2b2 + ... + anbn.

    The length of the vector α is called the modulus |α| of α: |α| = sqrt((α,α)) = sqrt(a1^2 + a2^2 + ... + an^2).

    A vector of length 1 is called a unit vector.

    For two non-zero vectors α, β∈R^n, the angle θ between α and β is defined by cos θ = (α,β) / (|α||β|), with 0 <= θ <= π.

    If (α,β)=0, then α and β are said to be orthogonal; that is, a necessary and sufficient condition for two non-zero vectors to be orthogonal is θ=π/2.
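    The inner product, modulus, and angle can be sketched in a few lines (the two sample vectors are chosen here to be orthogonal):

```python
import numpy as np

alpha = np.array([1.0, 0.0])
beta = np.array([0.0, 2.0])

ip = float(np.dot(alpha, beta))            # inner product (alpha, beta)
mod_a = float(np.linalg.norm(alpha))       # modulus |alpha|
cos_theta = ip / (np.linalg.norm(alpha) * np.linalg.norm(beta))
theta = float(np.arccos(cos_theta))        # angle between alpha and beta
```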

    If the non-zero vectors α1, α2, ..., αr are pairwise orthogonal, then α1, α2, ..., αr are called an orthogonal vector group, and an orthogonal vector group is necessarily linearly independent.

    A square matrix A is called an orthogonal matrix if it satisfies A^T A = E (equivalently, A^{-1} = A^T).

    Orthogonal matrices have the following properties:

  • If A is orthogonal, then |A| = 1 or |A| = -1

  • If A is orthogonal, then A^{-1} = A^T is also orthogonal

  • The product of two orthogonal matrices of the same order is orthogonal

  • The row (column) vectors of an orthogonal matrix form an orthogonal group of unit vectors
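    These properties can be verified on a concrete example; a 2×2 rotation matrix (used here purely as an illustrative orthogonal matrix):

```python
import numpy as np

t = np.pi / 3
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])    # rotation matrix, which is orthogonal

is_orthogonal = np.allclose(Q.T @ Q, np.eye(2))         # Q^T Q = E
det_ok = np.isclose(abs(np.linalg.det(Q)), 1.0)         # |Q| = +-1
inv_is_transpose = np.allclose(np.linalg.inv(Q), Q.T)   # Q^{-1} = Q^T
```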


Origin blog.csdn.net/qq_40732350/article/details/128346088