[Advanced Algebra] Linear Space - Summary of Knowledge Points

1. Sets and Mappings

Number field : Let P be a set of complex numbers containing 0 and 1. If the sum, difference, product, and quotient (with nonzero divisor) of any two numbers in P still lie in P, then P is called a number field.

Common number field: complex number field C; real number field R; rational number field Q.

Set : a collection of objects regarded as a whole. For example, a straight line is a set of points, and all the solutions of a system of linear equations form a set, namely its solution set.

Elements : The things that make up a set are called elements of the set.

Ways to describe a set :

  1. Description method: M = {x | x has property P}
    e.g. M = {(x, y) | x^2 + y^2 = 4, x, y ∈ R}
  2. Enumeration method: M = {a_1, a_2, …, a_n}
    e.g. N = {0, 1, 2, 3, …}
  3. Empty set : the set containing no elements, e.g. the solution set of an inconsistent system of linear equations.

Relations between sets :

  1. B is a subset of A, i.e. B is contained in A, written B ⊆ A
  2. A and B are equal, written A = B

Operations between sets : intersection, union

Mapping : Let M and M' be two non-empty sets. If a rule σ assigns to each element a in M a uniquely determined element a' in M', then σ is called a mapping from M to M', denoted σ: M → M'. Here a' is called the **image of a under the mapping σ**, and a is called a **preimage of a' under the mapping σ**, written σ(a) = a' or σ: a ↦ a'.

A mapping from a set M to M itself is called a transformation of M.

Let the mapping σ: M → M', the set σ(M) = {σ(a)|a∈M} is called the image of M under the mapping σ , usually denoted as Imσ. Clearly, Imσ ⊆ M'

Identity map : the map σ that sends each element of M to itself is called the identity map (or unit map) of M, denoted I_M.

Functions : Functions can be thought of as a special case of maps.

Multiplication of mappings : Let σ: M → M' and τ: M' → M'' be mappings. Their product τσ is defined by (τσ)(a) = τ(σ(a)), that is, first apply σ and then τ; the product τσ is a mapping from M to M''.

Associative law for multiplication of mappings : For mappings σ: M → M', τ: M' → M'', φ: M'' → M''', we have (φτ)σ = φ(τσ).

Properties of mappings : Let σ: M → M' be a mapping.

  1. If Imσ = M', i.e. for every y ∈ M' there exists x ∈ M such that y = σ(x), then σ is called a surjection from M onto M' (σ is "onto");

  2. If distinct elements of M have distinct images, i.e. a_1 ≠ a_2 implies σ(a_1) ≠ σ(a_2), then σ is called an injection from M to M' (σ is "1-1");

  3. If σ is both injective and surjective, then σ is called a bijection (a 1-1 correspondence).

    For finite sets, the necessary and sufficient condition for the existence of a bijection between two sets is that they contain the same number of elements;

    For a finite set A and its subset B, if B≠A (that is, B is a proper subset of A), then there cannot be a 1-1 correspondence between A and B; but this is not necessarily the case for an infinite set.

Invertible mapping : Let σ: M → M' be a mapping. If there exists a mapping τ: M' → M such that τσ = I_M and στ = I_{M'}, then σ is called an invertible mapping and τ is called the inverse mapping of σ, denoted σ^{-1}.

  • If σ is invertible, then σ^{-1} is also invertible, and (σ^{-1})^{-1} = σ
  • If σ(a) = a', then σ^{-1}(a') = a
  • σ is invertible if and only if σ is a bijection (1-1 correspondence)
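As an illustrative sketch (not from the text), finite maps can be modeled as Python dicts to check injectivity and surjectivity, compose maps, and invert a bijection; all names below are hypothetical:

```python
def is_injective(f):
    # distinct preimages must have distinct images
    return len(set(f.values())) == len(f)

def is_surjective(f, codomain):
    # every element of the codomain must be an image
    return set(f.values()) == set(codomain)

def compose(tau, sigma):
    # (tau . sigma)(a) = tau(sigma(a)): first sigma, then tau
    return {a: tau[sigma[a]] for a in sigma}

def inverse(f):
    # only a bijection has an inverse
    assert is_injective(f)
    return {b: a for a, b in f.items()}

sigma = {1: 'b', 2: 'c', 3: 'a'}        # a bijection {1,2,3} -> {'a','b','c'}
tau = {'a': 10, 'b': 20, 'c': 30}

print(is_injective(sigma), is_surjective(sigma, 'abc'))  # True True
print(compose(tau, sigma))               # {1: 20, 2: 30, 3: 10}
print(compose(inverse(sigma), sigma))    # the identity map {1: 1, 2: 2, 3: 3}
```

Note that composing a map with its inverse in either order yields the identity map, matching the definition τσ = I_M.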

2. Definition and simple properties of linear spaces

The purpose of introducing linear spaces : to study whether a system of linear equations has solutions, how many solutions it has, and the structure of the solution set when there are infinitely many solutions.

Definition of linear space : Let V be a non-empty set and P a number field. Suppose an algebraic operation called addition is defined on V: for all α, β ∈ V there is a unique element γ ∈ V corresponding to them, called the sum of α and β, written γ = α + β. Suppose also an operation called scalar multiplication is defined between the elements of P and V: for all α ∈ V and k ∈ P there is a unique element δ ∈ V corresponding to them, called the product of k and α, written δ = kα.

If addition and scalar multiplication are closed on V and satisfy the following eight rules, then V is called a linear space over the number field P :

Addition rules:

  • α + β = β + α

  • (α + β) + γ = α + (β + γ)

  • There is an element 0 in V, for ∀ɑ ∈ V, α + 0 = α

    (Element 0 with this property is called the zero element of V)

  • For ∀ɑ ∈ V, there is an element β in V such that α + β = 0

    (β is called the negative element of α)

Scalar multiplication rules:

  • 1α = α
  • k(lα) = (kl)α

Rules connecting scalar multiplication and addition:

  • (k + l)α = kα + lα
  • k(α + β) = kα + kβ

Note:

  1. Addition and scalar multiplication satisfying the above eight rules are also called linear operations

  2. The elements of a linear space are also called vectors , and the linear space is also called a vector space. But a vector here is not necessarily an ordered array of numbers.

  3. If the set is not closed under the defined addition or scalar multiplication, or the operations are closed but fail any one of the eight rules, then the set does not form a linear space.

    For example, if the vector [0, 0]^T is removed from the linear space R^2, the remaining set is not closed under addition (e.g. α + (−α) = 0 no longer lies in the set), so it is not a linear space. Hence every linear space must contain the zero vector ;

  4. A linear space containing only one vector, the zero vector, i.e. {0}, is called the zero space .

  5. P[x] : the ring of polynomials in one indeterminate over the number field P, i.e. all polynomials with coefficients in P.

    P[x]_n denotes the space of polynomials over P of degree at most n, together with the zero polynomial.

  6. P^{m×n} : the linear space of all m×n matrices over the number field P; its dimension is mn.
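A minimal numerical check of note 3 (a sketch assuming numpy; the vectors are illustrative): removing the zero vector from R^2 destroys closure under addition, since α + (−α) = 0.

```python
import numpy as np

a = np.array([1.0, 2.0])      # a nonzero vector of R^2
b = -a                        # its negative, also nonzero
s = a + b                     # the sum is the zero vector, which was removed
print(s, np.allclose(s, 0))   # [0. 0.] True -> closure under addition fails
```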

Properties of linear spaces :

  1. The zero element is unique;
  2. For ∀α ∈ V, the negative element of α is unique, denoted −α;
  3. 0α = 0, k0 = 0, (−1)α = −α, k(α − β) = kα − kβ;
  4. If kα = 0, then k = 0 or α = 0.
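These properties follow directly from the eight axioms; for example, short derivations of 0α = 0 and (−1)α = −α:

```latex
% 0a = 0: add -(0a) to both sides of 0a = 0a + 0a
0\alpha = (0 + 0)\alpha = 0\alpha + 0\alpha
\quad\Longrightarrow\quad 0\alpha = 0 .

% (-1)a = -a: (-1)a is a negative of a, and negatives are unique
\alpha + (-1)\alpha = 1\alpha + (-1)\alpha = \bigl(1 + (-1)\bigr)\alpha
= 0\alpha = 0
\quad\Longrightarrow\quad (-1)\alpha = -\alpha .
```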

3. Dimensions, basis and coordinates

Linear relations between vectors in a linear space : From linear algebra we know that if V is a linear space over the number field P, then:

  1. For α_1, α_2, ···, α_r ∈ V (r ≥ 1) and k_1, k_2, ···, k_r ∈ P, the expression k_1α_1 + k_2α_2 + ··· + k_rα_r is called a linear combination of the vector group α_1, α_2, ···, α_r ;

  2. For α_1, α_2, ···, α_r, β ∈ V, if there exist k_1, k_2, ···, k_r ∈ P such that β = k_1α_1 + k_2α_2 + ··· + k_rα_r, then β is said to be linearly expressible by the vector group α_1, α_2, ···, α_r ;

    If every vector of the group β_1, β_2, ···, β_s is linearly expressible by the group α_1, α_2, ···, α_r, then the group β_1, β_2, ···, β_s is said to be linearly expressible by the group α_1, α_2, ···, α_r ;

    Two vector groups that are linearly expressible by each other are called equivalent ;

  3. For α_1, α_2, ···, α_r ∈ V, if there exist k_1, k_2, ···, k_r ∈ P, not all zero, such that k_1α_1 + k_2α_2 + ··· + k_rα_r = 0, then the vector group α_1, α_2, ···, α_r is said to be linearly dependent ;

    If k_1α_1 + k_2α_2 + ··· + k_rα_r = 0 holds only when k_1 = k_2 = ··· = k_r = 0, then the vector group α_1, α_2, ···, α_r is linearly independent .
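As a concrete sketch (assuming numpy; the vectors are illustrative), linear dependence of a vector group can be tested by comparing the rank of the matrix whose columns are the vectors with the number of vectors:

```python
import numpy as np

a1, a2, a3 = np.array([1, 0, 1]), np.array([0, 1, 1]), np.array([1, 1, 2])

A = np.column_stack([a1, a2, a3])     # vectors as columns
rank = np.linalg.matrix_rank(A)
print(rank)                           # 2 < 3, so the group is linearly dependent
print(np.array_equal(a3, a1 + a2))    # True: indeed a3 = a1 + a2
```

Rank equal to the number of vectors would mean the only vanishing linear combination is the trivial one, i.e. linear independence.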

Conclusion :

  1. The vector group α_1, α_2, ···, α_r (r ≥ 2) is linearly dependent ⇔ some vector of the group can be linearly expressed by the remaining vectors;

  2. If the vector group α_1, α_2, ···, α_r is linearly independent and can be linearly expressed by the vector group β_1, β_2, ···, β_s, then r ≤ s;

    If the linearly independent vector group α_1, α_2, ···, α_r is equivalent to the linearly independent vector group β_1, β_2, ···, β_s, then r = s;

  3. If the vector group α_1, α_2, ···, α_r is linearly independent but the group α_1, α_2, ···, α_r, β is linearly dependent, then β can be linearly expressed by α_1, α_2, ···, α_r, and the expression is unique.

Infinite-dimensional linear space : If any number of linearly independent vectors can be found in the linear space V, then V is said to be an infinite-dimensional linear space .

Finite-dimensional linear spaces :

  1. n-dimensional linear space : if the linear space V contains n linearly independent vectors, but any n+1 vectors in V are linearly dependent, then V is called an n-dimensional linear space , often written dimV = n (the dimension of the zero space is defined to be 0)
  2. Basis : in an n-dimensional linear space V, any n linearly independent vectors ε_1, ε_2, ···, ε_n are called a basis of V ;
  3. Coordinates : let ε_1, ε_2, ···, ε_n be a basis of the linear space V and α ∈ V. If α = a_1ε_1 + a_2ε_2 + ··· + a_nε_n with a_1, a_2, ···, a_n ∈ P, then the n-tuple (a_1, a_2, ···, a_n) is called the coordinates of α under the basis ε_1, ε_2, ···, ε_n. For a given basis the coordinates are unique, but in general the coordinates of α differ from basis to basis.

Determining a basis and the dimension of a linear space : If the vector group α_1, α_2, ···, α_n in the linear space V satisfies:

  1. Independence : α_1, α_2, ···, α_n are linearly independent;
  2. Representability : every β ∈ V can be linearly expressed by α_1, α_2, ···, α_n,

then V is an n-dimensional linear space and α_1, α_2, ···, α_n is a basis of V.

Standard basis : In general, the vector space P^n = {(a_1, a_2, …, a_n) | a_i ∈ P, i = 1, 2, …, n} is n-dimensional, and ε_1 = (1, 0, …, 0), ε_2 = (0, 1, …, 0), ···, ε_n = (0, 0, …, 1) form a basis of P^n, called the standard basis of P^n.

Note:

  • The basis of an n-dimensional linear space V is not unique: any n linearly independent vectors in V form a basis of V.
  • Any two bases of V are equivalent vector groups.
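A small sketch of computing coordinates (assuming numpy; the basis below is illustrative): stacking the basis vectors as the columns of a matrix E, the coordinates x of a vector α solve the linear system Ex = α.

```python
import numpy as np

# columns of E are the basis vectors e1, e2, e3 (a basis of R^3)
E = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])
alpha = np.array([2., 3., 4.])

x = np.linalg.solve(E, alpha)     # coordinates of alpha under this basis
print(x)                          # alpha = 3 e1 - 1 e2 + 4 e3
```

Uniqueness of the coordinates corresponds to E being invertible, which holds exactly when the columns are linearly independent.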

4. Base transformation and coordinate transformation

Vector form notation :

  1. Let V be an n-dimensional linear space over the number field P, α_1, α_2, ···, α_n a vector group in V, and β ∈ V. If β = x_1α_1 + x_2α_2 + ··· + x_nα_n, this is written as

    β = (α_1, α_2, ···, α_n) (x_1, x_2, ···, x_n)^T

  2. Let V be an n-dimensional linear space over the number field P, and let α_1, α_2, ···, α_n and β_1, β_2, ···, β_s be two vector groups in V. If

    β_j = a_{1j}α_1 + a_{2j}α_2 + ··· + a_{nj}α_n, j = 1, 2, ···, s,

this is denoted as

    (β_1, β_2, ···, β_s) = (α_1, α_2, ···, α_n) A, where A = (a_{ij}) is an n×s matrix.

Basis transformation : Let V be an n-dimensional linear space over the number field P, and let ε_1, ε_2, ···, ε_n and ε'_1, ε'_2, ···, ε'_n be two bases of V, with

    (ε'_1, ε'_2, ···, ε'_n) = (ε_1, ε_2, ···, ε_n) A.

Then the coefficient matrix A on the right side is called the transition matrix from the basis ε_1, ε_2, ···, ε_n to the basis ε'_1, ε'_2, ···, ε'_n, and the formula above is called the basis transformation formula from ε_1, ε_2, ···, ε_n to ε'_1, ε'_2, ···, ε'_n.

Properties :

  1. Every transition matrix is invertible; conversely, any invertible matrix can be regarded as a transition matrix between two bases;
  2. If the transition matrix from the basis ε_1, ε_2, ···, ε_n to the basis ε'_1, ε'_2, ···, ε'_n is A, then the transition matrix from ε'_1, ε'_2, ···, ε'_n back to ε_1, ε_2, ···, ε_n is A^{-1};
  3. If the transition matrix from the basis α_1, α_2, ···, α_n to the basis β_1, β_2, ···, β_n is A, and the transition matrix from β_1, β_2, ···, β_n to the basis γ_1, γ_2, ···, γ_n is B, then the transition matrix from α_1, α_2, ···, α_n to γ_1, γ_2, ···, γ_n is AB.
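Properties 2 and 3 can be checked numerically (a sketch assuming numpy; the bases are illustrative and stored as matrix columns, so the transition matrix from basis E to basis F solves EA = F):

```python
import numpy as np

E = np.eye(3)                              # first basis (columns)
F = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])               # second basis (columns)
G = 2 * np.eye(3)                          # third basis (columns)

A = np.linalg.solve(E, F)                  # transition matrix E -> F
B = np.linalg.solve(F, G)                  # transition matrix F -> G

print(np.allclose(np.linalg.solve(F, E), np.linalg.inv(A)))   # property 2
print(np.allclose(np.linalg.solve(E, G), A @ B))              # property 3
```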

Coordinate transformation : Let V be an n-dimensional linear space over the number field P, let ε_1, ε_2, ···, ε_n and ε'_1, ε'_2, ···, ε'_n be two bases of V, and let

    (ε'_1, ε'_2, ···, ε'_n) = (ε_1, ε_2, ···, ε_n) A.

Suppose ξ ∈ V has coordinates (x_1, x_2, ···, x_n) and (x'_1, x'_2, ···, x'_n) under the bases ε_1, ε_2, ···, ε_n and ε'_1, ε'_2, ···, ε'_n respectively, i.e.

    ξ = x_1ε_1 + x_2ε_2 + ··· + x_nε_n = x'_1ε'_1 + x'_2ε'_2 + ··· + x'_nε'_n.

Writing X = (x_1, x_2, ···, x_n)^T and X' = (x'_1, x'_2, ···, x'_n)^T, we have

    X = AX',    X' = A^{-1}X.

These two formulas are called the coordinate transformation formulas of the vector ξ under the basis transformation.
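An illustrative numeric check of the formula X = AX' (a sketch assuming numpy; the bases and the vector are hypothetical):

```python
import numpy as np

E = np.eye(3)                      # old basis (columns)
A = np.array([[1., 2., 0.],
              [0., 1., 0.],
              [0., 0., 3.]])       # invertible transition matrix
Ep = E @ A                         # new basis E' = E A (columns)

xi = np.array([2., 3., 4.])
X = np.linalg.solve(E, xi)         # coordinates of xi in the old basis
Xp = np.linalg.solve(Ep, xi)       # coordinates of xi in the new basis

print(np.allclose(X, A @ Xp))      # True: X = A X'
```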

5. Linear Subspace

Definition of linear subspace : Let V be a linear space over the number field P and W ⊆ V (W ≠ ∅). If W itself forms a linear space over P under the two operations of V (addition and scalar multiplication), then W is called a linear subspace of V , or simply a subspace .

Note:

  • The dimension of any linear subspace cannot exceed the dimension of the entire space

  • The subset W = {0} containing only zero vectors is a linear subspace of V, called the zero subspace of V

Solution space : Consider a homogeneous system of linear equations in n unknowns,

    AX = 0,

where A is the coefficient matrix. The set W of all its solution vectors, under the usual vector addition and scalar multiplication, is a subspace of the n-dimensional vector space P^n; W is called the solution space of the system.

Note:

  • Dimension of the solution space: dimW = n − rank(A)
  • A fundamental system of solutions of the equations is a basis of the solution space W
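Both notes can be verified with sympy's exact nullspace (an illustrative sketch; the matrix A is hypothetical):

```python
from sympy import Matrix

A = Matrix([[1, 2, 1, 0],
            [2, 4, 0, 2]])            # 2 equations, n = 4 unknowns
W_basis = A.nullspace()               # a fundamental system of solutions

print(A.rank())                       # 2
print(len(W_basis))                   # 2 = n - rank(A)
print(all(A * v == Matrix([0, 0]) for v in W_basis))  # each is a solution
```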

Generated subspace : Let V be a linear space over the number field P and α_1, α_2, ···, α_n a vector group in V. Then the subspace

    L(α_1, α_2, ···, α_n) = {k_1α_1 + k_2α_2 + ··· + k_nα_n | k_i ∈ P, i = 1, 2, ···, n}

is called the subspace of V generated (or spanned) by α_1, α_2, ···, α_n, and α_1, α_2, ···, α_n is called a set of generators of L(α_1, α_2, ···, α_n).

Theorem :

  • Let W be any subspace of the n-dimensional linear space V and α_1, α_2, ···, α_r a basis of W; then W = L(α_1, α_2, ···, α_r)

  • If α_1, α_2, ···, α_r and β_1, β_2, ···, β_s are two vector groups in the linear space V, then L(α_1, α_2, ···, α_r) = L(β_1, β_2, ···, β_s) ⇔ the group α_1, α_2, ···, α_r is equivalent to the group β_1, β_2, ···, β_s

  • The dimension of the generated subspace L(α_1, α_2, ···, α_r) equals the rank of the vector group α_1, α_2, ···, α_r

    Corollary: let α_1, α_2, ···, α_s be a vector group in V, not all zero, and let α_{i1}, α_{i2}, ···, α_{ir} (r ≤ s) be one of its maximal linearly independent subgroups; then L(α_1, α_2, ···, α_s) = L(α_{i1}, α_{i2}, ···, α_{ir})

  • Let α_1, α_2, ···, α_n be a basis of the n-dimensional linear space V over P and A an n×s matrix over P. If (β_1, β_2, ···, β_s) = (α_1, α_2, ···, α_n) A, then the dimension of L(β_1, β_2, ···, β_s) = rank(A)

  • Basis extension theorem : If W is an m-dimensional subspace of the n-dimensional linear space V and α_1, α_2, ···, α_m is a basis of W, then this vector group can be extended to a basis of V. That is, n−m vectors α_{m+1}, α_{m+2}, ···, α_n can be found in V such that α_1, α_2, ···, α_m, α_{m+1}, ···, α_n form a basis of V.
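The theorem dim L(β_1, ···, β_s) = rank(A) can be sketched numerically (assuming numpy; the basis and the matrix A are illustrative):

```python
import numpy as np

# alphas: a basis of R^3 (columns); A: an n x s matrix of rank 2
alphas = np.eye(3)
A = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, 0]])
betas = alphas @ A        # (b1, b2, b3) = (a1, a2, a3) A

# dim L(b1, b2, b3) = rank of the beta group = rank(A)
print(np.linalg.matrix_rank(betas), np.linalg.matrix_rank(A))   # 2 2
```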

6. Intersection and sum of subspaces

Intersection of subspaces : Let V_1 and V_2 be subspaces of the linear space V. Then the set V_1 ∩ V_2 = {a | a ∈ V_1 and a ∈ V_2} is also a subspace of V, called the intersection space of V_1 and V_2 .

The intersection of several subspaces V_1, V_2, ···, V_s is written

    ∩_{i=1}^{s} V_i = V_1 ∩ V_2 ∩ ··· ∩ V_s

Sum of subspaces : Let V_1 and V_2 be subspaces of the linear space V. Then the set V_1 + V_2 = {a_1 + a_2 | a_1 ∈ V_1, a_2 ∈ V_2} is also a subspace of V, called the sum space of V_1 and V_2 .

The sum of several subspaces V_1, V_2, ···, V_s is denoted

    ∑_{i=1}^{s} V_i = V_1 + V_2 + ··· + V_s = {α_1 + α_2 + ··· + α_s | α_i ∈ V_i, i = 1, 2, ···, s}

Note:

  • The intersection and sum of subspaces satisfy the commutative law
  • The union of subspaces is not necessarily a subspace of V

Properties related to the intersection and sum of subspaces :

  1. Let V_1, V_2, W be subspaces of the linear space V:

    If W ⊆ V_1 and W ⊆ V_2, then W ⊆ V_1 ∩ V_2

    If V_1 ⊆ W and V_2 ⊆ W, then V_1 + V_2 ⊆ W

  2. Let V_1, V_2 be subspaces of the linear space V; then the following three conditions are equivalent:

    V_1 ⊆ V_2

    V_1 ∩ V_2 = V_1

    V_1 + V_2 = V_2

  3. If α_1, α_2, ···, α_r and β_1, β_2, ···, β_s are two vector groups in the linear space V, then L(α_1, α_2, ···, α_r) + L(β_1, β_2, ···, β_s) = L(α_1, α_2, ···, α_r, β_1, β_2, ···, β_s)

  4. Dimension formula : Let V_1 and V_2 be two subspaces of the linear space V; then dimV_1 + dimV_2 = dim(V_1 + V_2) + dim(V_1 ∩ V_2). Hence the dimension of a sum of subspaces is often smaller than the sum of their dimensions.

    Corollary: Let V_1 and V_2 be two subspaces of the n-dimensional linear space V. If dimV_1 + dimV_2 > n, then V_1 and V_2 must contain a common non-zero vector; that is, V_1 ∩ V_2 contains a non-zero vector.

Application : In P^n, let W_1 and W_2 be the solution spaces of two homogeneous systems of linear equations. Then W_1 ∩ W_2 is the solution space of the common solutions of the two systems.
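A sketch of this application (assuming sympy; the systems are illustrative): the common solutions of AX = 0 and BX = 0 form the nullspace of the stacked matrix [A; B].

```python
from sympy import Matrix

A = Matrix([[1, 1, 0, 0]])           # system A x = 0: solution space W1, dim 3
B = Matrix([[0, 0, 1, 1]])           # system B x = 0: solution space W2, dim 3
common = A.col_join(B).nullspace()   # basis of W1 ∩ W2

print(len(A.nullspace()), len(B.nullspace()), len(common))   # 3 3 2
```

Consistent with the corollary above: dimW_1 + dimW_2 = 6 > 4 = n, so the intersection must contain non-zero vectors, and here it is 2-dimensional.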

7. Direct sum of subspaces

Definition of direct sum : Let V_1, V_2 be two subspaces of the linear space V. If the decomposition α = α_1 + α_2 with α_1 ∈ V_1, α_2 ∈ V_2 is unique for every vector α in the sum V_1 + V_2, then V_1 + V_2 is called a direct sum , denoted V_1 ⊕ V_2.

Criteria for a direct sum :

  1. The sum V_1 + V_2 is direct ⇔ the zero vector has a unique decomposition, i.e. α_1 + α_2 = 0 with α_1 ∈ V_1, α_2 ∈ V_2 implies α_1 = α_2 = 0

  2. The sum V_1 + V_2 is direct ⇔ V_1 ∩ V_2 = {0}

  3. The sum V_1 + V_2 is direct ⇔ dimV_1 + dimV_2 = dim(V_1 + V_2)

  4. Let U be a subspace of the linear space V; then there exists a subspace W such that V = U ⊕ W. Such a W is called a complementary subspace of U, and it is generally not unique (unless U is a trivial subspace)

  5. Let ε_1, ε_2, ···, ε_s and η_1, η_2, ···, η_t be bases of the subspaces V_1 and V_2 respectively; then

    the sum V_1 + V_2 is direct ⇔ ε_1, ε_2, ···, ε_s, η_1, η_2, ···, η_t are linearly independent

Direct sum of several subspaces : Let V_1, V_2, …, V_s all be subspaces of the linear space V. If the decomposition α = α_1 + α_2 + … + α_s with α_i ∈ V_i, i = 1, 2, …, s is unique for every vector α in V_1 + V_2 + … + V_s, then the sum is called a direct sum , denoted V_1 ⊕ V_2 ⊕ … ⊕ V_s

Criteria for the direct sum of several subspaces : analogous to the criteria above; note that the pairwise condition V_i ∩ V_j = {0} is not sufficient, and what is required is

    V_i ∩ ∑_{j≠i} V_j = {0},  i = 1, 2, …, s

Note: every n-dimensional linear space can be expressed as a direct sum of n one-dimensional subspaces.
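The dimension criterion dimV_1 + dimV_2 = dim(V_1 + V_2) gives a simple computational test for directness (a sketch assuming numpy; the bases are illustrative and stored as matrix columns):

```python
import numpy as np

def is_direct_sum(B1, B2):
    # columns of B1 and B2 are bases of V1 and V2; the sum is direct iff
    # the joint column rank equals the total number of columns, i.e.
    # dim(V1 + V2) = dim V1 + dim V2
    joint = np.column_stack([B1, B2])
    return bool(np.linalg.matrix_rank(joint) == B1.shape[1] + B2.shape[1])

B1 = np.array([[1.], [0.], [0.]])                # V1 = span(e1)
B2 = np.array([[0., 0.], [1., 0.], [0., 1.]])    # V2 = span(e2, e3)
B4 = np.array([[1., 0.], [0., 1.], [0., 0.]])    # V4 = span(e1, e2)
B3 = np.array([[1.], [1.], [0.]])                # V3 = span(e1 + e2)

print(is_direct_sum(B1, B2))   # True:  R^3 = V1 ⊕ V2
print(is_direct_sum(B4, B3))   # False: e1 + e2 already lies in V4
```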

8. Isomorphisms of linear spaces

Definition of isomorphic mapping : Let V and V' be linear spaces over the number field P. If a mapping σ: V → V' has the following properties:

  1. σ is bijective
  2. σ(α+β) = σ(α)+σ(β) ,∀α, β ∈ V
  3. σ(kα) = kσ(α),∀k ∈ P,∀α ∈ V

Then σ is said to be an isomorphic map from V to V' , and the linear space V is said to be isomorphic to V' , denoted as V≅V'

Conclusions about isomorphisms :

  1. Any n-dimensional linear space over the number field P is isomorphic to P^n.

  2. Suppose V and V' are linear spaces over the number field P and σ is an isomorphic map from V to V'; then:

    1. σ(0) = 0, σ(−α) = −σ(α)
    2. σ(k_1α_1 + k_2α_2 + ··· + k_rα_r) = k_1σ(α_1) + k_2σ(α_2) + … + k_rσ(α_r), k_i ∈ P, α_i ∈ V, i = 1, 2, …, r
    3. A vector group α_1, α_2, ···, α_r in V is linearly dependent (linearly independent) if and only if the images σ(α_1), σ(α_2), …, σ(α_r) are linearly dependent (linearly independent)
    4. dimV = dimV'
    5. The inverse mapping σ^{-1} of σ: V → V' is an isomorphic map from V' to V
    6. If W is a subspace of V, then the image of W under σ, σ(W) = {σ(α) | α ∈ W}, is a subspace of V', and dimW = dimσ(W)
    7. By the six points above, an isomorphic map preserves zero elements, negative elements, linear combinations and linear dependence, and maps subspaces to subspaces
  3. The product (composition) of two isomorphic maps is still an isomorphic map

    Note: the isomorphism relation is reflexive, symmetric and transitive

  4. Two finite-dimensional linear spaces V and V' over the number field P are isomorphic ⇔ dimV = dimV'
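A tiny sketch of conclusion 1 (assuming numpy; the coordinate map below is illustrative): the space of polynomials a + bx + cx^2 over R has dimension 3 and is isomorphic to R^3 via the coordinate map, which is additive and homogeneous.

```python
import numpy as np

def sigma(p):
    # coordinate map: the polynomial a + b x + c x^2, given by its
    # coefficient triple p = (a, b, c), maps to the vector (a, b, c) in R^3
    a, b, c = p
    return np.array([a, b, c], dtype=float)

p, q, k = (1, 2, 3), (0, 1, -1), 5.0
add = tuple(x + y for x, y in zip(p, q))            # the polynomial p + q

print(np.allclose(sigma(add), sigma(p) + sigma(q)))              # additive
print(np.allclose(sigma(tuple(k * x for x in p)), k * sigma(p))) # homogeneous
```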


Origin blog.csdn.net/qq_43557907/article/details/127352050