Linear Algebra (5) Linear Space

foreword

" Linear Algebra (3) Linear Equations & Vector Spaces " I understand linear spaces by solving linear equations. This chapter looks at it from another angle

what is space

Everyone is familiar with the plane Cartesian coordinate system: it is the most common two-dimensional space. The space is composed of infinitely many coordinate points, and each coordinate point is a vector.

  • Conversely, it can also be said that 2-dimensional space is composed of infinitely many 2-dimensional vectors
  • Similarly, in 3-dimensional space, each 3D coordinate point is a 3D vector
  • By the same reasoning, 3D space contains infinitely many 3D vectors; in other words, 3D space is composed of infinitely many 3D vectors

If all vectors in the space can be expressed as linear combinations of $\vec{e_1}, \vec{e_2}, \dots, \vec{e_n}$, that is, for any vector $\vec{a}$ the equation

$\vec{a} = k_1\vec{e_1} + k_2\vec{e_2} + \dots + k_n\vec{e_n}$

can be solved for $k_1, k_2, \dots, k_n$, then the vectors $\vec{e_1}, \vec{e_2}, \dots, \vec{e_n}$ are called a basis of this space.
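As a small numeric sketch (assuming the standard $\mathbb{R}^2$ setting and numpy; the basis vectors here are purely illustrative), solving for the coefficients amounts to solving a linear system:

```python
import numpy as np

# Basis vectors e1 = (1,0), e2 = (1,2) as the columns of E;
# any two linearly independent vectors would do.
E = np.array([[1.0, 1.0],
              [0.0, 2.0]])

a = np.array([3.0, 4.0])   # the vector we want to express

# Solve E @ k = a for the coordinates k = (k1, k2).
k = np.linalg.solve(E, a)
print(k)                   # [1. 2.]  so a = 1*e1 + 2*e2
```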

Definition and properties of linear space

A linear space (vector space) is a nonempty set $V$ together with two operations, vector addition and scalar multiplication, such that $V$ is closed under both operations and the eight axioms hold: addition is commutative and associative, there is a zero vector, every vector has an additive inverse, and scalar multiplication satisfies $1\cdot\vec{a}=\vec{a}$, $k(l\vec{a})=(kl)\vec{a}$, $(k+l)\vec{a}=k\vec{a}+l\vec{a}$, and $k(\vec{a}+\vec{b})=k\vec{a}+k\vec{b}$.

vector addition

$\begin{bmatrix} x_1 \\ y_1 \end{bmatrix} + \begin{bmatrix} x_2 \\ y_2 \end{bmatrix} = \begin{bmatrix} x_1 + x_2 \\ y_1 + y_2 \end{bmatrix} = \begin{bmatrix} 2 + 3 \\ 4 + 1 \end{bmatrix}$

Number and vector multiplication

$\begin{bmatrix} x \\ y \end{bmatrix} \cdot 2 = \begin{bmatrix} 2x \\ 2y \end{bmatrix}$
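A quick numeric check of both operations (a numpy sketch; the numbers match the addition example above):

```python
import numpy as np

v1 = np.array([2, 4])
v2 = np.array([3, 1])

print(v1 + v2)   # [5 5]  component-wise addition
print(v1 * 2)    # [4 8]  scalar multiplication scales every component
```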

Dimension, Coordinates and Basis

The concept of linear independence appears here again. It is the same linear independence as in a vector space, except that the scope of what counts as a "vector" becomes wider.


  1. The basis of an n-dimensional linear space V is not unique: any n linearly independent vectors in V form a basis of V.
  2. The coordinates $(a_1, a_2, \dots, a_n)$ of a vector $\vec{a}$ with respect to a given basis $(\varepsilon_1, \varepsilon_2, \dots, \varepsilon_n)$ are uniquely determined.
How to determine the dimension and basis of a linear space
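A practical recipe: put the spanning vectors into a matrix; the rank of the matrix is the dimension, and the pivot columns form a basis. A sketch using sympy (so the arithmetic stays exact; the vectors are illustrative):

```python
import sympy as sp

# Spanning vectors as columns; the third column equals col0 + col1,
# so it is redundant.
M = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [1, 1, 2]])

rref, pivots = M.rref()
print(M.rank())    # 2 -> the spanned space is 2-dimensional
print(pivots)      # (0, 1) -> columns 0 and 1 form a basis
```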

Euclidean space

Euclidean space is a particular kind of space, i.e., a special set. The elements of this set are ordered tuples of real numbers.

Example: (2,3), (2,4), (3,4), (3,5) are ordered 2-tuples of real numbers.

  • Ordered means, for example, that (2,3) and (3,2) are two different elements
  • That is: the real numbers within each element have a fixed order
  • Real number means: the numbers in each element are ∈ ℝ
  • Tuple means: each element is composed of several ordered numbers
  • For example: 2 numbers form a 2-tuple, n numbers form an n-tuple

Euclidean set = ordered tuples of real numbers = set of n-dimensional coordinate points.
So Euclidean space is the space we have been using all along, from childhood onward.

Euclidean space satisfies the 8 axioms of a linear space.

subspace

A subspace is a part of the whole space, but it is itself a space and must satisfy the definition of a vector space.
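To test whether a given vector lies in a subspace spanned by some vectors, check whether it is a linear combination of them. A numpy sketch (the subspace here is illustrative):

```python
import numpy as np

# The subspace U = span{(1,0,1), (0,1,1)} inside R^3, basis as columns.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

def in_subspace(v, B, tol=1e-10):
    # Best coefficients c with B @ c ≈ v; if the residual is
    # (numerically) zero, v lies in the span of B's columns.
    c, *_ = np.linalg.lstsq(B, v, rcond=None)
    return np.linalg.norm(B @ c - v) < tol

print(in_subspace(np.array([2.0, 3.0, 5.0]), B))  # True:  2*b1 + 3*b2
print(in_subspace(np.array([0.0, 0.0, 1.0]), B))  # False: off the plane
```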

intersection of subspaces

The intersection $V_1 \cap V_2$ of two subspaces is again a subspace: it is closed under addition and scalar multiplication, and it always contains at least the zero vector.

sum of subspaces

The union of two subspaces $V_1, V_2$ is not a simple merging of elements into a new subspace: in general "the union of subspaces is not itself a subspace", because it is not closed under addition. So we define the sum of the subspaces instead:

$V_1 + V_2 = \{v_1 + v_2 : v_1 \in V_1, v_2 \in V_2\}$
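Numerically, $\dim(V_1+V_2)$ is the rank of the two basis sets stacked side by side, and $\dim(V_1 \cap V_2)$ then follows from the dimension formula $\dim V_1 + \dim V_2 = \dim(V_1+V_2) + \dim(V_1 \cap V_2)$. A numpy sketch with illustrative bases:

```python
import numpy as np

# Bases as columns: V1 = span{(1,0,0),(0,1,0)}, V2 = span{(0,1,0),(0,0,1)}
B1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
B2 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])

dim_V1 = np.linalg.matrix_rank(B1)                    # 2
dim_V2 = np.linalg.matrix_rank(B2)                    # 2
dim_sum = np.linalg.matrix_rank(np.hstack([B1, B2]))  # 3 = all of R^3

# dim(V1 ∩ V2) = dim V1 + dim V2 - dim(V1 + V2)
print(dim_V1 + dim_V2 - dim_sum)   # 1: the intersection is the y-axis
```

When the computed intersection dimension is 0, the sum is a direct sum (see the next section).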

direct sum of subspaces

The direct sum of subspaces is a special kind of sum: it additionally requires that the subspaces be independent of each other, i.e., $V_1 \cap V_2 = \{0\}$.

The entire linear space can be regarded as a big cake.

  • A direct-sum decomposition cuts the cake into small pieces: each small piece is a subspace, no two pieces overlap, and together they can be reassembled into the whole cake.
  • An ordinary sum of subspaces is a cake that was not cut cleanly: the small pieces overlap each other, so they do not fit together cleanly into the whole cake (the intersection between the subspaces is not empty).

inner product space

In the previous content we introduced vectors, matrices, and linear transformations in a linear space abstractly. But in geometry a vector also has a modulus (length), an inner product operation, and so on. To introduce operations such as the modulus of a vector and the inner product of vectors, we add the "definition of an inner product". That is: inner product space = linear space + inner product definition.

angle between vectors

$\cos\theta = \cos(\alpha-\beta) = \cos\alpha\cos\beta + \sin\alpha\sin\beta = \cfrac{x_1}{\sqrt{x_1^2+y_1^2}}\cdot\cfrac{x_2}{\sqrt{x_2^2+y_2^2}} + \cfrac{y_1}{\sqrt{x_1^2+y_1^2}}\cdot\cfrac{y_2}{\sqrt{x_2^2+y_2^2}}$

$\cos\theta = \cfrac{x_1x_2+y_1y_2}{\sqrt{x_1^2+y_1^2}\sqrt{x_2^2+y_2^2}} = \cfrac{\vec{a}\cdot\vec{b}}{|\vec{a}||\vec{b}|}$

The vectors $\vec{a}, \vec{b}$ above live in the 2-dimensional coordinate system. If the coordinate system is extended to n dimensions, i.e., $\vec{a} = (x_1, x_2, x_3, \dots, x_n)$ and $\vec{b} = (y_1, y_2, y_3, \dots, y_n)$, then
$\cos\theta = \cfrac{\sum_{i=1}^n x_i y_i}{\sqrt{\sum_{i=1}^n x_i^2}\sqrt{\sum_{i=1}^n y_i^2}} = \cfrac{[a,b]}{\sqrt{[a,a]}\sqrt{[b,b]}}$
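A direct computation of this formula (numpy sketch with illustrative vectors):

```python
import numpy as np

a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0])

cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(cos_theta)                         # 0.5
print(np.degrees(np.arccos(cos_theta)))  # 60.0 degrees
```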

If the angle between two vectors is $\theta = 90°$, that is, $[a,b] = 0$, the two vectors are orthogonal.

A set of vectors that are pairwise orthogonal to each other is called an orthogonal vector set.


Orthogonal basis

A basis whose vectors form an orthogonal vector set is called an orthogonal basis. If in addition every basis vector satisfies $|e_i| = 1$, it is called an orthonormal basis.

Schmidt (Gram–Schmidt) orthogonalization: solving for an orthogonal basis

Through a simple projection procedure, an orthogonal basis can be obtained from any basis.
Given a basis $\{\alpha_1, \alpha_2\}$, find its orthogonal basis:

  1. Take $\beta_1 = \alpha_1$.
  2. The unit vector along $\beta_1$ is $\cfrac{\beta_1}{\sqrt{[\beta_1,\beta_1]}}$.
  3. Compute the projection of $\alpha_2$ onto $\beta_1$.
  4. The projection length is $\cfrac{[\alpha_2,\beta_1]}{\sqrt{[\alpha_2,\alpha_2]}\sqrt{[\beta_1,\beta_1]}} \cdot \sqrt{[\alpha_2,\alpha_2]} = \cfrac{[\alpha_2,\beta_1]}{\sqrt{[\beta_1,\beta_1]}}$ (that is, $|\alpha_2|\cos\theta$).
  5. The projection vector is this length times the unit vector along $\beta_1$: $\cfrac{[\alpha_2,\beta_1]}{[\beta_1,\beta_1]} \cdot \beta_1$.
  6. The second orthogonal vector is $\beta_2 = \alpha_2 - \cfrac{[\alpha_2,\beta_1]}{[\beta_1,\beta_1]} \cdot \beta_1$.
  7. The orthogonal basis set is $\{\beta_1, \beta_2\}$.

If the space is three-dimensional, the procedure is the same: $\beta_3$ is obtained by subtracting from $\alpha_3$ its projections onto both $\beta_1$ and $\beta_2$. A code sketch of the general procedure follows.
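A minimal numpy sketch of Gram–Schmidt, assuming the standard dot product (the input vectors are illustrative and must be linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    # For each alpha_k, subtract its projection onto every previously
    # accepted beta_j:
    #   beta_k = alpha_k - sum_j ([alpha_k, beta_j] / [beta_j, beta_j]) * beta_j
    betas = []
    for alpha in vectors:
        beta = alpha.astype(float)
        for b in betas:
            beta = beta - (np.dot(alpha, b) / np.dot(b, b)) * b
        betas.append(beta)
    return betas

alphas = [np.array([1.0, 1.0, 0.0]),
          np.array([1.0, 0.0, 1.0]),
          np.array([0.0, 1.0, 1.0])]
betas = gram_schmidt(alphas)

# Pairwise inner products are (numerically) zero -> orthogonal basis.
print(np.dot(betas[0], betas[1]), np.dot(betas[1], betas[2]))

# Normalizing each beta yields an orthonormal basis.
orthonormal = [b / np.linalg.norm(b) for b in betas]
```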

Orthogonal Complement

Definition: Let $U$ be a subspace of $V$. Then $U^\perp = \{v \in V : \forall u \in U, \left<v,u\right> = 0\}$ is called the orthogonal complement of $U$. Here $\forall u$ means "for all $u$ in the set".

  1. $U^\perp$ is a subspace of $V$;
  2. $V^\perp = \{0\}$ and $\{0\}^\perp = V$;
  3. $U^\perp \cap U = \{0\}$;
  4. If $U, W$ are subspaces of $V$ and $U \subseteq W$, then $W^\perp \subseteq U^\perp$.

Theorem (orthogonal decomposition of a finite-dimensional subspace): $V = U \oplus U^\perp$

  1. $(U^\perp)^\perp = U$
  2. $\dim V = \dim U + \dim U^\perp$

How to solve for a basis of the orthogonal complement?

  1. Suppose $\dim V = 3$, $\dim U = 2$, and the basis set of $U$ is $\{(1,0,0), (0,1,0)\}$.
  2. Put the basis vectors into the rows of a matrix $A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix}$.
  3. Let $\vec{x} = \begin{bmatrix} x \\ y \\ z \end{bmatrix}$ be a vector of $U^\perp$.
  4. Solving the homogeneous system $A\vec{x} = 0$ gives the basis $\{(0,0,1)\}$.

The basis of the orthogonal complement consists of the fundamental solutions of this system; the number of basis vectors = $\dim V - R(A)$.
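A sketch of this computation using sympy's nullspace (so the basis comes out exact):

```python
import sympy as sp

# Rows of A span U; the solutions of A x = 0 form U's orthogonal complement.
A = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [0, 0, 0]])

basis = A.nullspace()
print(basis)                   # [Matrix([[0], [0], [1]])]
print(A.shape[1] - A.rank())   # 1 = dim V - R(A)
```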

