1.3 Matrices and Elementary Row Operations

A quirk of Hoffman's book is that the prose can be hard going on a first read; without solid English reading skills it is easy to misunderstand him. But if you can read your way in, you discover the book's greatest strength: its high vantage point. If Rudin's analysis is masterful in an effortless, understated way, Hoffman's linear algebra is masterful in a slightly talkative way. Starting with 1.3, this begins to show. This section raises the level in at least two ways: the matrix is elevated to a function, and row-equivalence is shown to be an equivalence relation.
This section covers matrices and elementary row operations. When I studied Wang E-fang's Advanced Algebra (the Peking University text), this was also one of the first topics (after determinants, of course), and I found it hard to grasp at the time: what is the point of arranging a pile of numbers into a rectangular block, and why pick those three particular row operations? Hoffman arrives at the matrix from the systems of linear equations of the previous section, where it is really the coefficient matrix; but the formal definition he then gives is different: a matrix is a function from pairs of integers $(i,j)$ to the field $F$, and its entries are the values of this function at particular pairs $(i,j)$. The familiar rectangular array is simply the most convenient way to display the function.
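Hoffman's function viewpoint is easy to make concrete. The following is an illustrative sketch (my own, not from the book) that models a matrix over a field as a Python dict from index pairs $(i,j)$ to entries:

```python
# A matrix viewed as a function from integer pairs (i, j) to the field F.
# Here the "function" is a dict; the usual rectangular array is just one
# convenient way of displaying it.
def matrix_as_function(rows):
    return {(i, j): rows[i][j]
            for i in range(len(rows))
            for j in range(len(rows[0]))}

A = matrix_as_function([[3, -1, 2],
                        [2, 1, 1],
                        [1, -3, 0]])

# The entry A_ij is the value of the function at (i, j) (0-based here):
print(A[(0, 1)])  # -> -1
```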
Next come the three elementary row operations: multiply a row by a scalar $c$ (with $c\neq 0$); replace a row by itself plus $c$ times another row; interchange two rows. Each can be expressed precisely by a function $e$ defined for a fixed $m$ (the number of rows), regardless of the number of columns. Why these three? Most importantly, each is invertible by an operation of the same type, which is the content of Theorem 2. A matrix obtained from another by elementary row operations is row-equivalent to it, and row-equivalence is reflexive, symmetric, and transitive, hence an equivalence relation. Since elementary row operations do not change the solution space of the homogeneous system whose coefficient matrix is the given matrix, Theorem 3 states that row-equivalent matrices, taken as coefficient matrices, yield systems with the same solutions.
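The three operations, and the fact that each is undone by an operation of the same type (Theorem 2), can be sketched in Python as follows; the function names are my own:

```python
# The three elementary row operations, each acting on the list of rows of
# an m x n matrix; the number of columns never enters the definitions.
def scale(M, r, c):
    # multiply row r by the nonzero scalar c
    assert c != 0
    return [[c * x for x in row] if i == r else row[:] for i, row in enumerate(M)]

def add_multiple(M, r, s, c):
    # replace row r by row r plus c times row s (r != s)
    return [[x + c * y for x, y in zip(M[r], M[s])] if i == r else row[:]
            for i, row in enumerate(M)]

def swap(M, r, s):
    # interchange rows r and s
    N = [row[:] for row in M]
    N[r], N[s] = N[s], N[r]
    return N

# Theorem 2: each operation is inverted by an operation of the same type.
M = [[1, 2], [3, 4]]
assert scale(scale(M, 0, 2), 0, 0.5) == M
assert add_multiple(add_multiple(M, 0, 1, 3), 0, 1, -3) == M
assert swap(swap(M, 0, 1), 0, 1) == M
```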
Finally, there is the definition of row-reduced, which requires two things: first, the leading nonzero entry of each nonzero row (scanning left to right) must be 1; second, every column containing such a leading 1 must have all its other entries equal to 0. Theorem 4 says every matrix is row-equivalent to a row-reduced matrix. The proof is purely verbal; perhaps drawing figures was inconvenient in that era (the 1970s), so the proof is told clearly, like a story.
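The two conditions in the definition translate directly into a small checker (my own sketch, assuming entries compare against 0 and 1 exactly):

```python
# Check the row-reduced conditions: (1) the leading nonzero entry of each
# nonzero row is 1; (2) the column containing such a leading 1 is zero
# everywhere else.
def is_row_reduced(M):
    for i, row in enumerate(M):
        nonzero = [j for j, x in enumerate(row) if x != 0]
        if not nonzero:
            continue                      # zero rows impose no condition
        lead = nonzero[0]
        if row[lead] != 1:
            return False                  # leading entry is not 1
        if any(M[k][lead] != 0 for k in range(len(M)) if k != i):
            return False                  # column of the leading 1 not clean
    return True

assert is_row_reduced([[0, 1, 0], [0, 0, 1], [1, 0, 0]])   # from Exercise 2
assert not is_row_reduced([[1, 0], [0, 2]])                # leading entry 2
```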

Exercises

1. Find all solutions to the system of equations

$$(1-i)x_1-ix_2=0\\2x_1+(1-i)x_2=0$$
Solution: We have
$$\begin{cases}(1-i)x_1-ix_2=0\\2x_1+(1-i)x_2=0\end{cases}\Rightarrow\begin{cases}2x_1-(1+i)ix_2=0\\2x_1+(1-i)x_2=0\end{cases}\Rightarrow\begin{cases}2x_1+(1-i)x_2=0\\2x_1+(1-i)x_2=0\end{cases}$$
so let $x_2=c$; then $x_1=(i-1)c/2$, and all solutions to the system are
$$x_1=\dfrac{(i-1)c}{2},\quad x_2=c,\quad c\in\mathbb{C}$$
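As a sanity check (my own, using Python's built-in complex numbers, where `1j` plays the role of $i$), the claimed solutions satisfy both equations:

```python
# Verify x1 = (i-1)c/2, x2 = c solves the system for several values of c.
for c in (1, 2 + 3j, -5j):
    x1, x2 = (1j - 1) * c / 2, c
    assert abs((1 - 1j) * x1 - 1j * x2) < 1e-12     # first equation
    assert abs(2 * x1 + (1 - 1j) * x2) < 1e-12      # second equation
print("both equations hold")
```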

2. If

$$A=\begin{bmatrix}3&-1&2\\2&1&1\\1&-3&0\end{bmatrix}$$

find all solutions of $AX=0$ by row-reducing $A$.

Solution: Since
$$\begin{bmatrix}3&-1&2\\2&1&1\\1&-3&0\end{bmatrix}\rightarrow\begin{bmatrix}0&8&2\\0&7&1\\1&-3&0\end{bmatrix}\rightarrow\begin{bmatrix}0&1&1\\0&7&1\\1&-3&0\end{bmatrix}\rightarrow\begin{bmatrix}0&1&1\\0&0&-6\\1&0&3\end{bmatrix}\rightarrow\begin{bmatrix}0&1&1\\0&0&1\\1&0&3\end{bmatrix}\rightarrow\begin{bmatrix}0&1&0\\0&0&1\\1&0&0\end{bmatrix}$$
so $AX=0$ has only the trivial solution.
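The reduction can be replayed mechanically; here is a generic row-reduction sketch (my own) in exact rational arithmetic, confirming that every column of $A$ acquires a leading 1, so the only solution is the trivial one:

```python
from fractions import Fraction

# Row-reduce the matrix of Exercise 2 exactly and count the pivots.
A = [[Fraction(x) for x in row] for row in [[3, -1, 2], [2, 1, 1], [1, -3, 0]]]

pivots = 0
for col in range(3):
    r = next((i for i in range(pivots, 3) if A[i][col] != 0), None)
    if r is None:
        continue                                   # no pivot in this column
    A[pivots], A[r] = A[r], A[pivots]              # move pivot row up
    A[pivots] = [x / A[pivots][col] for x in A[pivots]]   # leading entry -> 1
    for i in range(3):
        if i != pivots:                            # clear the pivot column
            f = A[i][col]
            A[i] = [x - f * y for x, y in zip(A[i], A[pivots])]
    pivots += 1

assert pivots == 3                 # three pivots: AX = 0 forces X = 0
assert A == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```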

3. If

$$A=\begin{bmatrix}6&-4&0\\4&-2&0\\-1&0&3\end{bmatrix}$$

find all solutions of $AX=2X$ and $AX=3X$.

Solution: This is equivalent to solving $(A-2I)X=0$ and $(A-3I)X=0$. Using row reduction, we have
$$A-2I=\begin{bmatrix}4&-4&0\\4&-4&0\\-1&0&1\end{bmatrix}\rightarrow\begin{bmatrix}1&-1&0\\0&0&0\\0&-1&1\end{bmatrix}\rightarrow\begin{bmatrix}1&0&-1\\0&1&-1\\0&0&0\end{bmatrix}$$
so let $x_3=c$; then $x_1=c$ and $x_2=c$, so all solutions to the system $(A-2I)X=0$ are $(c,c,c),\ c\in\mathbb{C}$. (Note the third row of $A-2I$ reads $-x_1+x_3=0$, so $x_3=x_1$.)
$$A-3I=\begin{bmatrix}3&-4&0\\4&-5&0\\-1&0&0\end{bmatrix}\rightarrow\begin{bmatrix}0&0&0\\0&1&0\\1&0&0\end{bmatrix}$$
so all solutions to the system $(A-3I)X=0$ are $(0,0,c),\ c\in\mathbb{C}$.
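A quick numeric check (my own) of the $AX=3X$ answer:

```python
# Multiply A by X = (0, 0, c) and compare with 3X.
A = [[6, -4, 0], [4, -2, 0], [-1, 0, 3]]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

for c in (1, -2, 7):
    X = [0, 0, c]
    assert matvec(A, X) == [3 * x for x in X]   # AX = 3X
print("AX = 3X holds for X = (0, 0, c)")
```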

4. Find a row-reduced matrix which is row-equivalent to

$$A=\begin{bmatrix}i&-(1+i)&0\\1&-2&1\\1&2i&-1\end{bmatrix}$$
Solution:
$$\begin{bmatrix}i&-(1+i)&0\\1&-2&1\\1&2i&-1\end{bmatrix}\rightarrow\begin{bmatrix}0&-1+i&-i\\1&-2&1\\0&2i+2&-2\end{bmatrix}\rightarrow\begin{bmatrix}0&-1+i&-i\\1&-2&1\\0&i+1&-1\end{bmatrix}$$
$$\rightarrow\begin{bmatrix}0&0&0\\1&-2&1\\0&1&\frac{i-1}{2}\end{bmatrix}\rightarrow\begin{bmatrix}0&0&0\\1&0&i\\0&1&\frac{i-1}{2}\end{bmatrix}$$

5. Prove that the following two matrices are not row-equivalent:

$$\begin{bmatrix}2&0&0\\a&-1&0\\b&c&3\end{bmatrix},\quad\begin{bmatrix}1&1&2\\-2&0&-1\\1&3&5\end{bmatrix}$$
Solution: The first matrix can be row reduced to
$$\begin{bmatrix}2&0&0\\a&-1&0\\b&c&3\end{bmatrix}\rightarrow\begin{bmatrix}1&0&0\\0&-1&0\\0&c&3\end{bmatrix}\rightarrow\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$$
and the second matrix can be row-reduced to
$$\begin{bmatrix}1&1&2\\-2&0&-1\\1&3&5\end{bmatrix}\rightarrow\begin{bmatrix}1&1&2\\0&2&3\\0&2&3\end{bmatrix}\rightarrow\begin{bmatrix}1&1&2\\0&1&3/2\\0&0&0\end{bmatrix}\rightarrow\begin{bmatrix}1&0&1/2\\0&1&3/2\\0&0&0\end{bmatrix}$$
By Theorem 3, row-equivalent matrices give homogeneous systems with the same solution space; the first system has only the trivial solution, while the second has nontrivial solutions, so the two matrices are not row-equivalent.

6. Let $A=\begin{bmatrix}a&b\\c&d\end{bmatrix}$ be a $2\times 2$ matrix with complex entries. Suppose that $A$ is row-reduced and also that $a+b+c+d=0$. Prove that there are exactly three such matrices.

Solution: The zero matrix $\begin{bmatrix}0&0\\0&0\end{bmatrix}$ is row-reduced and satisfies $a+b+c+d=0$. Now suppose $A$ has nonzero entries; then at least two entries must be nonzero, since a single nonzero entry would contradict $a+b+c+d=0$. Consider the case of exactly two nonzero entries. They cannot lie in different rows: the row-reduced condition would then force $A$ to be $\begin{bmatrix}1&0\\0&1\end{bmatrix}$ or $\begin{bmatrix}0&1\\1&0\end{bmatrix}$, contradicting $a+b+c+d=0$. If they lie in the same row, we get the matrices $\begin{bmatrix}1&-1\\0&0\end{bmatrix}$ and $\begin{bmatrix}0&0\\1&-1\end{bmatrix}$, both of which satisfy the condition.
Now consider the case of exactly three nonzero entries; then the matrix has the form $\begin{bmatrix}1&b\\0&d\end{bmatrix}$ or $\begin{bmatrix}0&b\\1&d\end{bmatrix}$. But the row-reduced condition then forces either $b=0,d=1$ or $b=1,d=0$, contradicting $a+b+c+d=0$.
Finally, the case in which all four entries are nonzero contradicts the row-reduced condition.
Thus there are exactly three such matrices: $\begin{bmatrix}0&0\\0&0\end{bmatrix}$, $\begin{bmatrix}1&-1\\0&0\end{bmatrix}$ and $\begin{bmatrix}0&0\\1&-1\end{bmatrix}$.

7. Prove that the interchange of two rows of a matrix can be accomplished by a finite sequence of elementary row operations of the other two types.

Solution: It suffices to show the process for a two-row matrix:
[ a 1 a n b 1 b n ] add row1 to row2 [ a 1 a n b 1 + a 1 b n + a n ] add (-1)*row2 to row1 [ b 1 b n b 1 + a 1 b n + a n ] add row1 to row2 [ b 1 b n a 1 a n ] multiply row1 by -1 [ b 1 b n a 1 a n ] \begin{aligned}\begin{bmatrix}a_1&\cdots&a_n\\b_1&\cdots&b_n\end{bmatrix} \xrightarrow {\text{add row1 to row2}} \begin{bmatrix}a_1&\cdots&a_n\\b_1+a_1&\cdots&b_n+a_n\end{bmatrix} \\ \xrightarrow{\text{add (-1)*row2 to row1}} \begin{bmatrix}-b_1&\cdots&-b_n\\b_1+a_1&\cdots&b_n+a_n\end{bmatrix}\\ \xrightarrow{\text{add row1 to row2}} \begin{bmatrix}-b_1&\cdots&-b_n\\a_1&\cdots&a_n\end{bmatrix}\\ \xrightarrow{\text{multiply row1 by -1}} \begin{bmatrix}b_1&\cdots&b_n\\a_1&\cdots&a_n\end{bmatrix}\end{aligned}
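Running the four steps above on a concrete two-row matrix (my own check) confirms the rows end up interchanged using only operations of the other two types:

```python
# Start with rows a and b, apply the four operations, expect rows b and a.
a, b = [1, 2, 3], [4, 5, 6]
M = [a[:], b[:]]

M[1] = [x + y for x, y in zip(M[1], M[0])]   # add row 1 to row 2
M[0] = [x - y for x, y in zip(M[0], M[1])]   # add (-1) * row 2 to row 1
M[1] = [x + y for x, y in zip(M[1], M[0])]   # add row 1 to row 2
M[0] = [-x for x in M[0]]                    # multiply row 1 by -1

assert M == [b, a]
print("rows interchanged:", M)
```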

8. Consider the system of equations $AX=0$ where $A=\begin{bmatrix}a&b\\c&d\end{bmatrix}$ is a $2\times 2$ matrix over the field $F$. Prove the following:
(a) If every entry of $A$ is $0$, then every pair $(x_1,x_2)$ is a solution of $AX=0$.
(b) If $ad-bc\neq 0$, the system $AX=0$ has only the trivial solution $x_1=x_2=0$.
(c) If $ad-bc=0$, and some entry of $A$ is different from $0$, then there is a solution $(x_1^0,x_2^0)$ such that $(x_1,x_2)$ is a solution if and only if there is a scalar $y$ such that $x_1=yx_1^0,\ x_2=yx_2^0$.

Solution:
(a) We have $0X=0$ for every $X$.
(b) Since $a$ and $c$ cannot both be $0$, we may suppose $a\neq 0$. Using row reduction, we have
$$A=\begin{bmatrix}a&b\\c&d\end{bmatrix}\rightarrow\begin{bmatrix}a&b\\ca&da\end{bmatrix}\rightarrow\begin{bmatrix}a&b\\0&ad-bc\end{bmatrix}\rightarrow\begin{bmatrix}1&0\\0&1\end{bmatrix}$$
and the conclusion follows.
(c) If both $a$ and $c$ are $0$, then $A$ is row-equivalent to $\begin{bmatrix}0&1\\0&0\end{bmatrix}$ or $\begin{bmatrix}0&0\\0&1\end{bmatrix}$. If not, then $A$ is row-equivalent to $\begin{bmatrix}a&b\\0&ad-bc\end{bmatrix}$ or $\begin{bmatrix}0&bc-ad\\c&d\end{bmatrix}$; since $ad-bc=0$, in all cases $A$ is row-equivalent to a matrix with nonzero entries in only one row. Thus the system $AX=0$ has the same solutions as
$$\alpha x_1+\beta x_2=0,\quad(\alpha\neq 0)\lor(\beta\neq 0)$$
Let $(x_1^0,x_2^0)$ be a nonzero solution; then $\alpha x_1^0=-\beta x_2^0$. Thus either $\alpha\neq 0$, which gives $x_1^0=kx_2^0$ with $k=-\beta/\alpha$, or $\beta\neq 0$, which gives $x_2^0=lx_1^0$ with $l=-\alpha/\beta$. If $(x_1,x_2)$ is any pair satisfying $\alpha x_1+\beta x_2=0$, then likewise $x_1=kx_2$ or $x_2=lx_1$, and the conclusion follows.
In fact, $(x_1^0,x_2^0)$ is a basis for the solution space in this case, so for any solution $(x_1,x_2)$ we must have $(x_1,x_2)=y(x_1^0,x_2^0)$.
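A concrete instance of part (c), with hypothetical values $a=1,b=2,c=2,d=4$ (so $ad-bc=0$ with nonzero entries) and the particular solution $(x_1^0,x_2^0)=(2,-1)$, chosen by me for illustration:

```python
# With ad - bc = 0 and A nonzero, every solution of AX = 0 is a scalar
# multiple of one fixed nonzero solution.
a, b, c, d = 1, 2, 2, 4          # ad - bc = 0, all entries nonzero
x10, x20 = 2, -1                 # a particular nonzero solution

assert a * x10 + b * x20 == 0 and c * x10 + d * x20 == 0

for y in (0, 3, -7):             # every multiple y * (x10, x20) also solves
    x1, x2 = y * x10, y * x20
    assert a * x1 + b * x2 == 0 and c * x1 + d * x2 == 0
print("all multiples of (2, -1) solve AX = 0")
```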



Source: blog.csdn.net/christangdt/article/details/103216644