
Linear Algebra: Definition and Examples of Linear Independence

Solutions

This exercise is recommended for all readers.
Problem 1

Decide whether each subset of $\mathbb{R}^3$ is linearly dependent or linearly independent.

  1. $\left\{\begin{pmatrix}1\\-3\\5\end{pmatrix},\begin{pmatrix}2\\2\\4\end{pmatrix},\begin{pmatrix}4\\-4\\14\end{pmatrix}\right\}$
  2. $\left\{\begin{pmatrix}1\\7\\7\end{pmatrix},\begin{pmatrix}2\\7\\7\end{pmatrix},\begin{pmatrix}3\\7\\7\end{pmatrix}\right\}$
  3. $\left\{\begin{pmatrix}0\\0\\-1\end{pmatrix},\begin{pmatrix}1\\0\\4\end{pmatrix}\right\}$
  4. $\left\{\begin{pmatrix}9\\9\\0\end{pmatrix},\begin{pmatrix}2\\0\\1\end{pmatrix},\begin{pmatrix}3\\5\\-4\end{pmatrix},\begin{pmatrix}12\\12\\-1\end{pmatrix}\right\}$
Answer

For each of these, when the subset is independent it must be proved, and when the subset is dependent an example of a dependence must be given.

  1. It is dependent. Considering
    $$c_1\begin{pmatrix}1\\-3\\5\end{pmatrix}+c_2\begin{pmatrix}2\\2\\4\end{pmatrix}+c_3\begin{pmatrix}4\\-4\\14\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}$$
    gives rise to this linear system.
    $$\begin{array}{rcrcrcr}c_1&+&2c_2&+&4c_3&=&0\\-3c_1&+&2c_2&-&4c_3&=&0\\5c_1&+&4c_2&+&14c_3&=&0\end{array}$$
    Gauss' method
    $$\left(\begin{array}{ccc|c}1&2&4&0\\-3&2&-4&0\\5&4&14&0\end{array}\right)\;\xrightarrow[-5\rho_1+\rho_3]{3\rho_1+\rho_2}\;\xrightarrow{(3/4)\rho_2+\rho_3}\;\left(\begin{array}{ccc|c}1&2&4&0\\0&8&8&0\\0&0&0&0\end{array}\right)$$
    yields a free variable, so there are infinitely many solutions. For an example of a particular dependence we can set $c_3$ to be, say, $1$. Then we get $c_2=-1$ and $c_1=-2$.
  2. It is dependent. The linear system that arises here
    $$\left(\begin{array}{ccc|c}1&2&3&0\\7&7&7&0\\7&7&7&0\end{array}\right)\;\xrightarrow[-7\rho_1+\rho_3]{-7\rho_1+\rho_2}\;\xrightarrow{-\rho_2+\rho_3}\;\left(\begin{array}{ccc|c}1&2&3&0\\0&-7&-14&0\\0&0&0&0\end{array}\right)$$
    has infinitely many solutions. We can get a particular solution by taking $c_3$ to be, say, $1$, and back-substituting to get the resulting $c_2$ and $c_1$.
  3. It is linearly independent. The system
    $$\left(\begin{array}{cc|c}0&1&0\\0&0&0\\-1&4&0\end{array}\right)\;\xrightarrow{\rho_1\leftrightarrow\rho_2}\;\xrightarrow{\rho_3\leftrightarrow\rho_1}\;\left(\begin{array}{cc|c}-1&4&0\\0&1&0\\0&0&0\end{array}\right)$$
    has only the solution $c_1=0$ and $c_2=0$. (We could also have gotten the answer by inspection: the second vector is obviously not a multiple of the first, and vice versa.)
  4. It is linearly dependent. The linear system
    $$\left(\begin{array}{cccc|c}9&2&3&12&0\\9&0&5&12&0\\0&1&-4&-1&0\end{array}\right)$$
    has more unknowns than equations, and so Gauss' method must end with at least one variable free (there can't be a contradictory equation because the system is homogeneous, and so has at least the solution of all zeroes). To exhibit a combination, we can do the reduction
    $$\xrightarrow{-\rho_1+\rho_2}\;\xrightarrow{(1/2)\rho_2+\rho_3}\;\left(\begin{array}{cccc|c}9&2&3&12&0\\0&-2&2&0&0\\0&0&-3&-1&0\end{array}\right)$$
    and take, say, $c_4=1$. Then we have that $c_3=-1/3$, $c_2=-1/3$, and $c_1=-31/27$. (A numerical cross-check of all four parts appears after this list.)
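As an illustrative cross-check (not part of the original solution), these rank computations can be repeated numerically. This is a minimal sketch assuming NumPy is available; a set of vectors, placed as the columns of a matrix, is independent exactly when the matrix's rank equals the number of columns.

```python
import numpy as np

# Columns of each matrix are the vectors from the corresponding part.
parts = {
    1: np.array([[1, 2, 4], [-3, 2, -4], [5, 4, 14]]),
    2: np.array([[1, 2, 3], [7, 7, 7], [7, 7, 7]]),
    3: np.array([[0, 1], [0, 0], [-1, 4]]),
    4: np.array([[9, 2, 3, 12], [9, 0, 5, 12], [0, 1, -4, -1]]),
}
for k, A in parts.items():
    independent = np.linalg.matrix_rank(A) == A.shape[1]
    print(k, "independent" if independent else "dependent")

# The dependence found by hand in part 1: c = (-2, -1, 1) sends the
# combination to the zero vector.
print(parts[1] @ np.array([-2, -1, 1]))   # -> [0 0 0]
```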
This exercise is recommended for all readers.
Problem 2

Which of these subsets of $\mathcal{P}_3$ are linearly dependent and which are independent?

  1. $\{3-x+9x^2,\;5-6x+3x^2,\;1+1x-5x^2\}$
  2. $\{-x^2,\;1+4x^2\}$
  3. $\{2+x+7x^2,\;3-x+2x^2,\;4-3x^2\}$
  4. $\{8+3x+3x^2,\;x+2x^2,\;2+2x+2x^2,\;8-2x+5x^2\}$
Answer

In the cases of independence, that must be proved. Otherwise, a specific dependence must be produced. (Of course, dependences other than the ones exhibited here are possible.)

  1. This set is independent. Setting up the relation $c_1(3-x+9x^2)+c_2(5-6x+3x^2)+c_3(1+1x-5x^2)=0+0x+0x^2$ gives a linear system
    $$\left(\begin{array}{ccc|c}3&5&1&0\\-1&-6&1&0\\9&3&-5&0\end{array}\right)\;\xrightarrow[-3\rho_1+\rho_3]{(1/3)\rho_1+\rho_2}\;\xrightarrow{3\rho_2}\;\xrightarrow{-(12/13)\rho_2+\rho_3}\;\left(\begin{array}{ccc|c}3&5&1&0\\0&-13&4&0\\0&0&-152/13&0\end{array}\right)$$
    with only one solution: $c_1=0$, $c_2=0$, and $c_3=0$.
  2. This set is independent. We can see this by inspection, straight from the definition of linear independence. Obviously neither is a multiple of the other.
  3. This set is linearly independent. The linear system reduces in this way
    $$\left(\begin{array}{ccc|c}2&3&4&0\\1&-1&0&0\\7&2&-3&0\end{array}\right)\;\xrightarrow[-(7/2)\rho_1+\rho_3]{-(1/2)\rho_1+\rho_2}\;\xrightarrow{-(17/5)\rho_2+\rho_3}\;\left(\begin{array}{ccc|c}2&3&4&0\\0&-5/2&-2&0\\0&0&-51/5&0\end{array}\right)$$
    to show that there is only the solution $c_1=0$, $c_2=0$, and $c_3=0$.
  4. This set is linearly dependent. The linear system
    $$\left(\begin{array}{cccc|c}8&0&2&8&0\\3&1&2&-2&0\\3&2&2&5&0\end{array}\right)$$
    must, after reduction, end with at least one variable free (there are more variables than equations, and there is no possibility of a contradictory equation because the system is homogeneous). We can take the free variables as parameters to describe the solution set, and then set a parameter to a nonzero value to get a nontrivial linear relation. (A rank check of parts 1 and 4 appears after this list.)
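Since the polynomial $a+bx+cx^2$ corresponds to its coefficient vector $(a,b,c)$, the same rank test as in Problem 1 applies. A sketch, again assuming NumPy:

```python
import numpy as np

# Coefficient vectors (constant, x, x^2 entries) as columns.
part1 = np.array([[3, 5, 1], [-1, -6, 1], [9, 3, -5]])
part4 = np.array([[8, 0, 2, 8], [3, 1, 2, -2], [3, 2, 2, 5]])

print(np.linalg.matrix_rank(part1))   # 3: the three polynomials are independent
print(np.linalg.matrix_rank(part4))   # 3 < 4 columns: the four are dependent
```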
This exercise is recommended for all readers.
Problem 4

Which of these subsets of the space of real-valued functions of one real variable is linearly dependent and which is linearly independent? (Note that we have abbreviated some constant functions; e.g., in the first item, the "$2$" stands for the constant function $f(x)=2$.)

  1. $\{2,\;4\sin^2(x),\;\cos^2(x)\}$
  2. $\{1,\;\sin(x),\;\sin(2x)\}$
  3. $\{x,\;\cos(x)\}$
  4. $\{(1+x)^2,\;x^2+2x,\;3\}$
  5. $\{\cos(2x),\;\sin^2(x),\;\cos^2(x)\}$
  6. $\{0,\;x,\;x^2\}$
Answer

In each case, independence must be proved, and dependence must be shown by exhibiting a specific dependence.

  1. This set is dependent. The familiar relation $\sin^2(x)+\cos^2(x)=1$ shows that $2=c_1\cdot(4\sin^2(x))+c_2\cdot(\cos^2(x))$ is satisfied by $c_1=1/2$ and $c_2=2$.
  2. This set is independent. Consider the relationship $c_1\cdot 1+c_2\cdot\sin(x)+c_3\cdot\sin(2x)=0$ (that "$0$" is the zero function). Taking $x=0$, $x=\pi/2$ and $x=\pi/4$ gives this system.
    $$\begin{array}{rcrcrcr}c_1&&&&&=&0\\c_1&+&c_2&&&=&0\\c_1&+&(\sqrt{2}/2)c_2&+&c_3&=&0\end{array}$$
    whose only solution is $c_1=0$, $c_2=0$, and $c_3=0$.
  3. By inspection, this set is independent. Any dependence $\cos(x)=c\cdot x$ is not possible, since the cosine function is not a multiple of the identity function (we are applying Corollary 1.17).
  4. By inspection, we spot that there is a dependence. Because $(1+x)^2=x^2+2x+1$, we get that $c_1\cdot(1+x)^2+c_2\cdot(x^2+2x)=3$ is satisfied by $c_1=3$ and $c_2=-3$.
  5. This set is dependent. The easiest way to see that is to recall the trigonometric relationship $\cos^2(x)-\sin^2(x)=\cos(2x)$. (Remark. A person who doesn't recall this, and tries some $x$'s, simply never gets a system leading to a unique solution, and never gets to conclude that the set is independent. Of course, this person might wonder whether they simply never tried the right set of $x$'s, but a few tries will lead most people to look instead for a dependence.)
  6. This set is dependent, because it contains the zero object of this vector space, the zero function. (A numerical look at parts 1, 2, and 5 appears after this list.)
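One way to probe these sets numerically is to sample each function at a few points; this is an illustrative sketch (assuming NumPy), not part of the book's method. If the functions satisfy a dependence then the sampled columns inherit it, so a full-rank sample proves independence, while a rank-deficient sample only suggests dependence.

```python
import numpy as np

xs = np.array([0.1, 0.4, 0.7, 1.0, 1.3])     # arbitrary sample points

def sample(fns):
    # One column per function, evaluated at the sample points.
    return np.column_stack([f(xs) for f in fns])

part1 = sample([lambda x: 2 + 0*x, lambda x: 4*np.sin(x)**2, lambda x: np.cos(x)**2])
part2 = sample([lambda x: 1 + 0*x, np.sin, lambda x: np.sin(2*x)])
part5 = sample([lambda x: np.cos(2*x), lambda x: np.sin(x)**2, lambda x: np.cos(x)**2])

for name, A in (("1", part1), ("2", part2), ("5", part5)):
    print(name, np.linalg.matrix_rank(A))
# Parts 1 and 5 give rank 2, matching the dependences found above;
# part 2 gives rank 3, which proves the independence claimed there.
```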
Problem 6

Why does Lemma 1.4 say "distinct"?

Answer

To emphasize that the equation $1\cdot\vec{s}+(-1)\cdot\vec{s}=\vec{0}$ does not make the set dependent.

This exercise is recommended for all readers.
Problem 7

Show that the nonzero rows of an echelon form matrix form a linearly independent set.

Answer

We have already shown this: the Linear Combination Lemma and its corollary state that in an echelon form matrix, no nonzero row is a linear combination of the others.

This exercise is recommended for all readers.
Problem 8
  1. Show that if the set $\{\vec{u},\vec{v},\vec{w}\}$ is linearly independent then so is the set $\{\vec{u},\;\vec{u}+\vec{v},\;\vec{u}+\vec{v}+\vec{w}\}$.
  2. What is the relationship between the linear independence or dependence of the set $\{\vec{u},\vec{v},\vec{w}\}$ and the independence or dependence of $\{\vec{u}-\vec{v},\;\vec{v}-\vec{w},\;\vec{w}-\vec{u}\}$?
Answer
  1. Assume that the set $\{\vec{u},\vec{v},\vec{w}\}$ is linearly independent, so that any relationship $d_0\vec{u}+d_1\vec{v}+d_2\vec{w}=\vec{0}$ leads to the conclusion that $d_0=0$, $d_1=0$, and $d_2=0$. Consider the relationship $c_1(\vec{u})+c_2(\vec{u}+\vec{v})+c_3(\vec{u}+\vec{v}+\vec{w})=\vec{0}$. Rewrite it to get $(c_1+c_2+c_3)\vec{u}+(c_2+c_3)\vec{v}+(c_3)\vec{w}=\vec{0}$. Taking $d_0$ to be $c_1+c_2+c_3$, taking $d_1$ to be $c_2+c_3$, and taking $d_2$ to be $c_3$ we have this system.
    $$\begin{array}{rcrcrcr}c_1&+&c_2&+&c_3&=&0\\&&c_2&+&c_3&=&0\\&&&&c_3&=&0\end{array}$$
    Conclusion: the $c$'s are all zero, and so the set is linearly independent.
  2. The second set is dependent
    $$1\cdot(\vec{u}-\vec{v})+1\cdot(\vec{v}-\vec{w})+1\cdot(\vec{w}-\vec{u})=\vec{0}$$
    whether or not the first set is independent. (A numeric illustration of both parts follows.)
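A quick numeric illustration of both parts, a sketch assuming NumPy. Part (a)'s set relates to the original by a unitriangular (hence invertible) change of coefficients, so independence is preserved; part (b)'s combination cancels for any three vectors whatsoever.

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))   # a random triple in R^3 (independent with probability 1)

A = np.column_stack([u, v, w])
B = np.column_stack([u, u + v, u + v + w])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))   # 3 3

# Part (b): the combination cancels identically, independence notwithstanding.
print((u - v) + (v - w) + (w - u))      # -> [0. 0. 0.]
```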
Problem 10

In any vector space $V$, the empty set is linearly independent. What about all of $V$?

Answer

The set $V$ is linearly dependent because it contains the zero vector.

Problem 11

Show that if $\{\vec{x},\vec{y},\vec{z}\}$ is linearly independent then so are all of its proper subsets: $\{\vec{x},\vec{y}\}$, $\{\vec{x},\vec{z}\}$, $\{\vec{y},\vec{z}\}$, $\{\vec{x}\}$, $\{\vec{y}\}$, $\{\vec{z}\}$, and $\{\}$. Is that "only if" also?

Answer

The "if" half is given by Lemma 1.14. The converse (the "only if" statement) does not hold. An example is to consider the vector space R 2 {\displaystyle \mathbb {R} ^{2}} and these vectors.

x = ( 1 0 ) , y = ( 0 1 ) , z = ( 1 1 ) {\displaystyle {\vec {x}}={\begin{pmatrix}1\\0\end{pmatrix}},\quad {\vec {y}}={\begin{pmatrix}0\\1\end{pmatrix}},\quad {\vec {z}}={\begin{pmatrix}1\\1\end{pmatrix}}}
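A sketch verifying this example with NumPy: every two-element subset has full rank, while the whole set does not.

```python
import numpy as np
from itertools import combinations

x, y, z = np.array([1, 0]), np.array([0, 1]), np.array([1, 1])

print(np.linalg.matrix_rank(np.column_stack([x, y, z])))   # 2 < 3: dependent

for pair in combinations((x, y, z), 2):
    A = np.column_stack(pair)
    print(np.linalg.matrix_rank(A) == 2)                   # True: each pair independent
```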
Problem 12
  1. Show that this
    $$S=\left\{\begin{pmatrix}1\\1\\0\end{pmatrix},\begin{pmatrix}-1\\2\\0\end{pmatrix}\right\}$$
    is a linearly independent subset of $\mathbb{R}^3$.
  2. Show that
    $$\begin{pmatrix}3\\2\\0\end{pmatrix}$$
    is in the span of $S$ by finding $c_1$ and $c_2$ giving a linear relationship.
    $$c_1\begin{pmatrix}1\\1\\0\end{pmatrix}+c_2\begin{pmatrix}-1\\2\\0\end{pmatrix}=\begin{pmatrix}3\\2\\0\end{pmatrix}$$
    Show that the pair $c_1,c_2$ is unique.
  3. Assume that $S$ is a subset of a vector space and that $\vec{v}$ is in $[S]$, so that $\vec{v}$ is a linear combination of vectors from $S$. Prove that if $S$ is linearly independent then a linear combination of vectors from $S$ adding to $\vec{v}$ is unique (that is, unique up to reordering and adding or taking away terms of the form $0\cdot\vec{s}$). Thus $S$ as a spanning set is minimal in this strong sense: each vector in $[S]$ is "hit" a minimum number of times, only once.
  4. Prove that it can happen when $S$ is not linearly independent that distinct linear combinations sum to the same vector.
Answer
  1. The linear system arising from
    $$c_1\begin{pmatrix}1\\1\\0\end{pmatrix}+c_2\begin{pmatrix}-1\\2\\0\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}$$
    has the unique solution $c_1=0$ and $c_2=0$.
  2. The linear system arising from
    $$c_1\begin{pmatrix}1\\1\\0\end{pmatrix}+c_2\begin{pmatrix}-1\\2\\0\end{pmatrix}=\begin{pmatrix}3\\2\\0\end{pmatrix}$$
    has the unique solution $c_1=8/3$ and $c_2=-1/3$.
  3. Suppose that $S$ is linearly independent. Suppose that we have both $\vec{v}=c_1\vec{s}_1+\dots+c_n\vec{s}_n$ and $\vec{v}=d_1\vec{t}_1+\dots+d_m\vec{t}_m$ (where the vectors are members of $S$). Now,
    $$c_1\vec{s}_1+\dots+c_n\vec{s}_n=\vec{v}=d_1\vec{t}_1+\dots+d_m\vec{t}_m$$
    can be rewritten in this way.
    $$c_1\vec{s}_1+\dots+c_n\vec{s}_n-d_1\vec{t}_1-\dots-d_m\vec{t}_m=\vec{0}$$
    Possibly some of the $\vec{s}$'s equal some of the $\vec{t}$'s; we can combine the associated coefficients (i.e., if $\vec{s}_i=\vec{t}_j$ then $\cdots+c_i\vec{s}_i+\dots-d_j\vec{t}_j-\cdots$ can be rewritten as $\cdots+(c_i-d_j)\vec{s}_i+\cdots$). That equation is a linear relationship among distinct (after the combining is done) members of the set $S$. We've assumed that $S$ is linearly independent, so all of the coefficients are zero. If $i$ is such that $\vec{s}_i$ does not equal any $\vec{t}_j$ then $c_i$ is zero. If $j$ is such that $\vec{t}_j$ does not equal any $\vec{s}_i$ then $d_j$ is zero. In the final case, we have that $c_i-d_j=0$ and so $c_i=d_j$. Therefore, the original two sums are the same, except perhaps for some $0\cdot\vec{s}_i$ or $0\cdot\vec{t}_j$ terms that we can neglect.
  4. This set is not linearly independent:
    $$S=\left\{\begin{pmatrix}1\\0\end{pmatrix},\begin{pmatrix}2\\0\end{pmatrix}\right\}\subset\mathbb{R}^2$$
    and these two linear combinations give the same result
    $$\begin{pmatrix}0\\0\end{pmatrix}=2\cdot\begin{pmatrix}1\\0\end{pmatrix}-1\cdot\begin{pmatrix}2\\0\end{pmatrix}=4\cdot\begin{pmatrix}1\\0\end{pmatrix}-2\cdot\begin{pmatrix}2\\0\end{pmatrix}$$
    Thus, in a linearly dependent set distinct linear combinations can give the same sum. In fact, this stronger statement holds: if a set is linearly dependent then it must have the property that there are two distinct linear combinations that sum to the same vector. Briefly, where $c_1\vec{s}_1+\dots+c_n\vec{s}_n=\vec{0}$ then multiplying both sides of the relationship by two gives another relationship. If the first relationship is nontrivial then the second is also. (A check of parts 1 and 2 follows.)
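For parts 1 and 2 the coefficients can be recovered mechanically; a sketch assuming NumPy, where `lstsq` reports the column rank along with the (here unique) solution:

```python
import numpy as np

S = np.column_stack([[1, 1, 0], [-1, 2, 0]])

c, _, rank, _ = np.linalg.lstsq(S, np.array([3, 2, 0]), rcond=None)
print(rank)    # 2: the columns are independent (part 1)
print(c)       # approximately [ 2.6667 -0.3333], i.e. c1 = 8/3, c2 = -1/3 (part 2)
```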
Problem 14

Return to Section 1.2 and redefine point, line, plane, and other linear surfaces to avoid degenerate cases.

Answer

The work in this section suggests that an $n$-dimensional non-degenerate linear surface should be defined as the span of a linearly independent set of $n$ vectors.

Problem 15
  1. Show that any set of four vectors in $\mathbb{R}^2$ is linearly dependent.
  2. Is this true for any set of five? Any set of three?
  3. What is the greatest number of elements that a linearly independent subset of $\mathbb{R}^2$ can have?
Answer
  1. For any $a_{1,1}$, ..., $a_{2,4}$,
    $$c_1\begin{pmatrix}a_{1,1}\\a_{2,1}\end{pmatrix}+c_2\begin{pmatrix}a_{1,2}\\a_{2,2}\end{pmatrix}+c_3\begin{pmatrix}a_{1,3}\\a_{2,3}\end{pmatrix}+c_4\begin{pmatrix}a_{1,4}\\a_{2,4}\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}$$
    yields a linear system
    $$\begin{array}{rcrcrcrcr}a_{1,1}c_1&+&a_{1,2}c_2&+&a_{1,3}c_3&+&a_{1,4}c_4&=&0\\a_{2,1}c_1&+&a_{2,2}c_2&+&a_{2,3}c_3&+&a_{2,4}c_4&=&0\end{array}$$
    that has infinitely many solutions (Gauss' method leaves at least two variables free). Hence there are nontrivial linear relationships among the given members of $\mathbb{R}^2$. (A sketch that produces such a relationship appears after this list.)
  2. Any set of five vectors is a superset of a set of four vectors, and so is linearly dependent. With three vectors from $\mathbb{R}^2$, the argument from the prior item still applies, with the slight change that Gauss' method now only leaves at least one variable free (but that still gives infinitely many solutions).
  3. The prior item shows that no three-element subset of $\mathbb{R}^2$ is independent. We know that there are two-element subsets of $\mathbb{R}^2$ that are independent; one is
    $$\left\{\begin{pmatrix}1\\0\end{pmatrix},\begin{pmatrix}0\\1\end{pmatrix}\right\}$$
    and so the answer is two.
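The argument can be made concrete: for four columns drawn at random, the homogeneous system has a nontrivial null space, and any null vector exhibits a dependence. A sketch, assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 4))     # four arbitrary vectors in R^2, as columns

N = null_space(A)                   # basis for the solutions of A c = 0
print(N.shape[1])                   # 2: at least two free variables, as argued above
c = N[:, 0]                         # one nontrivial dependence
print(np.allclose(A @ c, 0))        # True
```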
This exercise is recommended for all readers.
Problem 16

Is there a set of four vectors in $\mathbb{R}^3$, any three of which form a linearly independent set?

Answer

Yes; here is one (verified in the sketch below).

$$\left\{\begin{pmatrix}1\\0\\0\end{pmatrix},\begin{pmatrix}0\\1\\0\end{pmatrix},\begin{pmatrix}0\\0\\1\end{pmatrix},\begin{pmatrix}1\\1\\1\end{pmatrix}\right\}$$
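A sketch checking all four triples at once, assuming NumPy; a $3\times 3$ determinant is nonzero exactly when its columns are independent.

```python
import numpy as np
from itertools import combinations

vs = [np.array(v) for v in ([1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1])]
for trio in combinations(vs, 3):
    print(round(np.linalg.det(np.column_stack(trio))))
# Each determinant is 1 or -1, never 0, so every three of the four are independent.
```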
Problem 17

Must every linearly dependent set have a subset that is dependent and a subset that is independent?

Answer

Yes. The entire set (which is dependent, by hypothesis) and the empty subset (which is independent) serve as examples.

Problem 18

In $\mathbb{R}^4$, what is the biggest linearly independent set you can find? The smallest? The biggest linearly dependent set? The smallest? ("Biggest" and "smallest" mean that there are no supersets or subsets with the same property.)

Answer

In $\mathbb{R}^4$ the biggest linearly independent set has four vectors. There are many examples of such sets; this is one.

$$\left\{\begin{pmatrix}1\\0\\0\\0\end{pmatrix},\begin{pmatrix}0\\1\\0\\0\end{pmatrix},\begin{pmatrix}0\\0\\1\\0\end{pmatrix},\begin{pmatrix}0\\0\\0\\1\end{pmatrix}\right\}$$

To see that no set with five or more vectors can be independent, set up

$$c_1\begin{pmatrix}a_{1,1}\\a_{2,1}\\a_{3,1}\\a_{4,1}\end{pmatrix}+c_2\begin{pmatrix}a_{1,2}\\a_{2,2}\\a_{3,2}\\a_{4,2}\end{pmatrix}+c_3\begin{pmatrix}a_{1,3}\\a_{2,3}\\a_{3,3}\\a_{4,3}\end{pmatrix}+c_4\begin{pmatrix}a_{1,4}\\a_{2,4}\\a_{3,4}\\a_{4,4}\end{pmatrix}+c_5\begin{pmatrix}a_{1,5}\\a_{2,5}\\a_{3,5}\\a_{4,5}\end{pmatrix}=\begin{pmatrix}0\\0\\0\\0\end{pmatrix}$$

and note that the resulting linear system

$$\begin{array}{rcrcrcrcrcr}a_{1,1}c_1&+&a_{1,2}c_2&+&a_{1,3}c_3&+&a_{1,4}c_4&+&a_{1,5}c_5&=&0\\a_{2,1}c_1&+&a_{2,2}c_2&+&a_{2,3}c_3&+&a_{2,4}c_4&+&a_{2,5}c_5&=&0\\a_{3,1}c_1&+&a_{3,2}c_2&+&a_{3,3}c_3&+&a_{3,4}c_4&+&a_{3,5}c_5&=&0\\a_{4,1}c_1&+&a_{4,2}c_2&+&a_{4,3}c_3&+&a_{4,4}c_4&+&a_{4,5}c_5&=&0\end{array}$$

has four equations and five unknowns, so Gauss' method must end with at least one $c$ variable free, so there are infinitely many solutions, and so the above linear relationship among the four-tall vectors has more solutions than just the trivial solution.

The smallest linearly independent set is the empty set.

The biggest linearly dependent set is $\mathbb{R}^4$ itself. The smallest is $\{\vec{0}\}$. (A sketch exhibiting a dependence among any five vectors follows.)
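To see the five-vector argument in action, the SVD gives an explicit nontrivial dependence for any five four-tall vectors; a sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 5))     # five four-tall vectors, as columns

# With 5 columns in R^4 the rank is at most 4, so the last right-singular
# vector lies in the null space and gives a nontrivial dependence.
c = np.linalg.svd(A)[2][-1]
print(np.allclose(A @ c, 0))        # True
```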

This exercise is recommended for all readers.
Problem 19

Linear independence and linear dependence are properties of sets. We can thus naturally ask how those properties act with respect to the familiar elementary set relations and operations. In the body of this subsection we have covered the subset and superset relations. We can also consider the operations of intersection, complementation, and union.

  1. How does linear independence relate to intersection: can an intersection of linearly independent sets be independent? Must it be?
  2. How does linear independence relate to complementation?
  3. Show that the union of two linearly independent sets need not be linearly independent.
  4. Characterize when the union of two linearly independent sets is linearly independent, in terms of the intersection of the span of each.
Answer
  1. The intersection of two linearly independent sets $S\cap T$ must be linearly independent, as it is a subset of the linearly independent set $S$ (as well as of the linearly independent set $T$, of course).
  2. The complement of a linearly independent set is linearly dependent, as it contains the zero vector.
  3. We must produce an example. One, in $\mathbb{R}^2$, is
    $$S=\left\{\begin{pmatrix}1\\0\end{pmatrix}\right\}\quad\text{and}\quad T=\left\{\begin{pmatrix}2\\0\end{pmatrix}\right\}$$
    since the linear dependence of $S\cup T$ is easily seen.
  4. The union of two linearly independent sets $S\cup T$ is linearly independent if and only if their spans have a trivial intersection $[S]\cap[T]=\{\vec{0}\}$. To prove that, assume that $S$ and $T$ are linearly independent subsets of some vector space. For the "if" direction, assume that the intersection of the spans is trivial, $[S]\cap[T]=\{\vec{0}\}$. Consider the set $S\cup T$. Any linear relationship $c_1\vec{s}_1+\dots+c_n\vec{s}_n+d_1\vec{t}_1+\dots+d_m\vec{t}_m=\vec{0}$ gives $c_1\vec{s}_1+\dots+c_n\vec{s}_n=-d_1\vec{t}_1-\dots-d_m\vec{t}_m$. The left side of that equation sums to a vector in $[S]$, and the right side is a vector in $[T]$. Therefore, since the intersection of the spans is trivial, both sides equal the zero vector. Because $S$ is linearly independent, all of the $c$'s are zero. Because $T$ is linearly independent, all of the $d$'s are zero. Thus, the original linear relationship among members of $S\cup T$ only holds if all of the coefficients are zero. That shows that $S\cup T$ is linearly independent. For the "only if" half we can make the same argument in reverse. If the union $S\cup T$ is linearly independent, that is, if the only solution to $c_1\vec{s}_1+\cdots+c_n\vec{s}_n+d_1\vec{t}_1+\cdots+d_m\vec{t}_m=\vec{0}$ is the trivial solution $c_1=0$, ..., $d_m=0$, then any vector $\vec{v}$ in the intersection of the spans, $\vec{v}=c_1\vec{s}_1+\cdots+c_n\vec{s}_n=-d_1\vec{t}_1-\cdots-d_m\vec{t}_m$, must be the zero vector because each scalar is zero. (A rank-based test of this criterion is sketched below.)
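The part 4 criterion can be phrased as a rank condition and tested mechanically: for independent $S$ and $T$, the union is independent exactly when stacking all the vectors loses no rank. A sketch assuming NumPy, with small illustrative sets:

```python
import numpy as np

def union_independent(S, T):
    # S and T hold independent vectors as columns; the union is independent
    # iff rank[S | T] equals the total number of vectors, which happens
    # iff the spans meet only in the zero vector.
    A = np.hstack([S, T])
    return np.linalg.matrix_rank(A) == S.shape[1] + T.shape[1]

S  = np.array([[1.], [0.], [0.]])    # spans the x-axis of R^3
T1 = np.array([[0.], [1.], [0.]])    # its span meets [S] only at the zero vector
T2 = np.array([[2.], [0.], [0.]])    # its span equals [S] (part 3's example)
print(union_independent(S, T1))      # True
print(union_independent(S, T2))      # False
```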
This exercise is recommended for all readers.
Problem 20

For Theorem 1.12,

  1. fill in the induction for the proof;
  2. give an alternate proof that starts with the empty set and builds a sequence of linearly independent subsets of the given finite set until one appears with the same span as the given set.
Answer
  1. We do induction on the number of vectors in the finite set $S$. The base case is that $S$ has no elements. In this case $S$ is linearly independent and there is nothing to check: a subset of $S$ that has the same span as $S$ is $S$ itself. For the inductive step assume that the theorem is true for all sets of size $n=0$, $n=1$, ..., $n=k$ in order to prove that it holds when $S$ has $n=k+1$ elements. If the $k+1$-element set $S=\{\vec{s}_0,\dots,\vec{s}_k\}$ is linearly independent then the theorem is trivial, so assume that it is dependent. By Corollary 1.17 there is an $\vec{s}_i$ that is a linear combination of other vectors in $S$. Define $S_1=S-\{\vec{s}_i\}$ and note that $S_1$ has the same span as $S$ by Lemma 1.1. The set $S_1$ has $k$ elements and so the inductive hypothesis applies to give that it has a linearly independent subset with the same span. That subset of $S_1$ is the desired subset of $S$.
  2. Here is a sketch of the argument. (The induction details have been left out.) If the finite set $S$ is empty then there is nothing to prove. If $S=\{\vec{0}\}$ then the empty subset will do. Otherwise, take some nonzero vector $\vec{s}_1\in S$ and define $S_1=\{\vec{s}_1\}$. If $[S_1]=[S]$ then this proof is finished by noting that $S_1$ is linearly independent. If not, then there is a nonzero vector $\vec{s}_2\in S-[S_1]$ (if every $\vec{s}\in S$ is in $[S_1]$ then $[S_1]=[S]$). Define $S_2=S_1\cup\{\vec{s}_2\}$. If $[S_2]=[S]$ then this proof is finished by using Theorem 1.17 to show that $S_2$ is linearly independent. Repeat the last paragraph until a set with a big enough span appears. That must eventually happen because $S$ is finite, and $[S]$ will be reached at worst when every vector from $S$ has been used.
Problem 21

With a little calculation we can get formulas to determine whether or not a set of vectors is linearly independent.

  1. Show that this subset of $\mathbb{R}^2$
    $$\left\{\begin{pmatrix}a\\c\end{pmatrix},\begin{pmatrix}b\\d\end{pmatrix}\right\}$$
    is linearly independent if and only if $ad-bc\neq 0$.
  2. Show that this subset of $\mathbb{R}^3$
    $$\left\{\begin{pmatrix}a\\d\\g\end{pmatrix},\begin{pmatrix}b\\e\\h\end{pmatrix},\begin{pmatrix}c\\f\\i\end{pmatrix}\right\}$$
    is linearly independent iff $aei+bfg+cdh-hfa-idb-gec\neq 0$.
  3. When is this subset of $\mathbb{R}^3$
    $$\left\{\begin{pmatrix}a\\d\\g\end{pmatrix},\begin{pmatrix}b\\e\\h\end{pmatrix}\right\}$$
    linearly independent?
  4. This is an opinion question: for a set of four vectors from $\mathbb{R}^4$, must there be a formula involving the sixteen entries that determines independence of the set? (You needn't produce such a formula, just decide if one exists.)
Answer
  1. Assuming first that $a\neq 0$,
    $$x\begin{pmatrix}a\\c\end{pmatrix}+y\begin{pmatrix}b\\d\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}$$
    gives
    $$\begin{array}{rcrcr}ax&+&by&=&0\\cx&+&dy&=&0\end{array}\;\xrightarrow{-(c/a)\rho_1+\rho_2}\;\begin{array}{rcrcr}ax&+&by&=&0\\&&(-(c/a)b+d)y&=&0\end{array}$$
    which has only the trivial solution if and only if $0\neq-(c/a)b+d=(-cb+ad)/a$ (we've assumed in this case that $a\neq 0$, and so back substitution yields a unique solution). The $a=0$ case is also not hard: break it into the $c\neq 0$ and $c=0$ subcases and note that in these cases $ad-bc=0\cdot d-bc$. Comment. An earlier exercise showed that a two-vector set is linearly dependent if and only if either vector is a scalar multiple of the other. That can also be used to make the calculation.
  2. The equation
    $$c_1\begin{pmatrix}a\\d\\g\end{pmatrix}+c_2\begin{pmatrix}b\\e\\h\end{pmatrix}+c_3\begin{pmatrix}c\\f\\i\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}$$
    gives rise to a homogeneous linear system. We proceed by writing it in matrix form and applying Gauss' method. We first reduce the matrix to upper-triangular. Assume that $a\neq 0$.
    $$\xrightarrow{(1/a)\rho_1}\left(\begin{array}{ccc|c}1&b/a&c/a&0\\d&e&f&0\\g&h&i&0\end{array}\right)\;\xrightarrow[-g\rho_1+\rho_3]{-d\rho_1+\rho_2}\;\left(\begin{array}{ccc|c}1&b/a&c/a&0\\0&(ae-bd)/a&(af-cd)/a&0\\0&(ah-bg)/a&(ai-cg)/a&0\end{array}\right)\;\xrightarrow{(a/(ae-bd))\rho_2}\;\left(\begin{array}{ccc|c}1&b/a&c/a&0\\0&1&(af-cd)/(ae-bd)&0\\0&(ah-bg)/a&(ai-cg)/a&0\end{array}\right)$$
    (where we've assumed for the moment that $ae-bd\neq 0$ in order to do the row reduction step). Then, under the assumptions, we get this.
    $$\xrightarrow{-((ah-bg)/a)\rho_2+\rho_3}\;\left(\begin{array}{ccc|c}1&\frac{b}{a}&\frac{c}{a}&0\\0&1&\frac{af-cd}{ae-bd}&0\\0&0&\frac{aei+bgf+cdh-hfa-idb-gec}{ae-bd}&0\end{array}\right)$$
    shows that the original system is nonsingular if and only if the $3,3$ entry is nonzero. This fraction is defined because of the $ae-bd\neq 0$ assumption, and it will equal zero if and only if its numerator equals zero. We next worry about the assumptions. First, if $a\neq 0$ but $ae-bd=0$ then we swap
    $$\left(\begin{array}{ccc|c}1&b/a&c/a&0\\0&0&(af-cd)/a&0\\0&(ah-bg)/a&(ai-cg)/a&0\end{array}\right)\;\xrightarrow{\rho_2\leftrightarrow\rho_3}\;\left(\begin{array}{ccc|c}1&b/a&c/a&0\\0&(ah-bg)/a&(ai-cg)/a&0\\0&0&(af-cd)/a&0\end{array}\right)$$
    and conclude that the system is singular if and only if either $ah-bg=0$ or $af-cd=0$. That's the same as asking that their product be zero:
    $$\begin{array}{rl}ahaf-ahcd-bgaf+bgcd&=0\\ahaf-ahcd-bgaf+aegc&=0\\a(haf-hcd-bgf+egc)&=0\end{array}$$
    (in going from the first line to the second we've applied the case assumption that $ae-bd=0$ by substituting $ae$ for $bd$). Since we are assuming that $a\neq 0$, singularity is equivalent to $haf-hcd-bgf+egc=0$. With $ae-bd=0$ we can rewrite this to fit the form we need: in this $a\neq 0$ and $ae-bd=0$ case, the given system is nonsingular exactly when $haf-hcd-bgf+egc-i(ae-bd)\neq 0$, as required. The remaining cases have the same character. Do the $a=0$ but $d\neq 0$ case and the $a=0$ and $d=0$ but $g\neq 0$ case by first swapping rows and then going on as above. The $a=0$, $d=0$, and $g=0$ case is easy: a set with a zero vector is linearly dependent, and the formula comes out to equal zero.
  3. It is linearly dependent if and only if either vector is a multiple of the other. That is, it is not independent iff
    $$\begin{pmatrix}a\\d\\g\end{pmatrix}=r\cdot\begin{pmatrix}b\\e\\h\end{pmatrix}\quad\text{or}\quad\begin{pmatrix}b\\e\\h\end{pmatrix}=s\cdot\begin{pmatrix}a\\d\\g\end{pmatrix}$$
    (or both) for some scalars $r$ and $s$. Eliminating $r$ and $s$ in order to restate this condition only in terms of the given letters $a$, $b$, $d$, $e$, $g$, $h$, we have that it is not independent, that is, it is dependent, iff $ae-bd=ah-gb=dh-ge=0$.
  4. Dependence or independence is a function of the entries, so there is indeed a formula (although at first glance a person might think the formula involves cases: "if the first component of the first vector is zero then ...", this guess turns out not to be correct). (A randomized check of the formulas from the first two parts appears after this list.)
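The expressions in the first two parts are the $2\times 2$ and $3\times 3$ determinants, which is also why part 4's answer is yes. A randomized spot-check, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, c, d, e, f, g, h, i = rng.standard_normal(9)

# Part 1's expression is the 2x2 determinant ...
print(np.isclose(np.linalg.det(np.array([[a, b], [c, d]])), a*d - b*c))

# ... and part 2's is the 3x3 determinant (vectors as columns).
M = np.array([[a, b, c], [d, e, f], [g, h, i]])
print(np.isclose(np.linalg.det(M), a*e*i + b*f*g + c*d*h - h*f*a - i*d*b - g*e*c))
# Both print True.
```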
This exercise is recommended for all readers.
Problem 23

Consider the set of functions from the open interval $(-1..1)$ to $\mathbb{R}$.

  1. Show that this set is a vector space under the usual operations.
  2. Recall the formula for the sum of an infinite geometric series: $1+x+x^2+\cdots=1/(1-x)$ for all $x\in(-1..1)$. Why does this not express a dependence inside of the set $\{g(x)=1/(1-x),\,f_0(x)=1,\,f_1(x)=x,\,f_2(x)=x^2,\,\ldots\}$ (in the vector space that we are considering)? (Hint. Review the definition of linear combination.)
  3. Show that the set in the prior item is linearly independent.

This shows that some vector spaces exist with linearly independent subsets that are infinite.

Answer
  1. This check is routine.
  2. The summation is infinite (has infinitely many summands). The definition of linear combination involves only finite sums.
  3. No nontrivial finite sum of members of $\{g,f_0,f_1,\ldots\}$ adds to the zero object: assume that
    $$c_0\cdot(1/(1-x))+c_1\cdot 1+\dots+c_n\cdot x^n=0$$
    (any finite sum uses a highest power, here $n$). Multiply both sides by $1-x$ to conclude that each coefficient is zero, because a polynomial describes the zero function only when it is the zero polynomial. (A symbolic version of this argument is sketched below.)
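Part 3's multiply-by-$(1-x)$ argument can be replayed symbolically for a small case; a sketch assuming SymPy is available, with the highest power taken to be $2$:

```python
from sympy import symbols, expand, Poly, solve

x, c0, c1, c2, c3 = symbols('x c0 c1 c2 c3')

# c0/(1-x) + c1*1 + c2*x + c3*x**2 = 0, multiplied through by (1 - x):
p = expand(c0 + (c1 + c2*x + c3*x**2) * (1 - x))

# A polynomial is the zero function only if every coefficient vanishes.
print(solve(Poly(p, x).all_coeffs(), [c0, c1, c2, c3]))
# -> {c0: 0, c1: 0, c2: 0, c3: 0}
```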


Source: https://en.wikibooks.org/wiki/Linear_Algebra/Definition_and_Examples_of_Linear_Independence/Solutions