\(
\def\R{\mathbb{R}}
\def\N{\mathbb{N}}
\def\Z{\mathbb{Z}}
\def\Q{\mathbb{Q}}
\def\eps{\varepsilon}
\def\epsilon{\varepsilon}
\newcommand\bs[1]{\boldsymbol{#1}}
\renewcommand{\geq}{\geqslant}
\renewcommand{\leq}{\leqslant}
\)
Chapter 1 - Linear Equations in Linear Algebra
1.7 - Linear Independence
Results
We can re-express the homogeneous problem $A\bs{x} = \bs{0}$ from 1.5 as follows:
$$
x_1
\begin{bmatrix*}[r]
1 \\
2 \\
3
\end{bmatrix*}
+
x_2
\begin{bmatrix*}[r]
4 \\
5 \\
6
\end{bmatrix*}
+
x_3
\begin{bmatrix*}[r]
2 \\
1 \\
0
\end{bmatrix*}
=
\begin{bmatrix*}[r]
0 \\
0 \\
0
\end{bmatrix*}
$$
This has the trivial solution, but again the question is whether that is the only solution.
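For instance, row reducing the corresponding augmented matrix reveals a free variable:
$$
\begin{bmatrix*}[rrrr]
1 & 4 & 2 & 0 \\
2 & 5 & 1 & 0 \\
3 & 6 & 0 & 0
\end{bmatrix*}
\sim
\begin{bmatrix*}[rrrr]
1 & 4 & 2 & 0 \\
0 & -3 & -3 & 0 \\
0 & 0 & 0 & 0
\end{bmatrix*}
$$
Here $x_3$ is free, so nontrivial solutions exist: taking $x_3 = 1$ gives $x_2 = -1$ and $x_1 = 2$,
so twice the first vector, minus the second, plus the third equals $\bs{0}$.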
Definition
An indexed set of vectors $\{\bs{v}_1, \ldots, \bs{v}_p\}$ in $\R^n$
is said to be linearly independent if the vector equation
$$
x_1\bs{v}_1 + x_2\bs{v}_2 + \ldots + x_p\bs{v}_p = \bs{0}
$$
has only the trivial solution. The set $\{\bs{v}_1, \ldots, \bs{v}_p\}$ is
said to be linearly dependent if there exist weights $c_1,\ldots,c_p$, not all 0,
such that
$$
c_1\bs{v}_1 + c_2\bs{v}_2 + \ldots + c_p\bs{v}_p = \bs{0}.
$$
Linear Independence of Matrix Columns
Suppose that we have the matrix $A = [\bs{a}_1 \;\cdots\; \bs{a}_n]$.
The matrix equation $A\bs{x} = \bs{0}$ can be written as:
$$
x_1\bs{a}_1 + x_2\bs{a}_2 + \ldots + x_n\bs{a}_n = \bs{0}.
$$
From the definition: each linear dependence relation among the columns of $A$ corresponds to
a nontrivial solution of $A\bs{x} = \bs{0}$. This can be summarized as the following fact:
Fact
The columns of a matrix $A$ are linearly independent if and only if the equation
$A\bs{x} = \bs{0}$ has only the trivial solution.
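This fact also gives a quick computational test. A minimal sketch, assuming SymPy is available
(the matrix is the one from the opening example): the columns are linearly independent exactly
when every column is a pivot column of the reduced row echelon form.
```python
# Minimal sketch (assumes SymPy): the columns of A are linearly independent
# iff A x = 0 has only the trivial solution, i.e. iff every column of A is
# a pivot column of its reduced row echelon form.
from sympy import Matrix

A = Matrix([[1, 4, 2],
            [2, 5, 1],
            [3, 6, 0]])  # matrix from the opening example

_, pivot_cols = A.rref()            # rref and the indices of the pivot columns
print(len(pivot_cols) == A.cols)    # False: only 2 pivot columns for 3 columns
```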
Sets of One or Two Vectors
A vector $\bs{v}$ is linearly independent iff $\bs{v} \neq \bs{0}$, because
the vector equation $x_1\bs{v} = \bs{0}$ only has the trivial solution
when $\bs{v} \neq \bs{0}$. The zero vector $\bs{0}$ is linearly dependent
because $x_1\bs{0} = \bs{0}$ has many nontrivial solutions; $x_1 = 2$ or $x_1 = 5$, for instance.
Example
The following are linearly dependent:
$$
\bs{v}_1 =
\begin{bmatrix*}[r]
3 \\
1
\end{bmatrix*},
\quad
\bs{v}_2 =
\begin{bmatrix*}[r]
6 \\
2
\end{bmatrix*}
$$
because $\bs{v}_2 = 2\bs{v}_1$ which means $-2\bs{v}_1 + \bs{v}_2 = \bs{0}$.
The following are linearly independent:
$$
\bs{v}_1 =
\begin{bmatrix*}[r]
3 \\
2
\end{bmatrix*},
\quad
\bs{v}_2 =
\begin{bmatrix*}[r]
6 \\
2
\end{bmatrix*}
$$
because there is no factor $c$ such that $c\bs{v}_1 = \bs{v}_2$.
This shows that we can determine by inspection whether two vectors are linearly independent:
just check whether one is a multiple of the other.
Fact
A set of two vectors $\{\bs{v}_1, \bs{v}_2\}$ is linearly dependent
if at least one of the vectors is a multiple of the other. The set is linearly independent
iff neither of the vectors is a multiple of the other.
Sets of Two or More Vectors
Some results relating to linearly independent vectors.
Theorem 1.7 - Characterization of Linearly Dependent Sets
An indexed set $S = \{\bs{v}_1, \ldots, \bs{v}_p\}$ of two or more
vectors is linearly dependent iff at least one of the vectors in $S$ is a linear
combination of the others. In fact, if $S$ is linearly dependent and $\bs{v}_1 \neq \bs{0}$,
then some $\bs{v}_j$ (with $j > 1$) is a linear combination of
the preceding vectors $\bs{v}_1, \ldots, \bs{v}_{j-1}$.
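For instance, with $\bs{v}_1 = (1, 2, 3)$, $\bs{v}_2 = (4, 5, 6)$ and $\bs{v}_3 = (2, 1, 0)$
from the opening example, the dependence relation $2\bs{v}_1 - \bs{v}_2 + \bs{v}_3 = \bs{0}$
can be solved for the last vector:
$$
\bs{v}_3 = -2\bs{v}_1 + \bs{v}_2,
$$
so some vector in the set (here $\bs{v}_3$) is a linear combination of the preceding ones.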
Theorem 1.8
If a set contains more vectors than there are entries in each vector, then
the set is linearly dependent. That is, any set $\{\bs{v}_1, \ldots, \bs{v}_p\}$
in $\R^n$ is linearly dependent if $p > n$.
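For example, three vectors in $\R^2$ always form a linearly dependent set ($p = 3 > 2 = n$);
for the set $\{(1, 0), (0, 1), (2, 3)\}$ one dependence relation is
$$
2
\begin{bmatrix*}[r]
1 \\
0
\end{bmatrix*}
+ 3
\begin{bmatrix*}[r]
0 \\
1
\end{bmatrix*}
-
\begin{bmatrix*}[r]
2 \\
3
\end{bmatrix*}
=
\begin{bmatrix*}[r]
0 \\
0
\end{bmatrix*}.
$$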
Theorem 1.9
If a set $S = \{\bs{v}_1, \ldots, \bs{v}_p\}$ in $\R^n$ contains
the zero vector, then the set is linearly dependent.
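This is because the zero vector always supplies a nontrivial dependence relation;
if, say, $\bs{v}_p = \bs{0}$, then
$$
0\bs{v}_1 + 0\bs{v}_2 + \ldots + 0\bs{v}_{p-1} + 1\cdot\bs{v}_p = \bs{0},
$$
where the weight on $\bs{v}_p$ is nonzero.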
Exercise 1
Determine if the vectors are linearly independent.
$$
\begin{bmatrix*}[r]
5 \\
0 \\
0
\end{bmatrix*},
\begin{bmatrix*}[r]
7 \\
2 \\
-6
\end{bmatrix*},
\begin{bmatrix*}[r]
9 \\
4 \\
-8
\end{bmatrix*}
$$
Answer
Neither column 2 nor column 3 is a multiple of column 1, and column 3 is not a linear combination of columns 1 and 2.
By inspection, we see that these vectors are linearly independent. Verifying with row operations:
$$
\begin{bmatrix*}[rrr]
5 & 7 & 9 \\
0 & 2 & 4 \\
0 & -6 & -8
\end{bmatrix*}
\sim
\begin{bmatrix*}[rrr]
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1
\end{bmatrix*}
$$
■
Exercise 7
Determine if the columns form a linearly independent set.
$$
\begin{bmatrix*}[rrrr]
1 & 4 & -3 & 0 \\
-2 & -7 & 5 & 1 \\
-4 & -5 & 7 & 5
\end{bmatrix*}
$$
Answer
Here we see that we have 4 columns and only 3 entries in each vector.
By Theorem 1.8, the set of columns is linearly dependent. Verifying with
row operations:
$$
\begin{bmatrix*}[rrrr]
1 & 4 & -3 & 0 \\
-2 & -7 & 5 & 1 \\
-4 & -5 & 7 & 5
\end{bmatrix*}
\sim
\begin{bmatrix*}[rrrr]
1 & 0 & 0 & -3 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & -1
\end{bmatrix*}
$$
The fourth column of the reduced matrix shows that $\bs{a}_4 = -3\bs{a}_1 - \bs{a}_3$,
i.e. $(-3)\bs{a}_1 + (-1)\bs{a}_3 = \bs{a}_4$. Checking this result:
$$
(-3)\bs{a}_1 + (-1)\bs{a}_3
=
(-3)
\begin{bmatrix*}[r]
1 \\
-2 \\
-4
\end{bmatrix*}
+
(-1)
\begin{bmatrix*}[r]
-3 \\
5 \\
7
\end{bmatrix*}
=
\begin{bmatrix*}[r]
-3 \\
6 \\
12
\end{bmatrix*}
+
\begin{bmatrix*}[r]
3 \\
-5 \\
-7
\end{bmatrix*}
=
\begin{bmatrix*}[r]
0 \\
1 \\
5
\end{bmatrix*}
=
\bs{a}_4
$$
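The relation can also be read off computationally; a sketch assuming SymPy is available: any
nonzero vector in the null space of the matrix lists the weights of a dependence relation among
its columns.
```python
# Sketch (assumes SymPy): a nonzero vector in the null space of A gives the
# weights of a linear dependence relation among the columns of A.
from sympy import Matrix

A = Matrix([[ 1,  4, -3, 0],
            [-2, -7,  5, 1],
            [-4, -5,  7, 5]])

print(A.nullspace())  # one basis vector: (3, 0, 1, 1), i.e.
                      # 3 a1 + 0 a2 + 1 a3 + 1 a4 = 0, equivalently a4 = -3 a1 - a3
```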
■
Exercise 21
Verifying statements. Answer True or False and justify the answer.
(a)
The columns of a matrix A are linearly independent if the equation
Ax = 0 has the trivial solution.
Answer: False. The trivial solution always exists; the statement would be true if the trivial solution were the only solution.
(b)
If S is a linearly dependent set, then each vector is a linear combination of
the other vectors in S.
Answer: False. Not necessarily true for every vector. Only needs to be true for a single one.
(c)
The columns of any 4×5 matrix are linearly dependent.
Answer: True. A 4×5 matrix has 5 columns with only 4 entries each, so the number of vectors
exceeds the number of entries, and by Theorem 1.8 the columns are linearly dependent.
(d)
If x and y are linearly independent, and if {x, y, z}
is linearly dependent, then z is in Span{x, y}.
Answer: True. Since $\{\bs{x}, \bs{y}, \bs{z}\}$ is linearly dependent but $\bs{x}$ and $\bs{y}$
are linearly independent, the weight on $\bs{z}$ in a dependence relation must be nonzero.
Hence $\bs{z}$ can be written as $\bs{z} = a\bs{x} + b\bs{y}$, which by
definition means it lies in the plane spanned by $\bs{x}$ and $\bs{y}$.
■
Exercise 22
Verifying statements. Answer True or False and justify the answer.
(a)
Two vectors are linearly dependent iff they lie on a line through the origin.
Answer: True. Vectors on a common line through the origin are multiples of each other, and conversely, if one vector is a multiple of the other, then both lie on the same line through the origin.
(b)
If a set contains fewer vectors than there are entries in the vectors,
then the set is linearly independent.
Answer: False. Counterexample: (1, 0, 0, 0) and (2, 0, 0, 0). Only two vectors with four
entries, but they are linearly dependent.
(c)
If x and y are linearly independent, and if z is in Span{x, y},
then {x, y, z} is linearly dependent.
Answer: True. If $\bs{z}$ is in Span$\{\bs{x}, \bs{y}\}$, then $\bs{z} = a\bs{x} + b\bs{y}$
for some weights $a, b$, so $a\bs{x} + b\bs{y} - \bs{z} = \bs{0}$ is a dependence relation with
a nonzero weight on $\bs{z}$, which means $\{\bs{x}, \bs{y}, \bs{z}\}$ is linearly dependent.
(d)
If a set in $\R^n$ is linearly dependent, then the set contains more vectors than
there are entries in each vector.
Answer: False. Counterexample: (1, 0, 0), (0, 1, 0), (1, 1, 0). The set is linearly dependent,
yet the number of vectors equals the number of entries in each vector.
■
Exercise 39
Suppose $A$ is an $m \times n$ matrix with the property that for all
$\bs{b}$ in $\R^m$, the equation $A\bs{x} = \bs{b}$ has at most
one solution. Use the definition of linear independence to explain why
the columns of $A$ must be linearly independent.
Answer
This holds for any vector, including $\bs{0}$, so $A\bs{x} = \bs{0}$ has at most
one solution. Furthermore, since $\bs{x} = \bs{0}$ is a solution,
we can conclude that the trivial solution is the only solution! Hence the columns of $A$ are
linearly independent.
■
Exercise 40
Suppose an $m \times n$ matrix $A$ has $n$ pivot columns. Explain why for each
$\bs{b}$ in $\R^m$ the equation $A\bs{x} = \bs{b}$ has at most one solution.
[Hint: Explain why $A\bs{x} = \bs{b}$ cannot have infinitely many solutions.]
Answer
Since A has n pivot columns, we can conclude that m ≥ n: each pivot lies in its own row,
so n pivots require at least n rows.
In the case m = n, there is a pivot in every row and every column, so the columns are
linearly independent and $A\bs{x} = \bs{b}$ has exactly one solution.
In the case m > n, the columns have more entries than there are vectors. An illustrative example:
$$
\begin{bmatrix*}[r]
1 \\
0 \\
0 \\
0 \\
0
\end{bmatrix*},
\begin{bmatrix*}[r]
0 \\
0 \\
1 \\
0 \\
0
\end{bmatrix*},
\begin{bmatrix*}[r]
0 \\
0 \\
0 \\
0 \\
1
\end{bmatrix*}
$$
In either case, every column of A is a pivot column, so the system $A\bs{x} = \bs{b}$ has no
free variables. A consistent system with no free variables has exactly one solution, so
$A\bs{x} = \bs{b}$ cannot have infinitely many solutions: it has either no solution or exactly
one. Hence there is at most one solution.
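Equivalently, because the columns of A are linearly independent, any two solutions of
$A\bs{x} = \bs{b}$ must coincide:
$$
A\bs{u} = \bs{b} \text{ and } A\bs{v} = \bs{b}
\;\Rightarrow\;
A(\bs{u} - \bs{v}) = \bs{0}
\;\Rightarrow\;
\bs{u} - \bs{v} = \bs{0}
\;\Rightarrow\;
\bs{u} = \bs{v}.
$$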
■