Ch 7.3: Systems of Linear Equations,
Linear Independence, Eigenvalues
A system of n linear equations in n variables,
$$\begin{aligned}
a_{1,1}x_1 + a_{1,2}x_2 + \cdots + a_{1,n}x_n &= b_1,\\
a_{2,1}x_1 + a_{2,2}x_2 + \cdots + a_{2,n}x_n &= b_2,\\
&\;\;\vdots\\
a_{n,1}x_1 + a_{n,2}x_2 + \cdots + a_{n,n}x_n &= b_n,
\end{aligned}$$
can be expressed as a matrix equation Ax = b:
$$\begin{pmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,n}\\ a_{2,1} & a_{2,2} & \cdots & a_{2,n}\\ \vdots & & & \vdots\\ a_{n,1} & a_{n,2} & \cdots & a_{n,n} \end{pmatrix}
\begin{pmatrix} x_1\\ x_2\\ \vdots\\ x_n \end{pmatrix}
= \begin{pmatrix} b_1\\ b_2\\ \vdots\\ b_n \end{pmatrix}.$$
If b = 0, the system is homogeneous; otherwise it is nonhomogeneous.




Nonsingular Case
If the coefficient matrix A is nonsingular, then it is invertible and we can solve Ax = b as follows:
$$Ax = b \;\Rightarrow\; A^{-1}Ax = A^{-1}b \;\Rightarrow\; Ix = A^{-1}b \;\Rightarrow\; x = A^{-1}b.$$
This solution is therefore unique. Also, if b = 0, it follows that the unique solution of Ax = 0 is x = A^{-1}0 = 0.
Thus if A is nonsingular, then the only solution to Ax = 0 is the trivial solution x = 0.
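As a quick numerical illustration of this reasoning (a sketch only, assuming NumPy is available; the 2 x 2 matrix and right-hand side are arbitrary illustrative values, not taken from the text), a nonsingular system can be solved either by forming A^{-1} explicitly or, preferably, with a direct solver:

```python
import numpy as np

# Illustrative nonsingular system (values chosen only for this sketch)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

assert not np.isclose(np.linalg.det(A), 0.0)   # A is nonsingular

x_inv = np.linalg.inv(A) @ b       # x = A^{-1} b, as in the derivation above
x_solve = np.linalg.solve(A, b)    # same unique solution, computed by elimination
print(x_inv, x_solve)
```

In practice np.linalg.solve is preferred over forming the inverse, but both return the same unique solution whenever A is nonsingular.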
Example 1: Nonsingular Case (1 of 3)
From a previous example, we know that the matrix A below is nonsingular, with inverse as given:
$$A = \begin{pmatrix} 0 & 1 & 2\\ 1 & 0 & 3\\ 4 & -3 & 8 \end{pmatrix}, \qquad
A^{-1} = \begin{pmatrix} -9/2 & 7 & -3/2\\ -2 & 4 & -1\\ 3/2 & -2 & 1/2 \end{pmatrix}.$$
Using the definition of matrix multiplication, it follows that the only solution of Ax = 0 is x = 0:
$$x = A^{-1}\mathbf{0} = \begin{pmatrix} -9/2 & 7 & -3/2\\ -2 & 4 & -1\\ 3/2 & -2 & 1/2 \end{pmatrix}\begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix} = \begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix}.$$
Example 1: Nonsingular Case (2 of 3)
Now let's solve the nonhomogeneous linear system Ax = b below using A^{-1}:
$$\begin{aligned}
x_2 + 2x_3 &= 2\\
x_1 + 3x_3 &= -2\\
4x_1 - 3x_2 + 8x_3 &= 0
\end{aligned}$$
This system of equations can be written as Ax = b, where
$$A = \begin{pmatrix} 0 & 1 & 2\\ 1 & 0 & 3\\ 4 & -3 & 8 \end{pmatrix}, \qquad
x = \begin{pmatrix} x_1\\ x_2\\ x_3 \end{pmatrix}, \qquad
b = \begin{pmatrix} 2\\ -2\\ 0 \end{pmatrix}.$$
Then
$$x = A^{-1}b = \begin{pmatrix} -9/2 & 7 & -3/2\\ -2 & 4 & -1\\ 3/2 & -2 & 1/2 \end{pmatrix}\begin{pmatrix} 2\\ -2\\ 0 \end{pmatrix} = \begin{pmatrix} -23\\ -12\\ 7 \end{pmatrix}.$$
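A short numerical check of this computation (a sketch assuming NumPy; the matrix and right-hand side are exactly those of the example above):

```python
import numpy as np

A = np.array([[0., 1., 2.],
              [1., 0., 3.],
              [4., -3., 8.]])
b = np.array([2., -2., 0.])

A_inv = np.linalg.inv(A)
print(A_inv)                  # matches the inverse quoted above
print(A_inv @ b)              # [-23. -12.   7.]
print(np.linalg.solve(A, b))  # same solution without forming A^{-1}
```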
Example 1: Nonsingular Case (3 of 3)
Alternatively, we could solve the nonhomogeneous linear system Ax = b below using row reduction.
$$\begin{aligned}
x_2 + 2x_3 &= 2\\
x_1 + 3x_3 &= -2\\
4x_1 - 3x_2 + 8x_3 &= 0
\end{aligned}$$
To do so, form the augmented matrix (A|b) and reduce, using elementary row operations:
$$(A\,|\,b) = \left(\begin{array}{ccc|c} 0 & 1 & 2 & 2\\ 1 & 0 & 3 & -2\\ 4 & -3 & 8 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 0 & 3 & -2\\ 0 & 1 & 2 & 2\\ 4 & -3 & 8 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 0 & 3 & -2\\ 0 & 1 & 2 & 2\\ 0 & -3 & -4 & 8 \end{array}\right)$$
$$\rightarrow \left(\begin{array}{ccc|c} 1 & 0 & 3 & -2\\ 0 & 1 & 2 & 2\\ 0 & 0 & 2 & 14 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 0 & 3 & -2\\ 0 & 1 & 2 & 2\\ 0 & 0 & 1 & 7 \end{array}\right)
\;\Rightarrow\;
\begin{aligned} x_1 + 3x_3 &= -2\\ x_2 + 2x_3 &= 2\\ x_3 &= 7 \end{aligned}
\;\Rightarrow\;
x = \begin{pmatrix} -23\\ -12\\ 7 \end{pmatrix}.$$
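For completeness, here is a minimal row-reduction sketch (assuming NumPy; Gauss-Jordan elimination with partial pivoting, written for clarity rather than efficiency) applied to the same augmented matrix:

```python
import numpy as np

def solve_by_row_reduction(A, b):
    """Reduce the augmented matrix (A|b) to reduced row echelon form."""
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for k in range(n):
        p = k + np.argmax(np.abs(M[k:, k]))  # pivot row (largest entry in column k)
        M[[k, p]] = M[[p, k]]                # swap rows k and p
        M[k] /= M[k, k]                      # scale the pivot row
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]       # eliminate column k from the other rows
    return M[:, -1]                          # the solution column

A = np.array([[0., 1., 2.], [1., 0., 3.], [4., -3., 8.]])
b = np.array([2., -2., 0.])
print(solve_by_row_reduction(A, b))          # [-23. -12.   7.]
```

Note that this sketch assumes A is nonsingular; for a singular A the pivot M[k, k] can be (numerically) zero and the division fails, which is exactly the situation discussed next.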
Singular Case
If the coefficient matrix A is singular, then A^{-1} does not exist, and either a solution to Ax = b does not exist, or there is more than one solution (it is not unique).
Further, the homogeneous system Ax = 0 has more than one solution. That is, in addition to the trivial solution x = 0, there are infinitely many nontrivial solutions.
The nonhomogeneous case Ax = b has no solution unless (b, y) = 0 for all vectors y satisfying A*y = 0, where A* is the adjoint of A.
In this case, Ax = b has infinitely many solutions, each of the form x = x^{(0)} + ξ, where x^{(0)} is a particular solution of Ax = b and ξ is any solution of Ax = 0.
Example 2: Singular Case (1 of 3)
Solve the nonhomogeneous linear system Ax = b below using row reduction.
$$\begin{aligned}
x_1 - 2x_2 - x_3 &= 1\\
-x_1 + 5x_2 + 6x_3 &= 0\\
5x_1 - 4x_2 + 5x_3 &= -1
\end{aligned}$$
To do so, form the augmented matrix (A|b) and reduce, using elementary row operations:
$$(A\,|\,b) = \left(\begin{array}{ccc|c} 1 & -2 & -1 & 1\\ -1 & 5 & 6 & 0\\ 5 & -4 & 5 & -1 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & 1\\ 0 & 3 & 5 & 1\\ 0 & 6 & 10 & -6 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & 1\\ 0 & 3 & 5 & 1\\ 0 & 3 & 5 & -3 \end{array}\right)$$
$$\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & 1\\ 0 & 3 & 5 & 1\\ 0 & 0 & 0 & -4 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & 1\\ 0 & 3 & 5 & 1\\ 0 & 0 & 0 & 1 \end{array}\right)
\;\Rightarrow\;
\begin{aligned} x_1 - 2x_2 - x_3 &= 1\\ 3x_2 + 5x_3 &= 1\\ 0 &= 1 \end{aligned}
\;\Rightarrow\; \text{no solution.}$$
Example 2: Singular Case (2 of 3)
Now examine the same system Ax = b with a general right-hand side b, to see when solutions exist:
$$\begin{aligned}
x_1 - 2x_2 - x_3 &= b_1\\
-x_1 + 5x_2 + 6x_3 &= b_2\\
5x_1 - 4x_2 + 5x_3 &= b_3
\end{aligned}$$
Reduce the augmented matrix (A|b) as follows:
$$(A\,|\,b) = \left(\begin{array}{ccc|c} 1 & -2 & -1 & b_1\\ -1 & 5 & 6 & b_2\\ 5 & -4 & 5 & b_3 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & b_1\\ 0 & 3 & 5 & b_1 + b_2\\ 0 & 6 & 10 & b_3 - 5b_1 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & b_1\\ 0 & 3 & 5 & b_1 + b_2\\ 0 & 3 & 5 & \tfrac{1}{2}b_3 - \tfrac{5}{2}b_1 \end{array}\right)$$
$$\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & b_1\\ 0 & 3 & 5 & b_1 + b_2\\ 0 & 0 & 0 & \tfrac{1}{2}b_3 - b_2 - \tfrac{7}{2}b_1 \end{array}\right)
\;\Rightarrow\; \text{solutions exist only if } b_3 - 2b_2 - 7b_1 = 0.$$
Example 2: Singular Case (3 of 3)
From the previous slide, we require
$$b_3 - 2b_2 - 7b_1 = 0.$$
Suppose
$$b_1 = 1, \qquad b_2 = -1, \qquad b_3 = 5.$$
Then the reduced augmented matrix (A|b) becomes:
$$\left(\begin{array}{ccc|c} 1 & -2 & -1 & 1\\ 0 & 3 & 5 & 0\\ 0 & 0 & 0 & 0 \end{array}\right)
\;\Rightarrow\;
\begin{aligned} x_1 - 2x_2 - x_3 &= 1\\ 3x_2 + 5x_3 &= 0\\ 0 &= 0 \end{aligned}
\;\Rightarrow\;
x = \begin{pmatrix} 1 - \tfrac{7}{3}x_3\\ -\tfrac{5}{3}x_3\\ x_3 \end{pmatrix}
= \begin{pmatrix} 1\\ 0\\ 0 \end{pmatrix} + c\begin{pmatrix} -7\\ -5\\ 3 \end{pmatrix}
= x^{(0)} + \xi, \qquad c \text{ arbitrary}.$$
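As a numerical cross-check (a sketch assuming NumPy; extracting the null vector from the SVD is just one convenient way to do it), the condition (b, y) = 0 for y satisfying A^T y = 0 reproduces the requirement b_3 - 2b_2 - 7b_1 = 0 found above:

```python
import numpy as np

A = np.array([[1., -2., -1.],
              [-1., 5., 6.],
              [5., -4., 5.]])

# y spans the null space of A^T (the adjoint of a real matrix): it is the
# right-singular vector of A^T belonging to the (numerically) zero singular value.
y = np.linalg.svd(A.T)[2][-1]
print(y / y[2])    # proportional to (-7, -2, 1), i.e. (b, y) = 0  <=>  b3 - 2b2 - 7b1 = 0

b_good = np.array([1., -1., 5.])   # the right-hand side chosen above
b_bad = np.array([1., 0., -1.])    # the right-hand side from part (1 of 3)
print(np.isclose(y @ b_good, 0.0), np.isclose(y @ b_bad, 0.0))   # True False
```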
Linear Dependence and Independence
A set of vectors x^{(1)}, x^{(2)}, ..., x^{(n)} is linearly dependent if there exist scalars c_1, c_2, ..., c_n, not all zero, such that
$$c_1 x^{(1)} + c_2 x^{(2)} + \cdots + c_n x^{(n)} = \mathbf{0}.$$
If the only solution of this equation is c_1 = c_2 = \cdots = c_n = 0, then x^{(1)}, x^{(2)}, ..., x^{(n)} is linearly independent.
Example 3: Linear Independence (1 of 2)
Determine whether the following vectors are linearly dependent or linearly independent.
$$x^{(1)} = \begin{pmatrix} 0\\ 1\\ 4 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 1\\ 0\\ -3 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 2\\ 3\\ 8 \end{pmatrix}$$
We need to solve
$$c_1 x^{(1)} + c_2 x^{(2)} + c_3 x^{(3)} = \mathbf{0},$$
or
$$c_1\begin{pmatrix} 0\\ 1\\ 4 \end{pmatrix} + c_2\begin{pmatrix} 1\\ 0\\ -3 \end{pmatrix} + c_3\begin{pmatrix} 2\\ 3\\ 8 \end{pmatrix} = \begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix}
\;\Longleftrightarrow\;
\begin{pmatrix} 0 & 1 & 2\\ 1 & 0 & 3\\ 4 & -3 & 8 \end{pmatrix}\begin{pmatrix} c_1\\ c_2\\ c_3 \end{pmatrix} = \begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix}.$$
Example 3: Linear Independence (2 of 2)
We thus reduce the augmented matrix (A|b), as before:
$$\left(\begin{array}{ccc|c} 0 & 1 & 2 & 0\\ 1 & 0 & 3 & 0\\ 4 & -3 & 8 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 0 & 3 & 0\\ 0 & 1 & 2 & 0\\ 0 & 0 & 1 & 0 \end{array}\right)
\;\Rightarrow\;
\begin{aligned} c_1 + 3c_3 &= 0\\ c_2 + 2c_3 &= 0\\ c_3 &= 0 \end{aligned}
\;\Rightarrow\;
\begin{pmatrix} c_1\\ c_2\\ c_3 \end{pmatrix} = \begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix}.$$
Thus the only solution is c_1 = c_2 = c_3 = 0, and therefore the original vectors are linearly independent.
Example 4: Linear Dependence (1 of 2)
Determine whether the following vectors are linearly dependent or linearly independent.
$$x^{(1)} = \begin{pmatrix} 1\\ -1\\ 5 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} -2\\ 5\\ -4 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} -1\\ 6\\ 5 \end{pmatrix}$$
We need to solve
$$c_1 x^{(1)} + c_2 x^{(2)} + c_3 x^{(3)} = \mathbf{0},$$
or
$$c_1\begin{pmatrix} 1\\ -1\\ 5 \end{pmatrix} + c_2\begin{pmatrix} -2\\ 5\\ -4 \end{pmatrix} + c_3\begin{pmatrix} -1\\ 6\\ 5 \end{pmatrix} = \begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix}
\;\Longleftrightarrow\;
\begin{pmatrix} 1 & -2 & -1\\ -1 & 5 & 6\\ 5 & -4 & 5 \end{pmatrix}\begin{pmatrix} c_1\\ c_2\\ c_3 \end{pmatrix} = \begin{pmatrix} 0\\ 0\\ 0 \end{pmatrix}.$$
Example 4: Linear Dependence (2 of 2)
We thus reduce the augmented matrix (A|b), as before:
$$\left(\begin{array}{ccc|c} 1 & -2 & -1 & 0\\ -1 & 5 & 6 & 0\\ 5 & -4 & 5 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & -2 & -1 & 0\\ 0 & 3 & 5 & 0\\ 0 & 0 & 0 & 0 \end{array}\right)
\;\Rightarrow\;
\begin{aligned} c_1 - 2c_2 - c_3 &= 0\\ 3c_2 + 5c_3 &= 0\\ 0 &= 0 \end{aligned}
\;\Rightarrow\;
\begin{pmatrix} c_1\\ c_2\\ c_3 \end{pmatrix} = \begin{pmatrix} -\tfrac{7}{3}c_3\\ -\tfrac{5}{3}c_3\\ c_3 \end{pmatrix} = k\begin{pmatrix} 7\\ 5\\ -3 \end{pmatrix}, \qquad k \text{ arbitrary}.$$
Thus the original vectors are linearly dependent, with
$$7\,x^{(1)} + 5\,x^{(2)} - 3\,x^{(3)} = \mathbf{0}.$$
Linear Independence and Invertibility
Consider the previous two examples:
The first matrix was known to be nonsingular, and its column vectors
were linearly independent.
The second matrix was known to be singular, and its column vectors
were linearly dependent.
This is true in general: the columns (or rows) of A are linearly independent iff A is nonsingular iff A^{-1} exists.
Also, A is nonsingular iff det A ≠ 0, hence the columns (or rows) of A are linearly independent iff det A ≠ 0.
Further, if A = BC, then det(A) = det(B)det(C). Thus if the columns (or rows) of A and B are linearly independent, then det(C) = det(A)/det(B) ≠ 0, so the columns (or rows) of C are also linearly independent.
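A quick numerical illustration of this equivalence (a sketch assuming NumPy; the matrices are the ones whose columns were tested in Examples 3 and 4):

```python
import numpy as np

A3 = np.array([[0., 1., 2.], [1., 0., 3.], [4., -3., 8.]])    # columns from Example 3
A4 = np.array([[1., -2., -1.], [-1., 5., 6.], [5., -4., 5.]]) # columns from Example 4

for A in (A3, A4):
    det = np.linalg.det(A)
    rank = np.linalg.matrix_rank(A)
    independent = not np.isclose(det, 0.0)   # det != 0  <=>  columns independent
    print(f"det = {det:6.2f}  rank = {rank}  columns independent: {independent}")
# Example 3 matrix: det = -2, rank 3 -> independent
# Example 4 matrix: det =  0, rank 2 -> dependent
```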
Linear Dependence & Vector Functions
Now consider vector functions x^{(1)}(t), x^{(2)}(t), ..., x^{(n)}(t), where
$$x^{(k)}(t) = \begin{pmatrix} x_1^{(k)}(t)\\ x_2^{(k)}(t)\\ \vdots\\ x_m^{(k)}(t) \end{pmatrix}, \qquad k = 1, 2, \ldots, n, \quad t \in I = (\alpha, \beta).$$
As before, x^{(1)}(t), x^{(2)}(t), ..., x^{(n)}(t) is linearly dependent on I if there exist scalars c_1, c_2, ..., c_n, not all zero, such that
$$c_1 x^{(1)}(t) + c_2 x^{(2)}(t) + \cdots + c_n x^{(n)}(t) = \mathbf{0} \quad \text{for all } t \in I.$$
Otherwise x^{(1)}(t), x^{(2)}(t), ..., x^{(n)}(t) is linearly independent on I.
See the text for more discussion of this.
Eigenvalues and Eigenvectors
The eqn. Ax = y can be viewed as a linear transformation
that maps (or transforms) x into a new vector y.
Nonzero vectors x that transform into multiples of
themselves are important in many applications.
Thus we solve Ax = λx or equivalently, (A-λI)x = 0.
This equation has a nonzero solution if we choose λ such
that det(A-λI) = 0.
Such values of λ are called eigenvalues of A, and the
nonzero solutions x are called eigenvectors.
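In other words, the eigenvalues are the roots of the characteristic polynomial det(A − λI). A small sketch of this (assuming NumPy; np.poly applied to a square matrix returns the coefficients of its characteristic polynomial), using the matrix of Example 5 below:

```python
import numpy as np

A = np.array([[2., 3.],
              [3., -6.]])          # the matrix used in Example 5 below

coeffs = np.poly(A)                # coefficients of det(lambda*I - A)
print(coeffs)                      # [ 1.  4. -21.]  i.e. lambda^2 + 4*lambda - 21
print(np.roots(coeffs))            # the eigenvalues, 3 and -7 (order may vary)
```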
Example 5: Eigenvalues (1 of 3)
Find the eigenvalues and eigenvectors of the matrix A.
$$A = \begin{pmatrix} 2 & 3\\ 3 & -6 \end{pmatrix}$$
Solution: Choose λ such that det(A − λI) = 0, as follows.
$$\det(A - \lambda I) = \det\left[\begin{pmatrix} 2 & 3\\ 3 & -6 \end{pmatrix} - \lambda\begin{pmatrix} 1 & 0\\ 0 & 1 \end{pmatrix}\right]
= \det\begin{pmatrix} 2-\lambda & 3\\ 3 & -6-\lambda \end{pmatrix}
= (2-\lambda)(-6-\lambda) - (3)(3)$$
$$= \lambda^2 + 4\lambda - 21 = (\lambda - 3)(\lambda + 7)
\;\Rightarrow\; \lambda_1 = 3,\ \lambda_2 = -7.$$
Example 5: First Eigenvector (2 of 3)
To find the eigenvectors of the matrix A, we need to solve
(A-λI)x = 0 for λ = 3 and λ = -7.
Eigenvector for λ = 3: Solve
$$(A - \lambda I)x = \mathbf{0} \;\Longleftrightarrow\;
\begin{pmatrix} 2-3 & 3\\ 3 & -6-3 \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}
\;\Longleftrightarrow\;
\begin{pmatrix} -1 & 3\\ 3 & -9 \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}$$
by row reducing the augmented matrix:
$$\left(\begin{array}{cc|c} -1 & 3 & 0\\ 3 & -9 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{cc|c} 1 & -3 & 0\\ 3 & -9 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{cc|c} 1 & -3 & 0\\ 0 & 0 & 0 \end{array}\right)
\;\Rightarrow\; x_1 = 3x_2$$
$$\Rightarrow\; x^{(1)} = \begin{pmatrix} 3x_2\\ x_2 \end{pmatrix} = c\begin{pmatrix} 3\\ 1 \end{pmatrix}, \quad x_2 \text{ arbitrary; choose } x^{(1)} = \begin{pmatrix} 3\\ 1 \end{pmatrix}.$$
Example 5: Second Eigenvector (3 of 3)
Eigenvector for λ = −7: Solve
$$(A - \lambda I)x = \mathbf{0} \;\Longleftrightarrow\;
\begin{pmatrix} 2+7 & 3\\ 3 & -6+7 \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}
\;\Longleftrightarrow\;
\begin{pmatrix} 9 & 3\\ 3 & 1 \end{pmatrix}\begin{pmatrix} x_1\\ x_2 \end{pmatrix} = \begin{pmatrix} 0\\ 0 \end{pmatrix}$$
by row reducing the augmented matrix:
$$\left(\begin{array}{cc|c} 9 & 3 & 0\\ 3 & 1 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{cc|c} 1 & 1/3 & 0\\ 3 & 1 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{cc|c} 1 & 1/3 & 0\\ 0 & 0 & 0 \end{array}\right)
\;\Rightarrow\; x_1 = -\tfrac{1}{3}x_2$$
$$\Rightarrow\; x^{(2)} = \begin{pmatrix} -x_2/3\\ x_2 \end{pmatrix} = c\begin{pmatrix} -1/3\\ 1 \end{pmatrix}, \quad x_2 \text{ arbitrary; choose } x^{(2)} = \begin{pmatrix} 1\\ -3 \end{pmatrix}.$$
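The same eigenpairs can be checked numerically (a sketch assuming NumPy; note that np.linalg.eig returns unit-length eigenvector columns, so they agree with the hand-computed (3, 1) and (1, −3) only up to a nonzero scalar factor):

```python
import numpy as np

A = np.array([[2., 3.],
              [3., -6.]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                           # 3 and -7 (order not guaranteed)

for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)   # A x = lambda x
    print(lam, v / v[0])                 # rescaled so the first entry is 1
```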
Normalized Eigenvectors
From the previous example, we see that eigenvectors are
determined up to a nonzero multiplicative constant.
If this constant is specified in some particular way, then the
eigenvector is said to be normalized.
For example, eigenvectors are sometimes normalized by choosing the constant so that ||x|| = (x, x)^{1/2} = 1.
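For instance, the eigenvector (3, 1) from Example 5 can be normalized to unit length (a one-line sketch assuming NumPy):

```python
import numpy as np

x = np.array([3., 1.])
x_hat = x / np.linalg.norm(x)            # divide by (x, x)^(1/2) = sqrt(10)
print(x_hat, np.linalg.norm(x_hat))      # a unit eigenvector, norm 1.0
```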
Algebraic and Geometric Multiplicity
In finding the eigenvalues λ of an n x n matrix A, we solve
det(A-λI) = 0.
Since this involves finding the determinant of an n x n
matrix, the problem reduces to finding roots of an nth
degree polynomial.
Denote these roots, or eigenvalues, by λ1, λ2, …, λn.
If an eigenvalue is repeated m times, then its algebraic
multiplicity is m.
Each eigenvalue has at least one eigenvector, and an eigenvalue of algebraic multiplicity m may have q linearly independent eigenvectors, 1 ≤ q ≤ m, where q is called the geometric multiplicity of the eigenvalue.
Eigenvectors and Linear Independence
If an eigenvalue λ has algebraic multiplicity 1, then it is
said to be simple, and the geometric multiplicity is 1 also.
If each eigenvalue of an n x n matrix A is simple, then A
has n distinct eigenvalues. It can be shown that the n
eigenvectors corresponding to these eigenvalues are linearly
independent.
If a matrix has one or more repeated eigenvalues, then there may be fewer than n linearly independent eigenvectors, since for each repeated eigenvalue we may have q < m.
This may lead to complications in solving systems of
differential equations.
Example 6: Eigenvalues (1 of 5)
Find the eigenvalues and eigenvectors of the matrix A.
$$A = \begin{pmatrix} 0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0 \end{pmatrix}$$
Solution: Choose λ such that det(A − λI) = 0, as follows.
$$\det(A - \lambda I) = \det\begin{pmatrix} -\lambda & 1 & 1\\ 1 & -\lambda & 1\\ 1 & 1 & -\lambda \end{pmatrix}
= -\lambda^3 + 3\lambda + 2
= -(\lambda - 2)(\lambda + 1)^2
\;\Rightarrow\; \lambda_1 = 2,\ \lambda_2 = -1,\ \lambda_3 = -1.$$
Example 6: First Eigenvector (2 of 5)
Eigenvector for λ = 2: Solve (A − λI)x = 0, as follows.
$$\left(\begin{array}{ccc|c} -2 & 1 & 1 & 0\\ 1 & -2 & 1 & 0\\ 1 & 1 & -2 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 1 & -2 & 0\\ 1 & -2 & 1 & 0\\ -2 & 1 & 1 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 1 & -2 & 0\\ 0 & -3 & 3 & 0\\ 0 & 3 & -3 & 0 \end{array}\right)$$
$$\rightarrow \left(\begin{array}{ccc|c} 1 & 1 & -2 & 0\\ 0 & 1 & -1 & 0\\ 0 & 0 & 0 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 0 & -1 & 0\\ 0 & 1 & -1 & 0\\ 0 & 0 & 0 & 0 \end{array}\right)
\;\Rightarrow\;
\begin{aligned} x_1 - x_3 &= 0\\ x_2 - x_3 &= 0 \end{aligned}$$
$$\Rightarrow\; x = \begin{pmatrix} x_3\\ x_3\\ x_3 \end{pmatrix} = c\begin{pmatrix} 1\\ 1\\ 1 \end{pmatrix}, \quad x_3 \text{ arbitrary; choose } x^{(1)} = \begin{pmatrix} 1\\ 1\\ 1 \end{pmatrix}.$$
Example 6: 2nd and 3rd Eigenvectors (3 of 5)
Eigenvector for λ = −1: Solve (A − λI)x = 0, as follows.
$$\left(\begin{array}{ccc|c} 1 & 1 & 1 & 0\\ 1 & 1 & 1 & 0\\ 1 & 1 & 1 & 0 \end{array}\right)
\rightarrow \left(\begin{array}{ccc|c} 1 & 1 & 1 & 0\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end{array}\right)
\;\Rightarrow\; x_1 + x_2 + x_3 = 0$$
$$\Rightarrow\; x = \begin{pmatrix} -x_2 - x_3\\ x_2\\ x_3 \end{pmatrix}
= x_2\begin{pmatrix} -1\\ 1\\ 0 \end{pmatrix} + x_3\begin{pmatrix} -1\\ 0\\ 1 \end{pmatrix}, \quad x_2, x_3 \text{ arbitrary; choose }
x^{(2)} = \begin{pmatrix} 1\\ 0\\ -1 \end{pmatrix}, \quad x^{(3)} = \begin{pmatrix} 0\\ 1\\ -1 \end{pmatrix}.$$
Example 6: Eigenvectors of A (4 of 5)
Thus three eigenvectors of
$$A = \begin{pmatrix} 0 & 1 & 1\\ 1 & 0 & 1\\ 1 & 1 & 0 \end{pmatrix}$$
are
$$x^{(1)} = \begin{pmatrix} 1\\ 1\\ 1 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 1\\ 0\\ -1 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 0\\ 1\\ -1 \end{pmatrix},$$
where x^{(2)}, x^{(3)} correspond to the double eigenvalue λ = −1.
It can be shown that x^{(1)}, x^{(2)}, x^{(3)} are linearly independent.
Hence A is a 3 x 3 symmetric matrix (A = A^T) with 3 real eigenvalues and 3 linearly independent eigenvectors.
Example 6: Eigenvectors of A (5 of 5)
Note that we could instead have chosen
$$x^{(1)} = \begin{pmatrix} 1\\ 1\\ 1 \end{pmatrix}, \qquad
x^{(2)} = \begin{pmatrix} 1\\ 0\\ -1 \end{pmatrix}, \qquad
x^{(3)} = \begin{pmatrix} 1\\ -2\\ 1 \end{pmatrix}.$$
Then the eigenvectors are orthogonal, since
$$\left(x^{(1)}, x^{(2)}\right) = 0, \qquad \left(x^{(1)}, x^{(3)}\right) = 0, \qquad \left(x^{(2)}, x^{(3)}\right) = 0.$$
Thus A is a 3 x 3 symmetric matrix with 3 real eigenvalues and 3 linearly independent orthogonal eigenvectors.
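For a symmetric matrix, NumPy's symmetric eigensolver returns an orthonormal set of eigenvectors directly (a sketch assuming NumPy; the matrix is the one from Example 6):

```python
import numpy as np

A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])

eigvals, V = np.linalg.eigh(A)            # eigh is for symmetric/Hermitian matrices
print(eigvals)                            # [-1. -1.  2.]
print(np.allclose(V.T @ V, np.eye(3)))    # True: the eigenvector columns are orthonormal
```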
Hermitian Matrices
A self-adjoint, or Hermitian, matrix satisfies A = A*, where we recall that the adjoint A* is the conjugate transpose of A: A* = Ā^T.
Thus for a Hermitian matrix, a_{ij} = ā_{ji}.
Note that if A has real entries and is symmetric (see the last example), then A is Hermitian.
An n x n Hermitian matrix A has the following properties:
All eigenvalues of A are real.
There exists a full set of n linearly independent eigenvectors of A.
If x^{(1)} and x^{(2)} are eigenvectors that correspond to different eigenvalues of A, then x^{(1)} and x^{(2)} are orthogonal.
Corresponding to an eigenvalue of algebraic multiplicity m, it is
possible to choose m mutually orthogonal eigenvectors, and hence
A has a full set of n linearly independent orthogonal eigenvectors.
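These properties can also be observed numerically (a sketch assuming NumPy; the 2 x 2 complex Hermitian matrix is an arbitrary illustrative choice, not taken from the text):

```python
import numpy as np

# A small complex Hermitian matrix: A equals its conjugate transpose.
A = np.array([[2.0 + 0.0j, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0 + 0.0j]])
assert np.allclose(A, A.conj().T)

eigvals, V = np.linalg.eigh(A)
print(eigvals)                                  # real eigenvalues despite complex entries
print(np.allclose(V.conj().T @ V, np.eye(2)))   # True: orthonormal eigenvectors
```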
