Peter J. Hammond
Corollary
The polynomial Pn(λ) can be factorized
as the product Pn(λ) ≡ ∏_{r=1}^{n} (λ − λr) of exactly n linear terms.
Proof.
The proof will be by induction on n.
When n = 1 one has P1 (λ) = λ + p0 , whose only root is λ = −p0 .
Suppose the result is true when n = m − 1.
By the fundamental theorem of algebra,
there exists λ̂ ∈ C such that Pm (λ̂) = 0.
Polynomial division gives Pm(λ) ≡ Pm−1(λ)(λ − λ̂), where Pm−1 has degree m − 1;
by the induction hypothesis Pm−1 factorizes into m − 1 linear terms, which completes the induction.
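The division step can be illustrated numerically. The sketch below uses synthetic division on an invented quadratic (the notes give no numerical example); `divide_out_root` is a hypothetical helper name:

```python
# Synthetic division: divide a monic polynomial by (λ − root).
# Coefficients are listed from highest to lowest degree.

def divide_out_root(coeffs, root):
    """Return coefficients of P(λ)/(λ − root), assuming P(root) = 0."""
    quotient = []
    carry = 0.0
    for c in coeffs[:-1]:          # the last coefficient feeds the remainder
        carry = c + root * carry
        quotient.append(carry)
    remainder = coeffs[-1] + root * carry
    assert abs(remainder) < 1e-12  # root really is a root of P
    return quotient

# P(λ) = λ² − 3λ + 2 = (λ − 1)(λ − 2); divide out the root λ̂ = 2
print(divide_out_root([1.0, -3.0, 2.0], 2.0))  # → [1.0, -1.0], i.e. λ − 1
```

Repeating this division m − 1 more times is exactly the induction in the proof.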
University of Warwick, EC9A0 Maths for Economists Peter J. Hammond 14 of 44
Characteristic Roots as Eigenvalues
Theorem
Every n × n matrix A ∈ Cn×n with complex elements
has exactly n eigenvalues (real or complex)
corresponding to the roots, counting multiple roots,
of the characteristic equation |A − λI| = 0.
Proof.
The characteristic equation can be written in the form Pn (λ) = 0
where Pn (λ) ≡ |λI − A| is a polynomial of degree n.
By the fundamental theorem of algebra, together with its corollary,
the polynomial |λI − A| equals the product ∏_{r=1}^{n} (λ − λr)
of n linear terms.
For any of these roots λr the matrix A − λr I is singular,
so there exists x ≠ 0 such that (A − λr I)x = 0 or Ax = λr x,
implying that λr is an eigenvalue.
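As an illustration of the theorem in the 2 × 2 case (the matrix below is an invented example, not from the notes), the characteristic equation reduces to a quadratic in the trace and determinant, whose roots are the eigenvalues:

```python
import cmath

# For a 2×2 matrix, |λI − A| = λ² − (trace)λ + det,
# so the eigenvalues are the two (possibly complex) roots of this quadratic.

def eigenvalues_2x2(a, b, c, d):
    """Roots of the characteristic polynomial of [[a, b], [c, d]]."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)   # complex sqrt handles complex roots
    return (tr + disc) / 2, (tr - disc) / 2

# The symmetric matrix ( 2 1; 1 2 ) has eigenvalues 3 and 1
print(eigenvalues_2x2(2, 1, 1, 2))  # → ((3+0j), (1+0j))
```

Counting both roots of the quadratic, with multiplicity, gives the "exactly n eigenvalues" of the theorem for n = 2.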
So we have
0 = ∑_{k=1}^{m−1} αk (λm − λk) xk
A = SΛS−1 ⇐⇒ AS = SΛ ⇐⇒ S−1 AS = Λ
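A minimal numerical check of these equivalences, in plain Python with an invented symmetric 2 × 2 example (eigenvalues 3 and 1, eigenvectors (1, 1)> and (1, −1)>):

```python
# Verify A = SΛS⁻¹ and S⁻¹AS = Λ for a hand-built 2×2 example.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2×2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

S = [[1, 1], [1, -1]]     # eigenvectors as columns
Lam = [[3, 0], [0, 1]]    # Λ = diag(3, 1)

A = matmul(matmul(S, Lam), inv2(S))
print(A)                               # → [[2.0, 1.0], [1.0, 2.0]]
print(matmul(inv2(S), matmul(A, S)))   # S⁻¹AS recovers Λ
```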
Theorem
Given any n × n matrix A:
1. The columns of any matrix S that diagonalizes A
must consist of n linearly independent eigenvectors of A.
2. The matrix A is diagonalizable if and only if
it has a set of n linearly independent eigenvectors.
3. The matrix A and its diagonalization Λ = S−1 AS
have the same set of eigenvalues.
Proof of Part 1
Suppose that AS = SΛ where A = (aij )n×n , S = (sij )n×n ,
and Λ = diag(λ1 , λ2 , . . . , λn ).
Then for each i, k ∈ {1, 2, . . . , n},
equating the elements in row i and column k
of the equal matrices AS and SΛ implies that
∑_{j=1}^{n} aij sjk = ∑_{j=1}^{n} sij δjk λk = sik λk
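The entrywise identity can be checked directly; the 2 × 2 example below (matrices invented for illustration) verifies ∑ⱼ aij sjk = sik λk for every i, k:

```python
# Check AS = SΛ entry by entry for A = ( 2 1; 1 2 ),
# whose eigen-pairs are (3, (1, 1)ᵀ) and (1, (1, −1)ᵀ).
A = [[2, 1], [1, 2]]
S = [[1, 1], [1, -1]]   # eigenvectors as columns
lam = [3, 1]            # diagonal of Λ

for i in range(2):
    for k in range(2):
        lhs = sum(A[i][j] * S[j][k] for j in range(2))  # row i, column k of AS
        assert lhs == S[i][k] * lam[k]                  # row i, column k of SΛ
print("AS = SΛ holds entry by entry")
```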
Example
The non-symmetric matrix A = ( 0 1; 0 0 ) cannot be diagonalized.
Its characteristic equation is 0 = |A − λI| = | −λ 1; 0 −λ | = λ².
It follows that λ = 0 is the unique eigenvalue.
The eigenvalue equation is 0 · (x1, x2)> = ( 0 1; 0 0 ) (x1, x2)> = (x2, 0)>,
or x2 = 0, whose only solutions take the form x1 (1, 0)>.
Thus, every eigenvector is a non-zero multiple
of the column vector (1, 0)> .
This makes it impossible to find
any set of two linearly independent eigenvectors.
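A short check of the example, confirming that only multiples of (1, 0)> solve the eigenvalue equation:

```python
# A = ( 0 1; 0 0 ) has λ = 0 as its only eigenvalue, so every
# eigenvector must solve A x = 0·x, i.e. x₂ = 0.
A = [[0, 1], [0, 0]]

def is_eigenvector(x, lam=0):
    Ax = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    return Ax == [lam * x[0], lam * x[1]]

print(is_eigenvector([1, 0]))   # → True  (a multiple of (1, 0)ᵀ)
print(is_eigenvector([0, 1]))   # → False (has x₂ ≠ 0)
```

Since the eigenspace is one-dimensional, no two eigenvectors are linearly independent, so no diagonalizing S exists.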
Hence Py ∈ L.
(I − P)² = I − 2P + P² = I − 2P + P = I − P
Hence I − P is a projection.
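A quick numerical sketch of this fact, using an invented orthogonal projection P onto the line spanned by (1, 1)> in R²:

```python
# P projects R² orthogonally onto the line spanned by (1, 1)ᵀ;
# we check that I − P is idempotent, i.e. (I − P)² = I − P.
P = [[0.5, 0.5], [0.5, 0.5]]
I = [[1.0, 0.0], [0.0, 1.0]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

IP = [[I[i][j] - P[i][j] for j in range(2)] for i in range(2)]
print(matmul(IP, IP) == IP)  # → True: I − P is itself a projection
```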
This projection is also orthogonal because if (I − P)v = 0
and u = (I − P)x for some x ∈ Rn , then
so y ∈ L⊥ .
Lemma
Every symmetric n × n matrix A:
1. has a maximum eigenvalue λ* at an eigenvector x* where f attains its maximum;
2. has a minimum eigenvalue λ_* at an eigenvector x_* where f attains its minimum;
3. satisfies A = λI if and only if λ* = λ_* = λ.
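A numerical sketch of parts 1 and 2, under the assumption that f is the quadratic form x ↦ x>Ax restricted to unit vectors (f is defined earlier in the notes; the symmetric matrix below, with eigenvalues 3 and 1, is an invented example):

```python
import math

# Sample f(x) = xᵀAx over unit vectors x = (cos θ, sin θ) in R².
# For a symmetric A the max and min sampled values approach the
# largest and smallest eigenvalues (here 3 and 1).
A = [[2, 1], [1, 2]]

def f(theta):
    x = (math.cos(theta), math.sin(theta))  # a unit vector
    return sum(A[i][j] * x[i] * x[j] for i in range(2) for j in range(2))

values = [f(2 * math.pi * k / 1000) for k in range(1000)]
print(round(max(values), 3), round(min(values), 3))  # → 3.0 1.0
```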
Lemma
Let A be a symmetric n × n matrix.
Suppose that there are m < n eigenvectors {uk}_{k=1}^{m}
which form an orthonormal set of vectors,
and which are the columns of an n × m matrix U.
Then there is at least one more eigenvector x
that satisfies U> x = 0
— i.e., it is orthogonal to each of the m eigenvectors uk .
Hence x is an eigenvector of A.
In both cases there is an eigenvector x of A satisfying U> x = 0m.
Spectral Theorem
Theorem
Given any symmetric n × n matrix A:
1. its eigenvectors span the whole of Rn ;
2. there exists an orthogonal matrix P that diagonalizes A.
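A concluding sketch of part 2 for an invented symmetric 2 × 2 matrix, with the normalized eigenvectors as the columns of an orthogonal P:

```python
import math

# A = ( 2 1; 1 2 ) has normalized eigenvectors (1,1)ᵀ/√2 and (1,−1)ᵀ/√2.
# With these as the columns of the orthogonal P, PᵀAP = Λ = diag(3, 1).
s = 1 / math.sqrt(2)
P = [[s, s], [s, -s]]
A = [[2, 1], [1, 2]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pt = [[P[j][i] for j in range(2)] for i in range(2)]  # transpose of P
Lam = matmul(Pt, matmul(A, P))
print([[round(v, 10) for v in row] for row in Lam])  # → [[3.0, 0.0], [0.0, 1.0]]
```

Note that no matrix inversion is needed: orthogonality means P⁻¹ = P>, which is what makes the symmetric case special.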