
1 Square Matrices

The matrix norm is induced by the vector norm. Suppose A is an n × n matrix and $\| \cdot \|$ is any vector norm; the induced matrix norm is
$$\|A\| = \max_{x \in \mathbb{R}^n,\ \|x\| = 1} \|Ax\|.$$
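As a quick illustration, the following sketch estimates the induced 2-norm of an arbitrary example matrix by maximizing $\|Ax\|$ over random unit vectors and compares the estimate with NumPy's exact value:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))          # arbitrary example matrix

# Crude estimate of max_{||x||=1} ||Ax|| using random unit vectors.
xs = rng.standard_normal((10000, 4))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
estimate = max(np.linalg.norm(A @ x) for x in xs)

exact = np.linalg.norm(A, 2)             # induced 2-norm (largest singular value)
print(estimate, exact)                   # estimate <= exact, and close to it
```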

Facts. Let A be an n × n matrix.

1. $\|A\| = \|A^T\|$.
2. A is nonsingular ⇔ $A^T$ is nonsingular.

3. A is singular if and only if it has an eigenvalue that is equal to zero.


4. The eigenvalues of a triangular matrix are equal to its diagonal entries.
5. If S is a nonsingular matrix and $B = SAS^{-1}$, then the eigenvalues of A and B coincide.
6. The eigenvalues of $cI + A$ are equal to $c + \lambda_1, \ldots, c + \lambda_n$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of A.
7. The eigenvalues of $A^k$ are equal to $\lambda_1^k, \ldots, \lambda_n^k$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of A.
8. If A is nonsingular, then the eigenvalues of $A^{-1}$ are equal to the reciprocals of the eigenvalues of A.
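A minimal numerical sketch of facts 5-8, using an arbitrary example matrix and similarity transform:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))          # assumed nonsingular (true with probability 1)

eig = lambda M: np.sort_complex(np.linalg.eigvals(M))

lam = eig(A)
print(np.allclose(eig(S @ A @ np.linalg.inv(S)), lam))   # fact 5: similarity
print(np.allclose(eig(2.0 * np.eye(3) + A), 2.0 + lam))  # fact 6: shift by cI
print(np.allclose(eig(np.linalg.matrix_power(A, 3)), np.sort_complex(lam ** 3)))  # fact 7
print(np.allclose(eig(np.linalg.inv(A)), np.sort_complex(1.0 / lam)))             # fact 8
```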

The spectral radius $\rho(A)$ of a square matrix A is defined as the maximum of the magnitudes of the eigenvalues of A.
For any induced matrix norm and any square matrix A we have
$$\lim_{k \to \infty} \|A^k\|^{1/k} = \rho(A) \le \|A\|.$$
Let A be a square matrix. We have $\lim_{k \to \infty} \|A^k\| = 0$ if and only if $\rho(A) < 1$.
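A minimal sketch of these limits for one example matrix with $\rho(A) < 1$, using the 2-norm:

```python
import numpy as np

A = np.array([[0.5, 0.4],
              [0.1, 0.3]])                      # example matrix with rho(A) < 1

rho = max(abs(np.linalg.eigvals(A)))
print(rho, np.linalg.norm(A, 2))                # rho(A) <= ||A||

for k in (1, 5, 20, 80):
    Ak = np.linalg.matrix_power(A, k)
    print(k, np.linalg.norm(Ak, 2) ** (1.0 / k))  # tends to rho(A) as k grows
# Since rho(A) < 1, ||A^k|| itself tends to 0.
```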

2 Symmetric and Positive Definite Matrices

Let A be a symmetric n × n matrix. Then:

1. The eigenvalues of A are real.

2. The matrix A has a set of n mutually orthogonal, real, and nonzero eigenvectors $x_1, \ldots, x_n$.

3. Suppose that these eigenvectors have been normalized so that $\|x_i\| = 1$ for each i. Then
$$A = \sum_{i=1}^{n} \lambda_i x_i x_i^T,$$
where $\lambda_i$ is the eigenvalue corresponding to $x_i$.


4. Let $\lambda_1 \le \cdots \le \lambda_n$ be its (real) eigenvalues, and let $x_1, \ldots, x_n$ be associated orthogonal eigenvectors, normalized so that $\|x_i\| = 1$ for all i. Then:
• $\|A\| = \rho(A) = \max\{|\lambda_1|, |\lambda_n|\}$.
• $\lambda_1 \|y\|^2 \le y^T A y \le \lambda_n \|y\|^2$ for all $y \in \mathbb{R}^n$.
• If A is nonsingular, then $\|A^{-1}\| = 1/|\lambda|$, where $\lambda$ is the one among $\lambda_1, \ldots, \lambda_n$ (i.e., the eigenvalues of A) that has the smallest absolute value.
5. $\|A^k\| = \|A\|^k$ for any positive integer k.
6. $\|A\|^2 = \|A^T A\| = \|A A^T\|$.
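A minimal sketch of facts 3 and 4 for one arbitrary symmetric example (numpy.linalg.eigh returns the eigenvalues in ascending order together with orthonormal eigenvectors):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2.0                     # arbitrary symmetric example matrix

lam, X = np.linalg.eigh(A)              # lam ascending, columns of X orthonormal

# Fact 3: A equals the sum of lambda_i * x_i x_i^T.
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(4))
print(np.allclose(A, A_rebuilt))

# Fact 4: ||A||_2 = rho(A) = max(|lambda_1|, |lambda_n|).
print(np.isclose(np.linalg.norm(A, 2), max(abs(lam[0]), abs(lam[-1]))))
```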

(a) For any m × n matrix A, the matrix $A^T A$ is symmetric and positive semidefinite. $A^T A$ is positive definite if and only if A has rank n. In particular, if m = n, then $A^T A$ is positive definite if and only if A is nonsingular.

(b) A square symmetric matrix is positive definite (positive semidefinite) if and only if all of its eigenvalues are positive (nonnegative).

(c) The inverse of a symmetric positive definite matrix is symmetric and positive definite. [Proof: The eigenvalues of $A^{-1}$ are the reciprocals of the eigenvalues of A, so the result follows from (b).]
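A minimal sketch of (a) and (b) with arbitrary example matrices of full and deficient rank:

```python
import numpy as np

rng = np.random.default_rng(3)

A_full = rng.standard_normal((5, 3))                                 # rank 3 with probability 1
A_def = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))    # rank at most 2

for A in (A_full, A_def):
    G = A.T @ A                                  # symmetric, positive semidefinite
    lam = np.linalg.eigvalsh(G)                  # real eigenvalues, ascending
    print(lam.min() >= -1e-12,                   # (b): all eigenvalues nonnegative
          lam.min() > 1e-12)                     # positive definite iff rank(A) = 3
```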

Let A be a square symmetric positive semidefinite matrix.

1. There exists a symmetric matrix Q with the property $Q^2 = A$. Such a matrix is called a symmetric square root of A and is denoted by $A^{1/2}$.
2. A symmetric square root $A^{1/2}$ is invertible if and only if A is invertible. Its inverse is denoted by $A^{-1/2}$.
3. There holds $A^{-1/2} A^{-1/2} = A^{-1}$.
4. There holds $A A^{1/2} = A^{1/2} A$.
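One way to construct a symmetric square root is through the eigendecomposition described later in these notes; a minimal sketch with an arbitrary positive definite example:

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((3, 3))
A = B @ B.T + np.eye(3)                 # arbitrary symmetric positive definite example

lam, Q = np.linalg.eigh(A)              # A = Q diag(lam) Q^T with lam > 0
A_half = Q @ np.diag(np.sqrt(lam)) @ Q.T

print(np.allclose(A_half @ A_half, A))          # a symmetric square root of A
print(np.allclose(A_half @ A, A @ A_half))      # fact 4: A and A^{1/2} commute
print(np.allclose(np.linalg.inv(A_half) @ np.linalg.inv(A_half), np.linalg.inv(A)))  # fact 3
```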

Matrix inversion formula (Sherman-Morrison). If A is nonsingular and U, V are column vectors with $1 + V^T A^{-1} U \ne 0$, then
$$(A + U V^T)^{-1} = A^{-1} - \frac{A^{-1} U V^T A^{-1}}{1 + V^T A^{-1} U}.$$
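A minimal numerical check of this identity with arbitrary example data:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4)) + 4.0 * np.eye(4)   # comfortably nonsingular example
U = rng.standard_normal((4, 1))
V = rng.standard_normal((4, 1))

Ainv = np.linalg.inv(A)
lhs = np.linalg.inv(A + U @ V.T)
rhs = Ainv - (Ainv @ U @ V.T @ Ainv) / (1.0 + V.T @ Ainv @ U)
print(np.allclose(lhs, rhs))
```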

An orthogonal matrix is a square matrix with real entries whose columns (or
rows) are orthogonal unit vectors (i.e., orthonormal). A matrix Q is orthogonal
if its transpose is equal to its inverse:

$$Q^T Q = Q Q^T = I,$$
or alternatively,
$$Q^T = Q^{-1}.$$
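For instance, the Q factor of a QR decomposition is orthogonal; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # Q has orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(4)))
print(np.allclose(Q @ Q.T, np.eye(4)))
print(np.allclose(Q.T, np.linalg.inv(Q)))
```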

Eigendecomposition (Spectral Decomposition).


Let A be an n × n symmetric matrix. Then there is an orthogonal matrix Q and a diagonal matrix Λ such that
$$A = Q \Lambda Q^T,$$
where $\Lambda = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$, $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues of A, and the ith column of Q is an eigenvector corresponding to $\lambda_i$. If A is nonsingular (i.e., none of its eigenvalues are zero), then its inverse is given by
$$A^{-1} = Q \Lambda^{-1} Q^T.$$
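In NumPy this decomposition can be computed with numpy.linalg.eigh; a minimal sketch with an arbitrary symmetric example:

```python
import numpy as np

rng = np.random.default_rng(7)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2.0                         # arbitrary symmetric example

lam, Q = np.linalg.eigh(A)                  # eigenvalues and orthonormal eigenvectors
Lam = np.diag(lam)

print(np.allclose(A, Q @ Lam @ Q.T))                    # A = Q Lambda Q^T
if np.all(np.abs(lam) > 1e-12):                         # nonsingular case
    print(np.allclose(np.linalg.inv(A), Q @ np.diag(1.0 / lam) @ Q.T))
```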

Useful Facts regarding Eigenvalues and Eigenvectors:

1. The product of the eigenvalues is equal to the determinant of A.

2. The sum of the eigenvalues is equal to the trace of A.


3. If the eigenvalues of A are $\lambda_j$, and A is invertible, then the eigenvalues of $A^{-1}$ are simply $\lambda_j^{-1}$.

4. If A is (real) symmetric, then the eigenvectors are real, mutually orthogonal, and provide a basis for $\mathbb{R}^n$.
5. The eigenvectors of $A^{-1}$ are the same as the eigenvectors of A.
6. The two statements
• A can be eigendecomposed.
• A is invertible.
do not imply each other.
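For instance, the zero matrix is symmetric, hence eigendecomposable, but it is singular; conversely, the matrix $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ is invertible but has only one linearly independent eigenvector, so it cannot be eigendecomposed.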
