Basics
Column vector x ∈ R^n, row vector x^T, matrix A ∈ R^{m×n}. Matrix multiplication: (m×n)(n×k) → (m×k); in general AB ≠ BA. Transpose A^T, with (AB)^T = B^T A^T; symmetric means A = A^T. Inverse A^{-1} does not always exist; when it does, (AB)^{-1} = B^{-1} A^{-1}.
x^T x is a scalar; x x^T is a matrix. Ax = b, three ways of expressing; entrywise:
∑_{j=1}^n a_{ij} x_j = b_i, for each i.
System of equations: non-singular (unique solution) vs. singular (no solution or infinitely many solutions).
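The transpose and non-commutativity rules above can be checked directly. A minimal pure-Python sketch (the helper names `matmul` and `transpose` are mine, not from the slides):

```python
def matmul(A, B):
    """Multiply an (m x n) matrix by an (n x k) matrix; result is m x k."""
    return [[sum(A[i][p] * B[p][j] for p in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]

# In general AB != BA:
print(matmul(A, B) == matmul(B, A))  # False

# But (AB)^T = B^T A^T always holds:
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True
```

The dimension rule is visible in `matmul`: the inner index `p` runs over the shared dimension n, so an (m×n) times (n×k) product yields an (m×k) result.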
LU factorization (example)

A = [ 2  1  1 ; 4 -6  0 ; -2  7  2 ]  →  U = [ 2  1  1 ; 0 -8 -2 ; 0  0  1 ]

For b = (5, -2, 9)^T, elimination reduces Ax = b to Ux = GFEb = (5, -12, 2)^T, using the elimination matrices

E = [ 1 0 0 ; -2 1 0 ; 0 0 1 ],  F = [ 1 0 0 ; 0 1 0 ; 1 0 1 ],  G = [ 1 0 0 ; 0 1 0 ; 0 1 1 ]

so that U = GFEA and L = E^{-1} F^{-1} G^{-1}.
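The elimination example above can be checked numerically. A pure-Python sketch (helper name `matmul` is mine; the matrices are the ones from the example):

```python
def matmul(A, B):
    """Naive (m x n)(n x k) matrix product."""
    return [[sum(A[i][p] * B[p][j] for p in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[2, 1, 1], [4, -6, 0], [-2, 7, 2]]
E = [[1, 0, 0], [-2, 1, 0], [0, 0, 1]]   # row2 -= 2*row1
F = [[1, 0, 0], [0, 1, 0], [1, 0, 1]]    # row3 += row1
G = [[1, 0, 0], [0, 1, 0], [0, 1, 1]]    # row3 += row2

U = matmul(G, matmul(F, matmul(E, A)))
print(U)  # [[2, 1, 1], [0, -8, -2], [0, 0, 1]]

# L = E^-1 F^-1 G^-1: each inverse just undoes one row operation.
Einv = [[1, 0, 0], [2, 1, 0], [0, 0, 1]]
Finv = [[1, 0, 0], [0, 1, 0], [-1, 0, 1]]
Ginv = [[1, 0, 0], [0, 1, 0], [0, -1, 1]]
L = matmul(Einv, matmul(Finv, Ginv))

print(matmul(L, U) == A)  # True
```

Note how each inverse elimination matrix is obtained by flipping the sign of the single off-diagonal entry, and how L collects exactly the multipliers used during elimination.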
LU factorization
(First non-singular case) If no row exchanges are required, then A = LU (unique). Solve Lc = b, then Ux = c. Another form: A = LDU. (Second non-singular case) There exists a permutation matrix P that reorders the rows so that PA = LU. (Singular case) No such P exists. (Cholesky decomposition) If A is symmetric and A = LU can be found without any row exchanges, then A = LL^T (also called the square root of a matrix). (proof) Positive definite matrices always have a Cholesky decomposition.
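The Cholesky factorization described above can be computed column by column. A pure-Python sketch, assuming A is symmetric positive definite (so no row exchanges are needed and the square roots exist); the function name is mine:

```python
import math

def cholesky(A):
    """Return lower-triangular L with A = L L^T.
    Assumes A is symmetric positive definite; no pivoting is done."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal: pivot square root
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below diagonal
    return L

A = [[4, 2], [2, 3]]
L = cholesky(A)
print(L)  # [[2.0, 0.0], [1.0, 1.4142135623730951]]
```

Positive definiteness is exactly what guarantees that the argument of `math.sqrt` stays positive at every step.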
A Review of Linear Algebra p.4/13
Orthogonality
(Norm) ||x||^2 = x^T x = x_1^2 + ... + x_n^2. (Inner product) x^T y = x_1 y_1 + ... + x_n y_n. (Orthogonal) x^T y = 0. Orthogonal (nonzero) vectors are linearly independent. (proof) (Orthonormal basis) Orthogonal vectors with norm 1. (Orthogonal subspaces) V ⟂ W if v ⟂ w for all v ∈ V, w ∈ W. (Orthogonal complement) The space of all vectors orthogonal to V, denoted V^⟂. The row space is orthogonal to the nullspace (in R^n) and the column space is orthogonal to the left nullspace (in R^m). (proof)
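The row space / nullspace orthogonality can be seen on a tiny example. A pure-Python sketch (the matrix and vector are hypothetical illustrations, not from the slides):

```python
def dot(x, y):
    """Inner product x^T y."""
    return sum(a * b for a, b in zip(x, y))

A = [[1, 2, 3],
     [2, 4, 6]]       # rank 1: the second row is twice the first
x = [3, 0, -1]        # A x = 0, so x lies in the nullspace of A

# x is orthogonal to every row, hence to the whole row space:
print([dot(row, x) for row in A])  # [0, 0]
```

This is just the definition unwound: each equation of Ax = 0 says one row of A has zero inner product with x, so the nullspace sits inside the orthogonal complement of the row space.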
Finally...
Fundamental Theorem of Linear Algebra II
1. R(A^T) = N(A)^⟂   (1)
2. R(A) = N(A^T)^⟂   (2)
Any vector can be expressed as
x = x_1 b_1 + ... + x_r b_r + x_{r+1} b_{r+1} + ... + x_n b_n = x_r + x_n
with x_r in the row space and x_n in the nullspace.
Every matrix transforms its row space to its column space (Comments about pseudo-inverse and invertibility)
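The splitting x = x_r + x_n can be computed explicitly for a rank-1 example. A pure-Python sketch using exact arithmetic (the matrix and vector are hypothetical; `dot` is my helper name):

```python
from fractions import Fraction

def dot(x, y):
    """Inner product x^T y."""
    return sum(a * b for a, b in zip(x, y))

A = [[1, 2]]                      # row space of A spanned by r = (1, 2)
r = A[0]
x = [Fraction(3), Fraction(4)]

# Project x onto the row space: x_r = (r^T x / r^T r) r
c = dot(r, x) / dot(r, r)         # 11/5
x_r = [c * ri for ri in r]        # (11/5, 22/5)
x_n = [xi - xri for xi, xri in zip(x, x_r)]   # (4/5, -2/5)

print(dot(r, x_n))                # 0: x_n lies in the nullspace
print(dot(r, x) == dot(r, x_r))   # True: A sends x and x_r to the same b
```

The last line is the point of the comment above: A ignores the nullspace component, so only the row-space part x_r determines Ax, which is what the pseudo-inverse inverts.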
Gram-Schmidt Orthogonalization
(Projection) of b on a is (a^T b / a^T a) a.
(Schwarz inequality) |a^T b| ≤ ||a|| ||b||.
(Orthogonal matrix) Q = [q_1 ... q_n], with Q^T Q = I. (proof)
(Length preservation) ||Qx|| = ||x||. (proof)
Given vectors {a_k}, construct orthonormal vectors {q_k}:
1. q_1 = a_1 / ||a_1||
2. for each j, a_j' = a_j − (q_1^T a_j) q_1 − ... − (q_{j-1}^T a_j) q_{j-1}
3. q_j = a_j' / ||a_j'||
QR decomposition (example)
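The three Gram-Schmidt steps translate directly into a loop. A pure-Python sketch, assuming the input vectors are linearly independent (the function name is mine):

```python
import math

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return orthonormal q's with the same span.
    Assumes the input vectors are linearly independent."""
    qs = []
    for a in vectors:
        v = list(a)
        # Step 2: subtract the projections onto all previous q's.
        for q in qs:
            coeff = sum(qi * ai for qi, ai in zip(q, a))  # q^T a
            v = [vi - coeff * qi for vi, qi in zip(v, q)]
        # Step 3 (and step 1 for j = 1): normalize.
        norm = math.sqrt(sum(vi * vi for vi in v))
        qs.append([vi / norm for vi in v])
    return qs

qs = gram_schmidt([[1, 1, 0], [1, 0, 1]])
print(sum(a * b for a, b in zip(qs[0], qs[1])))  # ~0: q1 and q2 orthogonal
```

Collecting the coefficients q_i^T a_j into an upper-triangular matrix R, with the q's as the columns of Q, gives exactly the QR decomposition A = QR mentioned above.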
(Diagonalization: A = SΛS^{-1}) Suppose there exist n linearly independent eigenvectors of A. If S is the matrix whose columns are those eigenvectors, then A = SΛS^{-1}, where Λ = diag(λ_1, ..., λ_n). Diagonalizability is concerned with eigenvectors; invertibility is concerned with eigenvalues. (Real symmetric matrix) Eigenvectors are orthogonal, so A = QΛQ^T. (Spectral theorem)
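The factorization A = SΛS^{-1} can be verified on a small symmetric example with eigenpairs found by hand (the matrix is a hypothetical illustration; `matmul` is my helper name):

```python
from fractions import Fraction as F

def matmul(A, B):
    """Naive matrix product, exact with Fraction entries."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[2, 1], [1, 2]]      # eigenvalues 3 and 1
S = [[1, 1], [1, -1]]     # columns: eigenvectors (1, 1) and (1, -1)
Lam = [[3, 0], [0, 1]]
Sinv = [[F(1, 2), F(1, 2)], [F(1, 2), F(-1, 2)]]   # S^{-1}, exact

print(matmul(S, matmul(Lam, Sinv)) == A)  # True
```

Since A here is real symmetric, the eigenvectors are orthogonal; normalizing the columns of S by 1/√2 turns it into an orthogonal Q with A = QΛQ^T, as the spectral theorem states.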
Finish
Thanks to Maria (Marisol Flores Gorrido) for helping me with this tutorial.