
MACM 409 - Lecture 01 2017-09-06

Friday: Linear Algebra Quiz

What this class is about:


Algorithms for performing Linear Algebra Calculations such as

Multiplying Matrices
Solving Linear Systems
o Methods such as Gaussian Elimination
Matrix Factorizations
Eigenvalues
Checking if matrices are similar
Computing Inverses or pseudo-inverses
Compression
Least Squares Problems
Determinants
Computing orthogonal bases
Checking for consistency
Computing the rank
(error analysis)
(sparse matrices)
(stability)

Why is it important to study algorithms?


Answer 1:
Many problems are modelled by Differential Equations
Physical Problems require Differential Equations
Most PDEs require numerical methods

Numerical Analysis: convert a continuous problem (e.g. a PDE) to a discrete
problem (a linear system)

Example: resonant frequencies ↔ eigenvalues of matrices

This course is focused on solving the discrete problems.

Answer 2: Data science


Represent data in a matrix (that can be very large)
Compression
Extracting important information
o E.g. Google PageRank: computing the leading eigenvector of a matrix (see the power-iteration sketch after this list)
Completing missing data.
o Netflix Problem
Dealing with distributed data
Goals of a great algorithm
Accuracy: the error should decrease with iterations
Efficiency
Robustness
o Stability and Conditioning
Simplicity
Parallelizability (Multiple Cores)

MACM 409 Lecture 02 2017-09-06


Quiz
MACM 409 Lecture 03 2017-09-11

1. Linear Algebra Review

1.1 Vectors & Matrices

Work over ℂ, not ℝ.

x = (x_1, …, x_n)^T ∈ ℂ^n, an n-dim column vector

A = (a_{ij}) ∈ ℂ^{m×n}, the m×n matrix with entry a_{ij} in row i, column j

Transpose: A^T ∈ ℂ^{n×m}, with (A^T)_{ij} = a_{ji}

Adjoint/Hermitian (conjugate transpose): A* ∈ ℂ^{n×m}, with (A*)_{ij} = ā_{ji}
For a vector, x* = [x̄_1 ⋯ x̄_n] is a row vector

Vector Multiplication Notation:

Write A = [a_1 | ⋯ | a_n] (columns of A), B = [b_1 | ⋯ | b_p], C = AB = [c_1 | ⋯ | c_p]

Then c_j = A b_j = b_{1j} a_1 + ⋯ + b_{nj} a_n, j = 1, …, p
(each column of AB is a linear combination of the columns of A; in particular,
Ax = x_1 a_1 + ⋯ + x_n a_n)
Also: A[b_1 | ⋯ | b_p] = [Ab_1 | ⋯ | Ab_p] = [c_1 | ⋯ | c_p]
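
A quick NumPy check of this column viewpoint; the matrix and vector below are arbitrary illustrative examples.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])   # 3x2 matrix with columns a_1, a_2
x = np.array([10.0, -1.0])

# Matrix-vector product vs. explicit linear combination of the columns of A.
b_product = A @ x
b_columns = x[0] * A[:, 0] + x[1] * A[:, 1]

print(np.allclose(b_product, b_columns))  # True
```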
1.2 Fundamental Subspaces

A ∈ ℂ^{m×n}, with columns a_1, …, a_n

Range: R(A) = {Ax : x ∈ ℂ^n} = span{a_1, …, a_n}

Null space: N(A) = {x ∈ ℂ^n : Ax = 0}

N(A) = R(A*)^⊥, its orthogonal complement (see later)

Rank:
Column Rank = dim(R(A)) = # linearly independent columns
Row Rank = # linearly independent rows = dim(R(A^T))

Row rank = Column Rank = rank(A)

In particular, rank(A) ≤ min{m, n}

Nullity: null(A) = dim(N(A))

Rank-nullity formula: rank(A) + null(A) = n (the # of columns)

A is full rank if rank(A) = min{m, n}

Typically, in this course, m ≥ n.

In that case, A is full rank ⟺ rank(A) = n = # of columns
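
A small sketch checking the rank and the rank-nullity formula numerically, assuming NumPy and SciPy are available; the example matrix is arbitrary, chosen so that its third column is the sum of the first two.

```python
import numpy as np
from scipy.linalg import null_space

# 4x3 example: third column = first column + second column,
# so rank(A) = 2 and null(A) = 1.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [2.0, 1.0, 3.0]])

r = np.linalg.matrix_rank(A)   # numerical rank
N = null_space(A)              # orthonormal basis for N(A), one column per dimension

print(r, N.shape[1])                 # 2 1
print(r + N.shape[1] == A.shape[1])  # rank-nullity: True
```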
1.3 Inverse

A square matrix A is invertible/nonsingular if it is full rank, i.e. m = n = rank(A)

Its inverse is the matrix A⁻¹ ∈ ℂ^{m×m} that satisfies A A⁻¹ = A⁻¹ A = I, where
I ∈ ℂ^{m×m} is the identity matrix

Theorem: The following are equivalent:


A is invertible
rank(A) = m
R(A) = ℂ^m
N(A) = {0}
null(A) = 0
0 is not an eigenvalue of A
det(A) ≠ 0

Note: λ_1 λ_2 ⋯ λ_m = det(A) (the determinant is the product of the eigenvalues)

Linear systems:

Ax = b, with A ∈ ℂ^{m×m} and b ∈ ℂ^m

If A is invertible, then x = A⁻¹b is the unique solution
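
A minimal NumPy sketch; in code one would normally call a linear solver rather than form A⁻¹ explicitly. The matrix and right-hand side are arbitrary examples.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # invertible 2x2 example
b = np.array([1.0, 2.0])

# Preferred in practice: solve A x = b directly (LU factorization internally).
x = np.linalg.solve(A, b)

# Same answer as the formula x = A^{-1} b, but cheaper and more stable.
x_via_inverse = np.linalg.inv(A) @ b

print(np.allclose(x, x_via_inverse))  # True
print(np.allclose(A @ x, b))          # True
```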

Example: An m×m matrix A is upper triangular if a_{ij} = 0 for i > j.

Similarly, A is lower triangular if a_{ij} = 0 for i < j
The eigenvalues of an upper triangular A are its diagonal entries a_{11}, a_{22}, …, a_{mm}

det(A) = ∏_{j=1}^m a_{jj}
A is invertible if and only if all its diagonal entries are nonzero

The inverse (when it exists) is upper triangular

Solving Ax = b with A upper triangular.

Use back substitution:

x_m = b_m / a_{mm}, then x_j = (b_j − Σ_{k=j+1}^m a_{jk} x_k) / a_{jj} for j = m−1, …, 1
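
A minimal Python implementation of back substitution, assuming A is square upper triangular with nonzero diagonal; the function name and test data are illustrative only.

```python
import numpy as np

def back_substitution(A, b):
    """Solve A x = b for upper-triangular A with nonzero diagonal entries."""
    m = A.shape[0]
    x = np.zeros(m)
    for j in range(m - 1, -1, -1):   # j = m-1, ..., 0 (0-based indexing)
        x[j] = (b[j] - A[j, j + 1:] @ x[j + 1:]) / A[j, j]
    return x

A = np.array([[2.0, 1.0, -1.0],
              [0.0, 3.0,  2.0],
              [0.0, 0.0,  4.0]])
b = np.array([1.0, 8.0, 8.0])
x = back_substitution(A, b)
print(np.allclose(A @ x, b))  # True
```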

MACM 409 Lecture 04 2017-09-13

Problem Set 1 due Monday


Exercise 2.5: It should be by using 2.3

1.4 Orthogonality

Definition: A is self-adjoint/Hermitian if A* = A (must be square)

Inner Product: If x, y ∈ ℂ^m, then x*y = Σ_{i=1}^m x̄_i y_i (vector multiplication: row times column)

This is a sesquilinear form:

(x_1 + x_2)*y = x_1*y + x_2*y
x*(y_1 + y_2) = x*y_1 + x*y_2

(αx)*(βy) = ᾱβ (x*y)
for all x, x_1, x_2, y, y_1, y_2 ∈ ℂ^m and α, β ∈ ℂ

Euclidean Length: ||x|| = √(x*x)

Angle between two vectors:

cos(α) = x*y / (||x|| ||y||), α ∈ [0, π]

Orthogonality: x, y are orthogonal if x*y = 0

Equivalently, cos(α) = 0
In this case, write x ⊥ y

A set S of nonzero vectors is orthogonal if x*y = 0 for all x, y ∈ S with x ≠ y.

It is orthonormal if, in addition, ||x|| = 1 for all x ∈ S.
Orthogonal decompositions:
Orthogonal vectors are linearly independent
Hence m orthogonal vectors q_1, …, q_m ∈ ℂ^m form a basis of ℂ^m (an orthogonal
basis)
Any x ∈ ℂ^m can be written as x = c_1 q_1 + ⋯ + c_m q_m for scalars c_1, …, c_m

Moreover, c_j = q_j* x / ||q_j||^2, j = 1, …, m

Hence we get the orthogonal decomposition of x:

x = Σ_{j=1}^m (q_j* x / ||q_j||^2) q_j = Σ_{j=1}^m (q_j q_j* / ||q_j||^2) x

In the first sum each term is ((row)(column))(column), a scalar times a column; in the
second it is ((column)(row))(column), a matrix times a column.

Note: each q_j q_j* is an m×m matrix
Note: if x, y ≠ 0 then x y* has rank 1
(R(x y*) = span{x})
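
A NumPy sketch of this orthogonal decomposition, using an arbitrary orthonormal basis built via a QR factorization (so each ||q_j|| = 1 and the formula reduces to c_j = q_j* x).

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

# Columns of Q form an orthonormal basis q_1, ..., q_m (QR of a random matrix).
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))

x = rng.standard_normal(m)

# Coefficients c_j = q_j^* x (unit vectors, so no division by ||q_j||^2 needed).
c = Q.conj().T @ x

# Reassemble x as the sum of its components along each q_j.
x_rebuilt = sum(c[j] * Q[:, j] for j in range(m))

print(np.allclose(x, x_rebuilt))  # True
```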
Unitary Matrices

Q ∈ ℂ^{m×m} is unitary if Q*Q = I

Notes: - Invertible: Q⁻¹ = Q*
- Inverse does not need to be computed
- Columns of Q = [q_1 | ⋯ | q_m] form an orthonormal basis

Unitary matrices preserve inner products and Euclidean length and therefore
also angles:
(Qx)*(Qy) = x*Q*Qy = x*y
Setting x = y gives ||Qx|| = ||x||  (energy conservation)

Examples: rotations, reflections are unitary matrices
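
A quick numerical check of these preservation properties for a 2×2 rotation; the angle and the test vectors are arbitrary.

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation => unitary (real orthogonal)

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])

# (Qx)^*(Qy) = x^*y and ||Qx|| = ||x||.
print(np.isclose((Q @ x) @ (Q @ y), x @ y))                   # True
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # True
```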

1.5 Norms

Definition: A norm is a function ||·|| : ℂ^m → ℝ, x ↦ ||x||, that satisfies

||x|| ≥ 0, and ||x|| = 0 ⟺ x = 0
||x + y|| ≤ ||x|| + ||y|| (triangle inequality)
||αx|| = |α| ||x||
for all x, y ∈ ℂ^m and α ∈ ℂ

The Euclidean length ||x|| = √(x*x) is a norm.

However, there are many other norms


Example: p-norms:
||x||_1 = Σ_{i=1}^m |x_i|, for x = (x_i)
||x||_2 = (Σ_{i=1}^m |x_i|^2)^{1/2} = √(x*x)
In general, for 1 ≤ p < ∞:
||x||_p = (Σ_{i=1}^m |x_i|^p)^{1/p}
If p = ∞: ||x||_∞ = max_{i=1,…,m} |x_i|
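
These p-norms can be evaluated directly with NumPy's norm routine; a short sketch on an arbitrary example vector.

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

print(np.linalg.norm(x, 1))       # 8.0, the sum of absolute values
print(np.linalg.norm(x, 2))       # sqrt(26) ~ 5.10, the Euclidean length
print(np.linalg.norm(x, 3))       # general p-norm with p = 3
print(np.linalg.norm(x, np.inf))  # 4.0, the largest absolute entry
```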
Example: ||x||_∞ is a norm
Certainly, ||x||_∞ ≥ 0 for all x
o max_{i=1,…,m} |x_i| = 0 ⟺ x_i = 0 for all i ⟺ x = 0
max_{i=1,…,m} |x_i + y_i| ≤ max_{i=1,…,m} (|x_i| + |y_i|) ≤ ||x||_∞ + ||y||_∞
||αx||_∞ = max_{i=1,…,m} |α x_i| = |α| max_{i=1,…,m} |x_i| = |α| ||x||_∞

Example: ||x||_p is not a norm for 0 < p < 1

Take m = 2, x = (1, 0)^T, y = (0, 1)^T
||x + y||_p = ||(1, 1)^T||_p = (1^p + 1^p)^{1/p} = 2^{1/p}
||x||_p + ||y||_p = 1 + 1 = 2
But 2^{1/p} > 2 when 0 < p < 1, so the triangle inequality does not hold.

Cauchy-Schwarz Inequality: |x*y| ≤ ||x|| ||y|| for all x, y ∈ ℂ^m

Proof:
If y = 0 the result is true.
Assume that y ≠ 0.

Set z = x − (y*x / ||y||^2) y

Then y*z = y*x − (y*x / ||y||^2) y*y = 0

Rearranging gives x = z + (y*x / ||y||^2) y

Therefore ||x||^2 = x*x = (z + (y*x / ||y||^2) y)* (z + (y*x / ||y||^2) y)

= z*z + (y*x / ||y||^2) z*y + (conj(y*x) / ||y||^2) y*z + (|y*x|^2 / ||y||^4) y*y

The 2nd and 3rd terms are 0, since z ⊥ y, and y*y = ||y||^2, so

= ||z||^2 + |y*x|^2 / ||y||^2 ≥ |y*x|^2 / ||y||^2

Rearranging gives ||x||^2 ||y||^2 ≥ |x*y|^2, which implies the result.
Hölder's Inequality:
|x*y| ≤ ||x||_p ||y||_q for all x, y ∈ ℂ^m, for p, q satisfying 1 ≤ p, q ≤ ∞ and 1/p + 1/q = 1

Example: If p = 2 then 1/p + 1/q = 1/2 + 1/q = 1 ⟹ q = 2, so |x*y| ≤ ||x||_2 ||y||_2,
which is the Cauchy-Schwarz Inequality

Example: If p = ∞, then 1/q = 1 − 1/p = 1 − 0 = 1, so q = 1
So we get |x*y| ≤ ||x||_∞ ||y||_1

Norm Equivalence:
Any two norms ||·||_(a) and ||·||_(b) on ℂ^m are equivalent. That is, there are
constants C_1(a,b) and C_2(a,b) > 0 such that
C_1(a,b) ||x||_(a) ≤ ||x||_(b) ≤ C_2(a,b) ||x||_(a) for all x ∈ ℂ^m

See Problem Set 2 for examples

Note: the constants usually depend on m (which can be a bad thing)
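
A small numerical illustration of the m-dependence, using the standard bounds ||x||_2 ≤ ||x||_1 ≤ √m ||x||_2 (these particular constants are standard facts, not taken from the notes, which defer examples to Problem Set 2).

```python
import numpy as np

for m in (10, 100, 1000):
    x = np.ones(m)   # worst case for the gap between the 1-norm and the 2-norm
    ratio = np.linalg.norm(x, 1) / np.linalg.norm(x, 2)
    print(m, ratio, np.sqrt(m))   # ratio equals sqrt(m): the constant grows with m
```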
1.6 Matrix Norms

Induced matrix norms:


Let A ∈ ℂ^{m×n} and let ||·||_(m) and ||·||_(n) be norms on ℂ^m and ℂ^n respectively. Then we
define: ||A||_(m,n) = sup_{x ∈ ℂ^n, x ≠ 0} ||Ax||_(m) / ||x||_(n) = max_{x ∈ ℂ^n, ||x||_(n) = 1} ||Ax||_(m)

Notation: If ||·||_(m) = ||·||_(n) = ||·||_p is the p-norm, then we write ||A||_p
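
NumPy computes the induced p-norms for p = 1, 2, ∞ when given a matrix argument; a sketch on an arbitrary 2×2 example.

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

print(np.linalg.norm(A, 1))       # induced 1-norm: max column sum of |a_ij| = 6
print(np.linalg.norm(A, np.inf))  # induced inf-norm: max row sum of |a_ij| = 7
print(np.linalg.norm(A, 2))       # induced 2-norm: largest singular value of A
```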

Lemma: (Matrix Product Bound)


Let ||·||_(l), ||·||_(m), ||·||_(n) be norms on ℂ^l, ℂ^m, ℂ^n respectively. Let A and B be
l×m and m×n respectively. Then,
||AB||_(l,n) ≤ ||A||_(l,m) ||B||_(m,n)
In particular, ||AB||_p ≤ ||A||_p ||B||_p

Proof: see book.

The Frobenius norm: turn the matrix into one long vector and take its Euclidean
length:
||A||_F = (Σ_{i=1}^m Σ_{j=1}^n |a_{ij}|^2)^{1/2}, A = (a_{ij})

This is the Euclidean length of the vectorized version of A.

Identities:
||A||_F^2 = Σ_{j=1}^n ||a_j||^2, where A = [a_1 | ⋯ | a_n]
||A||_F^2 = tr(A*A) = tr(AA*)
Matrix Product Bound: ||AB||_F ≤ ||A||_F ||B||_F

Invariance under unitary matrices:


Let Q ∈ ℂ^{m×m} be unitary and A ∈ ℂ^{m×n}. Then
||QA||_2 = ||A||_2 and ||QA||_F = ||A||_F
(not true in general for other norms)
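
A quick NumPy check of the trace identity and the unitary invariance on arbitrary random examples; Q is built via a QR factorization so that it is unitary (here real orthogonal).

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # unitary (real orthogonal) Q

fro = np.linalg.norm(A, 'fro')

print(np.isclose(fro**2, np.trace(A.T @ A)))                       # ||A||_F^2 = tr(A^*A)
print(np.isclose(np.linalg.norm(Q @ A, 'fro'), fro))               # ||QA||_F = ||A||_F
print(np.isclose(np.linalg.norm(Q @ A, 2), np.linalg.norm(A, 2)))  # ||QA||_2 = ||A||_2
```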
