
2.2 Gaussian Elimination with Scaled Partial Pivoting

Observation
• Not only pivot elements of size 0 cause a problem, but also pivot elements of small size є.

• Example:

For small є, the solution is x1 ≈ x2 ≈ 1.

Gaussian elimination provides the solution

which for small є leads to x2 ≈ 1 and x1 ≈ 0.
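As a hedged illustration (the slide's actual system is not reproduced here), assume the classic example є·x1 + x2 = 1, x1 + x2 = 2, whose exact solution has x1 ≈ x2 ≈ 1 for small є. A short Python sketch of naive elimination with the tiny pivot shows how x1 is lost:

# Hypothetical system (assumption, not from the slide): eps*x1 + x2 = 1, x1 + x2 = 2
eps = 1e-20

# Forward elimination with the tiny pivot eps: row2 <- row2 - (1/eps)*row1
m = 1.0 / eps          # huge Gaussian multiplier
a22 = 1.0 - m * 1.0    # the original coefficient 1 is lost to rounding
b2 = 2.0 - m * 1.0     # the original right-hand side 2 is lost as well

# Back substitution
x2 = b2 / a22          # rounds to exactly 1.0
x1 = (1.0 - x2) / eps  # yields 0.0, although the true x1 is close to 1
print(x1, x2)          # -> 0.0 1.0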


Pivoting

• Observation: The pivot element of the last row is never used during forward elimination.


• Idea: Switch order of rows.

• Example revisited:
Reordering:

Solution:

This is correct, even for small є (and even for є = 0).
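Continuing the same hypothetical two-equation sketch, swapping the rows first makes 1 the pivot, and the computed solution is accurate:

# Same hypothetical system with the rows swapped: x1 + x2 = 2, eps*x1 + x2 = 1
eps = 1e-20

# Forward elimination with pivot 1: row2 <- row2 - eps*row1
m = eps / 1.0
a22 = 1.0 - m * 1.0    # stays essentially 1
b2 = 1.0 - m * 2.0     # stays essentially 1

# Back substitution
x2 = b2 / a22          # ~ 1.0
x1 = (2.0 - x2) / 1.0  # ~ 1.0, correct even for eps = 0
print(x1, x2)          # -> 1.0 1.0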

Vandermonde revisited

• There was no small value є in the equation system.


• Why was it ill-conditioned?
• Ratio in the first row of the matrix was

• It is not the absolute size that matters but the relative size!

Scaled partial pivoting

• Process the rows in an order such that, in each elimination step, the relative size of the pivot element is largest.
• The relative pivot element size is given by the ratio of the pivot element to the largest entry (in absolute value) in (the left-hand side of) that row.

Algorithm
1. Initialize a permutation vector l with its natural order,
i.e., l = (1,2,…,n).
2. Compute the maximum vector s with s_i = max_j |a_ij|, i.e., the largest absolute value in row i of the left-hand side.
3. // Forward elimination
for k = 1, … , n-1 // for all (permuted) pivot rows
a) for i = k, … , n // for all rows below (permuted) pivot
Compute relative pivot elements |a_{l_i,k}| / s_{l_i}.
b) Find row j with largest relative pivot element.
c) Switch lj and lk in permutation vector.
d) Execute forward elimination step with row lk (former lj)
4. Execute back substitution using inverse order of l.
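A compact Python sketch of steps 1–4 (variable names and layout are illustrative, not taken from the course code); rows are never physically swapped, only the permutation vector l is reordered:

def solve_scaled_partial_pivoting(A, b):
    """Gaussian elimination with scaled partial pivoting (sketch, no singularity checks).
    A is a list of row lists, b a list; both are modified in place."""
    n = len(A)
    l = list(range(n))                                # 1. permutation vector, natural order
    s = [max(abs(x) for x in row) for row in A]       # 2. s_i = max_j |a_ij|

    # 3. forward elimination
    for k in range(n - 1):
        # a)+b) row with the largest relative pivot element |a_{l_i,k}| / s_{l_i}
        j = max(range(k, n), key=lambda i: abs(A[l[i]][k]) / s[l[i]])
        l[k], l[j] = l[j], l[k]                       # c) switch l_j and l_k
        for i in range(k + 1, n):                     # d) eliminate below the pivot row
            m = A[l[i]][k] / A[l[k]][k]               # Gaussian multiplier
            for col in range(k, n):
                A[l[i]][col] -= m * A[l[k]][col]
            b[l[i]] -= m * b[l[k]]

    # 4. back substitution in the inverse order of l
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        r = l[k]
        x[k] = (b[r] - sum(A[r][col] * x[col] for col in range(k + 1, n))) / A[r][k]
    return x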
Example
system:

initialization:
permutation vector l = (1,2,3,4)
maximum vector s = (13,18,6,12)
1st iteration: l1 = 1
relative pivot elements:
l = (3,2,1,4)
Execute first forward elimination step for l1 = 3.
Example
we obtain

2nd iteration: l2 = 2
relative pivot elements:
l = (3,1,2,4)
Execute the second forward elimination step for l2 = 1.

Example

with l = (3,1,4,2)

Execute back substitution in the reverse order, i.e., for rows 2, 4, 1, and 3.

Remark

• Gaussian elimination with scaled partial pivoting always works if a unique solution exists.

• A square linear equation system has a unique solution if the left-hand side is a non-singular matrix.
• A non-singular matrix is also referred to as regular.
• A non-singular matrix has an inverse matrix.
• A non-singular matrix has full rank.

Checking non-singularity

• A square matrix is non-singular iff its determinant is non-zero.

• The Gaussian elimination algorithm (with or without scaled partial pivoting) will fail for a singular matrix (division by zero).
• We never obtain a wrong solution, so checking non-singularity by computing the determinant is not required.
• Non-singularity is implicitly verified by a successful
execution of the algorithm.

Time complexity

1. Initialize a permutation vector l with its natural order, i.e., l = (1,2,…,n).
time complexity: O(n)
2. Compute the maximum vector s.
time complexity: O(n²)

Time complexity
3. // Forward elimination
for k = 1, … , n-1 // for all (permuted) pivot rows
a) for i = k, … , n // for all rows below (permuted) pivot
Compute relative pivot elements.
time complexity: n-k+1 divisions
b) Find row j with largest relative pivot element.
time complexity: included in a)
c) Switch lj and lk in permutation vector.
time complexity: O(1)
so far: time complexity O(n²)
d) Execute forward elimination step with row lk.

Time complexity
d) Execute forward elimination step with row lk.
Computation of left-hand side:
k=1: As generated zero entries are not computed, we
have n-1 multiplications and subtractions per row.
Including the computation of the multiplier, we
have n multiplications per row.
There are n-1 rows.
Hence, we have n (n-1) operations.
k=2: Analogously, we obtain (n-1) (n-2) operations.
for k=1,…,n-1: In total, we get n(n-1) + (n-1)(n-2) + … + 2·1 operations.

As n(n-1) + (n-1)(n-2) + … + 2·1 = (n³-n)/3, we obtain approximately n³/3 operations.

Time complexity
Computation of right-hand side:
We have (n-1) + (n-2) + … + 1 = n(n-1)/2 operations.
4. Execute back substitution using inverse order of l.
We have 1 + 2 + … + n = n(n+1)/2 operations.

Conclusion:
Overall, the algorithm has time complexity f(n) = Θ(n³),
i.e., there exist constants c and C such that
c·n³ ≤ f(n) ≤ C·n³ for all sufficiently large n.
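As a quick sanity check of the leading term (our own illustration, not part of the slides), one can count the long operations (multiplications and divisions) exactly as derived above and compare them with n³/3:

def count_long_ops(n):
    """Multiplications/divisions of forward elimination (left- and right-hand side)
    plus back substitution for an n x n system, following the count on the slides."""
    ops = 0
    for k in range(1, n):                    # pivot rows k = 1, ..., n-1
        rows_below = n - k
        per_row = 1 + (n - k) + 1            # multiplier + row update + right-hand side
        ops += rows_below * per_row
    ops += sum(range(1, n + 1))              # back substitution: 1 + 2 + ... + n
    return ops

for n in (10, 100, 1000):
    print(n, count_long_ops(n), round(n**3 / 3))   # the counts approach n^3/3 for large n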

Remark

• The derived time complexity is not a worst-case scenario; it is the cost incurred in every execution.
• Can we do better?
• For special cases: yes!

2.3 Banded Systems

Definition

• A system is called banded if a_ij = 0 for all |i-j| ≥ k for some k < n.

Example

• For k=2, the banded system is called tridiagonal.
It is of the form

The Gaussian elimination algorithm then becomes simple.

Gaussian elimination for tridiagonal system
1. // Forward elimination:
for i = 2, … , n
// all ai become 0 -> no need to compute
// all ci do not change

2. // Backward substitution:

for i = n-1, … , 1
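
A hedged Python sketch of the whole procedure, assuming the common storage scheme with subdiagonal a, diagonal d, superdiagonal c, and right-hand side b (these names are assumptions; the slide's update formulas are not reproduced verbatim):

def solve_tridiagonal(a, d, c, b):
    """Solve a tridiagonal system in O(n); a[i], d[i], c[i] are the sub-, main and
    superdiagonal entries of row i (a[0] and c[n-1] unused). Sketch without pivoting."""
    n = len(d)
    d = d[:]
    b = b[:]                                 # work on copies

    # forward elimination: the a[i] become 0, the c[i] do not change
    for i in range(1, n):
        m = a[i] / d[i - 1]                  # Gaussian multiplier
        d[i] -= m * c[i - 1]
        b[i] -= m * b[i - 1]

    # backward substitution
    x = [0.0] * n
    x[n - 1] = b[n - 1] / d[n - 1]
    for i in range(n - 2, -1, -1):
        x[i] = (b[i] - c[i] * x[i + 1]) / d[i]
    return x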

Remarks

• The time complexity becomes Θ(n).

• The method can be generalized to any banded system.

• The time complexity stays Θ(n) for fixed bandwidth k.

2.4 LU Decomposition

Motivation

• In many applications, one does not have to solve a linear equation system for just one object, but for a large number of objects.
• If the system is not banded, but has a fully populated matrix, can we still make the computations faster than Θ(n³)?
• When solving a system for many objects, only the
right-hand side changes.
• Hence, many computations stay the same.
• We can make use of this.
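This reuse is exactly what an LU decomposition provides; a minimal sketch with SciPy (the library choice is ours, not the slides') factors once and then solves for many right-hand sides cheaply:

import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))      # one fixed coefficient matrix
B = rng.standard_normal((500, 20))       # 20 different right-hand sides

lu, piv = lu_factor(A)                   # Theta(n^3) work, done only once
X = np.column_stack([lu_solve((lu, piv), B[:, j]) for j in range(B.shape[1])])
# each lu_solve call costs only Theta(n^2)

print(np.allclose(A @ X, B))             # -> True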

Example
• In forward elimination, we execute the first step

• This can be written in the form

with

Example
• After this first step, we execute a second step

• This is equivalent to

with

Example
• After this second step, we execute a third step

• This is equivalent to

with

Upper triangular matrix

• Overall, we obtained
M₃M₂M₁Ax = M₃M₂M₁b
with U := M₃M₂M₁A being an upper triangular matrix.

• Here,

• U is the result of the forward elimination procedure.
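Since the slide's concrete 4x4 example is not reproduced here, a generic NumPy sketch shows how such elimination matrices can be built and that their product with A is upper triangular (the function name and the random test matrix are our own illustration):

import numpy as np

def elimination_matrix(A, k):
    """Return M_k, the unit lower triangular matrix that zeroes the entries
    below the diagonal in column k of A (0-based indexing)."""
    n = A.shape[0]
    M = np.eye(n)
    M[k + 1:, k] = -A[k + 1:, k] / A[k, k]   # one entry per row below the pivot
    return M

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

M1 = elimination_matrix(A, 0)
M2 = elimination_matrix(M1 @ A, 1)
M3 = elimination_matrix(M2 @ M1 @ A, 2)
U = M3 @ M2 @ M1 @ A                         # upper triangular
print(np.allclose(np.tril(U, -1), 0.0))      # -> True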

Lower triangular matrix
• Moreover, from U = M₃M₂M₁A we get
A = (M₃M₂M₁)⁻¹U = M₁⁻¹M₂⁻¹M₃⁻¹U
with L := M₁⁻¹M₂⁻¹M₃⁻¹ being a lower triangular matrix.
• Here,

Remarks
• L is a lower triangular matrix with entries 1 on the
diagonal.
• Because of their simple structure, the inverse of each of the matrices M₁, M₂, and M₃ is just the same matrix with all non-zero entries below the diagonal replaced by their additive inverses.
• Moreover, because of their simple structure, the product of the three inverse matrices is obtained by simply adding, component-wise, their non-zero entries below the diagonal.
• Hence, L can be directly retrieved from the forward
elimination step without further computations. The
entries are the negative Gaussian multipliers.
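A short sketch (again our own NumPy illustration) of recording these ratios during forward elimination, so that L is obtained as a by-product and A = LU can be verified:

import numpy as np

def lu_no_pivoting(A):
    """LU factorization without pivoting (sketch; assumes all pivots are nonzero).
    The ratio used to eliminate row i with pivot row k is stored directly in L[i, k]."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # retrieved for free during elimination
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
L, U = lu_no_pivoting(A)
print(np.allclose(L @ U, A))                 # -> True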

