
THE UNIVERSITY OF NEW SOUTH WALES

SCHOOL OF MATHEMATICS AND STATISTICS


SESSION 1, 2012
MATH3161/MATH5165
Optimization
(1) TIME ALLOWED 2 hours
(2) TOTAL NUMBER OF QUESTIONS 3
(3) ANSWER ALL QUESTIONS
(4) THE QUESTIONS ARE NOT OF EQUAL VALUE
(5) ALL STUDENTS MAY ATTEMPT ALL QUESTIONS. MARKS GAINED
ON ANY QUESTION WILL BE COUNTED. GRADES OF PASS AND
CREDIT CAN BE GAINED BY SATISFACTORY PERFORMANCE ON
UNSTARRED QUESTIONS.
GRADES OF DISTINCTION AND HIGH DISTINCTION WILL REQUIRE
SATISFACTORY PERFORMANCE ON ALL QUESTIONS, INCLUDING
STARRED QUESTIONS
(6) THIS PAPER MAY BE RETAINED BY THE CANDIDATE
(7) ONLY CALCULATORS WITH AN AFFIXED UNSW APPROVED STICKER
MAY BE USED
All answers must be written in ink. Except where they are expressly required, pencils
may only be used for drawing, sketching or graphical work.
1. [25 marks]
i) Consider the following equality constrained optimization problem

$$(P_1) \qquad \min_{x \in \mathbb{R}^3} \; x_1^2 x_2 x_3 + 1 \quad \text{s.t.} \quad 4x_1^2 + x_2^2 = 16, \qquad 2x_2 + 3x_3 = 25.$$

Let $x^* = [0,\ 4,\ 17/3]^T$.
a) Write the problem $(P_1)$ in standard form.
b) Show that the point $x^*$ is a regular feasible point for $(P_1)$.
c) Verify that the point $x^*$ is a constrained stationary point for $(P_1)$.
d) Evaluate the Hessian of the Lagrangian at the point $x^*$.
e)* Using second-order sufficient optimality conditions, determine whether
or not $x^*$ is a strict local minimizer for $(P_1)$.
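Parts b)–d) can be checked numerically. The sketch below (not part of the exam; it assumes the objective and constraints of $(P_1)$ as stated above) verifies feasibility, regularity, and stationarity of $x^*$:

```python
import numpy as np

# Problem data for (P1): f(x) = x1^2*x2*x3 + 1,
# c1(x) = 4x1^2 + x2^2 - 16, c2(x) = 2x2 + 3x3 - 25.
x_star = np.array([0.0, 4.0, 17.0 / 3.0])

def grad_f(x):
    x1, x2, x3 = x
    return np.array([2 * x1 * x2 * x3, x1**2 * x3, x1**2 * x2])

def constraints(x):
    x1, x2, x3 = x
    return np.array([4 * x1**2 + x2**2 - 16, 2 * x2 + 3 * x3 - 25])

def jac_c(x):
    x1, x2, x3 = x
    return np.array([[8 * x1, 2 * x2, 0.0],
                     [0.0, 2.0, 3.0]])

# Feasibility: both constraints vanish at x*.
print(np.allclose(constraints(x_star), 0.0))   # True

# Regularity: the two constraint gradients are linearly independent.
A = jac_c(x_star)
print(np.linalg.matrix_rank(A))                # 2

# Stationarity: grad f(x*) = A^T lambda has a solution
# (here grad f(x*) = 0, so lambda = [0, 0] works).
lam, *_ = np.linalg.lstsq(A.T, grad_f(x_star), rcond=None)
print(np.allclose(A.T @ lam, grad_f(x_star)))  # True
```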
ii) Consider the following inequality constrained optimization problem

$$(P_2) \qquad \min_{x \in \mathbb{R}^n} \; a^T x \quad \text{s.t.} \quad 1 - x^T Q x \ge 0,$$

where $n \ge 2$, $Q$ is a symmetric and positive definite $n \times n$ matrix,
$a \in \mathbb{R}^n$, $a \ne 0$ and the constraint function $c(x) = 1 - x^T Q x$.
a) Write down the gradient $\nabla c(x)$ and the Hessian $\nabla^2 c(x)$.
b) Stating clearly any results that you use, show that $c(x)$ is a concave
function.
c) Show that $(P_2)$ is a convex optimization problem.
d) Show that $x^* = -\dfrac{Q^{-1}a}{\sqrt{a^T Q^{-1} a}}$ is a constrained stationary point for $(P_2)$.
e) Determine whether $x^* = -\dfrac{Q^{-1}a}{\sqrt{a^T Q^{-1} a}}$ is a local minimizer, global minimizer or neither for the problem $(P_2)$.
f) Write down the Wolfe dual maximization problem for $(P_2)$.
g) What is the maximum value of the Wolfe dual maximization problem
in part f)? Give reasons for your answer. [NOTE: You do not need
to solve the dual problem.]
h)* Suppose the constraint in $(P_2)$ becomes an equality constraint, so
the new problem $(P_2')$ has the single equality constraint $c(x) = 0$.
Establish rigorously whether the new feasible region

$$\Omega := \{x \in \mathbb{R}^n : x^T Q x - 1 = 0\}$$

is convex or not.
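A quick sanity check of part d) on a random instance (a sketch, not part of the exam; it assumes the stationary point has the form $x^* = -Q^{-1}a/\sqrt{a^T Q^{-1} a}$ with multiplier $\lambda = \tfrac{1}{2}\sqrt{a^T Q^{-1} a} \ge 0$):

```python
import numpy as np

# Random symmetric positive definite Q and nonzero a (sample data only).
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
Q = B @ B.T + n * np.eye(n)
a = rng.standard_normal(n)

Qinv_a = np.linalg.solve(Q, a)
s = a @ Qinv_a                        # a^T Q^{-1} a > 0 since Q is SPD
x_star = -Qinv_a / np.sqrt(s)

# Constraint is active at x*: c(x*) = 1 - x*^T Q x* = 0.
c = 1 - x_star @ Q @ x_star
print(abs(c) < 1e-9)                  # True

# KKT stationarity: a = lambda * grad c(x*) with grad c(x) = -2 Q x
# and lambda = sqrt(s)/2 >= 0.
lam = np.sqrt(s) / 2
print(np.allclose(a, lam * (-2 * Q @ x_star)))  # True
```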
2. [25 marks] Consider minimizing the strictly convex quadratic function

$$q(x) = \tfrac{1}{2} x^T G x + d^T x + c,$$

where $G$ is an $(n \times n)$ symmetric positive definite constant matrix, $d$ is a
constant $n \times 1$ vector and $c$ is a scalar. Let $x^{(1)}$ be the starting point, where
$\nabla q(x^{(1)}) \ne 0$, $x^{(1)} \ne x^*$ and $x^*$ is the minimizer of $q(x)$.
i) Consider applying Newton's method to the quadratic function $q(x)$.
a) Write down the Newton direction $s_N^{(1)}$ at $x^{(1)}$.
b) Show that the Newton direction $s_N^{(1)}$ is a descent direction at $x^{(1)}$.
c) Show that Newton's method terminates in one iteration.
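Parts i)b)–c) can be illustrated numerically. The sketch below uses arbitrary sample data for $G$ and $d$ (not from the exam) to show that one Newton step from any starting point lands on the minimizer of a strictly convex quadratic:

```python
import numpy as np

# Sample strictly convex quadratic: q(x) = 0.5 x^T G x + d^T x + c.
G = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
d = np.array([1.0, -2.0])

def grad_q(x):
    return G @ x + d

x1 = np.array([5.0, -7.0])               # any starting point with grad != 0
s_newton = -np.linalg.solve(G, grad_q(x1))   # Newton direction -G^{-1} grad
x2 = x1 + s_newton                       # one full Newton step

# Descent: s_N^T grad q(x1) = -grad^T G^{-1} grad < 0 since G is SPD.
print(s_newton @ grad_q(x1) < 0)         # True

# Termination in one iteration: the gradient vanishes at x2.
print(np.allclose(grad_q(x2), 0.0))      # True
```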
ii) Consider applying the steepest descent method with exact line searches to
the quadratic function $q(x)$. Suppose that $x^{(k)}$ and $x^{(k+1)}$ are two consecutive
points generated by the steepest descent method, where $\nabla q(x^{(k)}) \ne 0$.
a) Write down the steepest descent direction $s_D^{(k)}$ at $x^{(k)}$.
b) Confirm that $s_D^{(k)}$ is a descent direction.
c) Write down the line search condition that must be satisfied by the
exact minimizer $\alpha^{(k)}$ of $\varphi(\alpha) = q(x^{(k)} + \alpha s_D^{(k)})$.
d)* Hence or otherwise show that $\nabla q(x^{(k+1)})^T \nabla q(x^{(k)}) = 0$.
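The orthogonality in part ii)d) is easy to observe numerically. This sketch (not part of the exam) runs a few exact-line-search steepest descent steps on the quadratic with the Hessian and gradient data that appear in part iii) below, using the standard step $\alpha = g^T g / (s^T G s)$ for a quadratic:

```python
import numpy as np

# Quadratic with grad q(x) = G x + d.
G = np.array([[3.0, -1.0], [-1.0, 1.0]])
d = np.array([-2.0, 0.0])

def grad_q(x):
    return G @ x + d

x = np.array([0.0, 0.0])
for _ in range(3):
    g = grad_q(x)
    s = -g                                # steepest descent direction
    alpha = (g @ g) / (s @ G @ s)         # exact minimizer of q(x + alpha*s)
    x_next = x + alpha * s
    # Consecutive gradients are orthogonal under exact line searches:
    print(round(grad_q(x_next) @ g, 10))  # 0.0
    x = x_next
```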
iii) It is given that the gradient of $q(x)$ is

$$\nabla q(x) = \begin{bmatrix} 3x_1 - x_2 - 2 \\ -x_1 + x_2 \end{bmatrix}.$$

a) Write down the Hessian matrix $G$ of $q(x)$.
b) Find the condition number of $G$.
c)* Given that $\|x^{(1)} - x^*\| \le 1$, estimate the least number of iterations
of the steepest descent method with exact line searches required to
get $\|x^{(k+1)} - x^*\| \le 10^{-10}$.
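Parts a)–c) can be checked as follows. This is a sketch, not the exam's intended working: it uses the classical $(\kappa-1)/(\kappa+1)$ linear convergence rate for steepest descent, and the exam may intend a particular norm in which that rate holds.

```python
import numpy as np

# Hessian read off from the gradient above.
G = np.array([[3.0, -1.0], [-1.0, 1.0]])

eigs = np.linalg.eigvalsh(G)              # ascending: 2 - sqrt(2), 2 + sqrt(2)
kappa = eigs[-1] / eigs[0]                # condition number
print(np.isclose(kappa, 3 + 2 * np.sqrt(2)))   # True

rate = (kappa - 1) / (kappa + 1)          # classical linear rate; = 1/sqrt(2)
print(np.isclose(rate, 1 / np.sqrt(2)))   # True

# Least k with rate**k <= 1e-10, starting from an error of at most 1:
k = int(np.ceil(np.log(1e-10) / np.log(rate)))
print(k)                                  # 67
```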
d) Consider applying a conjugate gradient method with the Fletcher–
Reeves updating formula to the quadratic function $q(x)$. It is given
that

$$x^{(2)} = \frac{1}{17}\begin{bmatrix} 26 \\ 38 \end{bmatrix} \quad \text{and} \quad s^{(2)} = \frac{30}{17^2}\begin{bmatrix} -3 \\ -7 \end{bmatrix},$$

where $x^{(2)}$ is the current iterate and $s^{(2)}$ is the current search direction.
$\alpha$) Show that $\alpha^{(2)} = \frac{17}{10}$ is the exact minimizer of $q(x^{(2)} + \alpha s^{(2)})$.
$\beta$) Find $x^{(3)}$ using exact arithmetic.
$\gamma$) Is $x^{(3)}$ a minimizer of $q(x)$? Give reasons for your answer.
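An exact-arithmetic check of parts $\alpha$)–$\gamma$) (a sketch, not part of the exam; it assumes the gradient $\nabla q(x) = Gx + d$ with $G$ and $d$ read off from part iii)):

```python
from fractions import Fraction as F

# Hessian and linear term from part iii): grad q(x) = G x + d.
G = [[F(3), F(-1)], [F(-1), F(1)]]
d = [F(-2), F(0)]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

x2 = [F(26, 17), F(38, 17)]
s2 = [F(-90, 289), F(-210, 289)]          # (30/17^2) * [-3, -7]

g2 = [a + b for a, b in zip(matvec(G, x2), d)]   # grad q(x^(2))
alpha2 = -dot(g2, s2) / dot(s2, matvec(G, s2))   # exact line search step
print(alpha2)                                     # 17/10

x3 = [a + alpha2 * b for a, b in zip(x2, s2)]
print(x3[0], x3[1])                               # 1 1

g3 = [a + b for a, b in zip(matvec(G, x3), d)]    # grad q(x^(3))
print(g3[0], g3[1])                               # 0 0
```

Since the gradient vanishes at $x^{(3)}$ and $q$ is strictly convex, $x^{(3)}$ is the unique minimizer, consistent with CG terminating in at most $n = 2$ steps on a quadratic.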
3. [15 marks]
Consider the optimal control problem

$$\text{minimize} \quad \int_0^{t_1} (x_1 + u_1^2)\, dt$$
$$\text{subject to} \quad \dot{x}_1 = x_1 + u_1, \qquad x_1(0) = 4, \quad x_1(t_1) = 0,$$

where $u_1 \in U_u$, the unrestricted control set, and $t_1 > 0$ is free.
i) Write down the Hamiltonian function $H$ for this problem. [You may
assume the problem is normal and set the co-state variable $z_0 = -1$.]
ii) Write down the differential equation for the co-state $z_1$ and find $z_1$ as a
function of $t$.
iii) State clearly the Pontryagin Maximum Principle conditions that an optimal
solution satisfies, including a statement about the value of $H$ along
the optimal path.
iv)* Assuming that a solution exists, apply the Pontryagin Maximum Principle
conditions to find the optimal state $x_1^*(t)$, the optimal control $u_1^*(t)$
and the optimal time $t_1^*$.
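A numerical sketch of the co-state calculation in part ii), under one common convention that is an assumption here: $H = z_0(x_1 + u_1^2) + z_1(x_1 + u_1)$ with $z_0 = -1$, which gives $\dot{z}_1 = -\partial H/\partial x_1 = 1 - z_1$ and hence $z_1(t) = 1 + Ce^{-t}$; maximizing $H$ over the unrestricted $u_1$ gives $u_1^* = z_1/2$.

```python
import numpy as np

# Candidate co-state solution z1(t) = 1 + C*exp(-t) for z1' = 1 - z1.
C = 2.0                                   # arbitrary constant for the check
t = np.linspace(0.0, 3.0, 2001)
z1 = 1.0 + C * np.exp(-t)

# Verify z1' = 1 - z1 by central differences on interior points.
dz = np.gradient(z1, t)
print(np.allclose(dz[1:-1], (1.0 - z1)[1:-1], atol=1e-4))   # True

# Stationarity of H in u1: dH/du1 = -2*u1 + z1 = 0  =>  u1* = z1/2.
u_star = z1 / 2
print(np.allclose(-2 * u_star + z1, 0.0))                   # True
```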