In the previous lecture, the optimization of functions of multiple variables subject to equality constraints was dealt with using the method of constrained variation and the method of Lagrange multipliers. In this lecture the Kuhn-Tucker conditions, which characterize when a point is a local optimum of a function subject to inequality constraints, will be discussed with examples.
Kuhn-Tucker Conditions

It was previously established that, for both an unconstrained optimization problem and an optimization problem with an equality constraint, the first-order conditions are sufficient for a global optimum when the objective and constraint functions satisfy appropriate concavity/convexity conditions. The same is true for an optimization problem with inequality constraints. The Kuhn-Tucker conditions are both necessary and sufficient if the objective function is convex and each constraint is linear or each constraint function is convex, i.e. the problem belongs to the class called convex programming problems. Consider the following optimization problem:

Minimize \( f(X) \) subject to \( g_j(X) \le 0 \) for \( j = 1, 2, \ldots, m \); where \( X = [x_1 \; x_2 \; \ldots \; x_n] \).

Then the Kuhn-Tucker conditions for \( X^* = [x_1^* \; x_2^* \; \ldots \; x_n^*] \) to be a local minimum are
\[ \frac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} = 0, \qquad i = 1, 2, \ldots, n \]
\[ \lambda_j g_j = 0, \qquad j = 1, 2, \ldots, m \qquad (1) \]
\[ g_j \le 0, \qquad j = 1, 2, \ldots, m \]
\[ \lambda_j \ge 0, \qquad j = 1, 2, \ldots, m \]
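As an illustrative sketch (not part of the original lecture), the four conditions in (1) can be checked mechanically for a candidate point once the gradients are available. The following Python snippet, with hypothetical helper names, tests a candidate pair (x, λ) against the minimization form with \( g_j(X) \le 0 \) and \( \lambda_j \ge 0 \):

```python
# Sketch: check the Kuhn-Tucker conditions (1) for a candidate point.
# Assumes the minimization form: min f(x) s.t. g_j(x) <= 0, lambda_j >= 0.
# All names here (kkt_holds, grad_f, grad_gs, ...) are illustrative.

def kkt_holds(grad_f, grad_gs, gs, x, lam, tol=1e-8):
    """Return True if (x, lam) satisfies all four conditions in (1)."""
    n, m = len(x), len(lam)
    # Stationarity: df/dxi + sum_j lam_j * dgj/dxi = 0 for each i
    for i in range(n):
        r = grad_f(x)[i] + sum(lam[j] * grad_gs[j](x)[i] for j in range(m))
        if abs(r) > tol:
            return False
    for j in range(m):
        if abs(lam[j] * gs[j](x)) > tol:   # complementary slackness
            return False
        if gs[j](x) > tol:                 # primal feasibility
            return False
        if lam[j] < -tol:                  # dual feasibility
            return False
    return True

# Demonstration with the data of Example (1) below:
f_grad  = lambda x: [2*x[0], 4*x[1], 6*x[2]]
g1      = lambda x: x[0] - x[1] - 2*x[2] - 12
g2      = lambda x: x[0] + 2*x[1] - 3*x[2] - 8
g1_grad = lambda x: [1, -1, -2]
g2_grad = lambda x: [1, 2, -3]

print(kkt_holds(f_grad, [g1_grad, g2_grad], [g1, g2],
                x=[0, 0, 0], lam=[0, 0]))   # True
```

A point that is not stationary, e.g. x = [1, 0, 0] with λ = [0, 0], fails the first check and returns False.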
M2L5
In the case of minimization problems, if the constraints are of the form \( g_j(X) \ge 0 \), then the \( \lambda_j \) have to be nonpositive in (1). On the other hand, if the problem is one of maximization with the constraints in the form \( g_j(X) \ge 0 \), then the \( \lambda_j \) have to be nonnegative.

Example (1)
Minimize \( f = x_1^2 + 2x_2^2 + 3x_3^2 \) subject to the constraints

\[ g_1 = x_1 - x_2 - 2x_3 \le 12 \]
\[ g_2 = x_1 + 2x_2 - 3x_3 \le 8 \]

using Kuhn-Tucker conditions.

Solution: Writing the constraints as \( g_1 = x_1 - x_2 - 2x_3 - 12 \le 0 \) and \( g_2 = x_1 + 2x_2 - 3x_3 - 8 \le 0 \), the Kuhn-Tucker conditions are given by

a) \( \dfrac{\partial f}{\partial x_i} + \lambda_1 \dfrac{\partial g_1}{\partial x_i} + \lambda_2 \dfrac{\partial g_2}{\partial x_i} = 0, \quad i = 1, 2, 3 \), i.e.,

\[ 2x_1 + \lambda_1 + \lambda_2 = 0 \qquad (2) \]
\[ 4x_2 - \lambda_1 + 2\lambda_2 = 0 \qquad (3) \]
\[ 6x_3 - 2\lambda_1 - 3\lambda_2 = 0 \qquad (4) \]

b) \( \lambda_j g_j = 0 \), i.e.,

\[ \lambda_1 (x_1 - x_2 - 2x_3 - 12) = 0 \qquad (5) \]
\[ \lambda_2 (x_1 + 2x_2 - 3x_3 - 8) = 0 \qquad (6) \]

c) \( g_j \le 0 \), i.e.,

\[ x_1 - x_2 - 2x_3 - 12 \le 0 \qquad (7) \]
\[ x_1 + 2x_2 - 3x_3 - 8 \le 0 \qquad (8) \]

d) \( \lambda_j \ge 0 \), i.e.,

\[ \lambda_1 \ge 0 \qquad (9) \]
\[ \lambda_2 \ge 0 \qquad (10) \]
From (5), either \( \lambda_1 = 0 \) or \( x_1 - x_2 - 2x_3 - 12 = 0 \).

Case 1: \( \lambda_1 = 0 \). From (2), (3) and (4) we have \( x_1 = x_2 = -\lambda_2/2 \) and \( x_3 = \lambda_2/2 \). Using these in (6) we get \( 3\lambda_2^2 + 8\lambda_2 = 0 \), so \( \lambda_2 = 0 \) or \( -8/3 \). From (10), \( \lambda_2 \ge 0 \); therefore \( \lambda_2 = 0 \), giving X* = [0, 0, 0]. This solution set satisfies all of (7) to (10).

Case 2: \( x_1 - x_2 - 2x_3 - 12 = 0 \). From (2), (3) and (4), \( x_1 = -(\lambda_1 + \lambda_2)/2 \), \( x_2 = (\lambda_1 - 2\lambda_2)/4 \) and \( x_3 = (2\lambda_1 + 3\lambda_2)/6 \). Substituting these,

\[ -\frac{\lambda_1 + \lambda_2}{2} - \frac{\lambda_1 - 2\lambda_2}{4} - \frac{2\lambda_1 + 3\lambda_2}{3} - 12 = 0, \quad \text{or} \quad 17\lambda_1 + 12\lambda_2 = -144. \]

But conditions (9) and (10) give us \( \lambda_1 \ge 0 \) and \( \lambda_2 \ge 0 \) simultaneously, which is impossible with \( 17\lambda_1 + 12\lambda_2 = -144 \). Hence the solution set for this optimization problem is X* = [0 0 0].
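The case analysis above can also be automated: by complementary slackness, (5) and (6), each constraint is either inactive (\( \lambda_j = 0 \)) or active (\( g_j = 0 \)), so one can enumerate every active set, solve the resulting linear system in \( (x_1, x_2, x_3, \lambda_1, \lambda_2) \), and keep only solutions satisfying (7)–(10). A self-contained Python sketch (all names illustrative, not from the lecture):

```python
from itertools import combinations

def solve_linear(A, b):
    """Gauss-Jordan elimination with partial pivoting; None if singular."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        if abs(M[p][c]) < 1e-12:
            return None
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b_ for a, b_ in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Unknowns: [x1, x2, x3, lam1, lam2].  Stationarity rows (2)-(4):
stat = [([2, 0, 0,  1,  1], 0),
        ([0, 4, 0, -1,  2], 0),
        ([0, 0, 6, -2, -3], 0)]
# Rows for g_j = 0 (active) and lam_j = 0 (inactive):
g_rows   = [([1, -1, -2, 0, 0], 12), ([1, 2, -3, 0, 0], 8)]
lam_rows = [([0, 0, 0, 1, 0], 0),    ([0, 0, 0, 0, 1], 0)]

kkt_points = []
for k in range(3):
    for active in combinations(range(2), k):
        rows = stat + [g_rows[j] if j in active else lam_rows[j]
                       for j in range(2)]
        sol = solve_linear([r[0] for r in rows], [r[1] for r in rows])
        if sol is None:
            continue
        x1, x2, x3, l1, l2 = sol
        feas = (x1 - x2 - 2*x3 - 12 <= 1e-9 and
                x1 + 2*x2 - 3*x3 - 8 <= 1e-9)          # (7), (8)
        if feas and l1 >= -1e-9 and l2 >= -1e-9:       # (9), (10)
            kkt_points.append([round(v, 6) for v in (x1, x2, x3)])

print(kkt_points)   # [[0.0, 0.0, 0.0]]
```

Only the all-inactive case survives, confirming X* = [0, 0, 0]: the single-active and both-active cases produce a negative multiplier, exactly as in the hand derivation.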
Example (2)

Minimize \( f = x_1^2 + x_2^2 + 60x_1 \) subject to the constraints

\[ g_1 = x_1 - 80 \ge 0 \]
\[ g_2 = x_1 + x_2 - 120 \ge 0 \]

using Kuhn-Tucker conditions.

Solution: The Kuhn-Tucker conditions are given by

a) \( \dfrac{\partial f}{\partial x_i} + \lambda_1 \dfrac{\partial g_1}{\partial x_i} + \lambda_2 \dfrac{\partial g_2}{\partial x_i} = 0, \quad i = 1, 2 \), i.e.,

\[ 2x_1 + 60 + \lambda_1 + \lambda_2 = 0 \qquad (11) \]
\[ 2x_2 + \lambda_2 = 0 \qquad (12) \]

b) \( \lambda_j g_j = 0 \), i.e.,

\[ \lambda_1 (x_1 - 80) = 0 \qquad (13) \]
\[ \lambda_2 (x_1 + x_2 - 120) = 0 \qquad (14) \]

c) \( g_j \ge 0 \), i.e.,

\[ x_1 - 80 \ge 0 \qquad (15) \]
\[ x_1 + x_2 - 120 \ge 0 \qquad (16) \]

d) \( \lambda_j \le 0 \) (minimization with constraints of the form \( g_j \ge 0 \)), i.e.,

\[ \lambda_1 \le 0 \qquad (17) \]
\[ \lambda_2 \le 0 \qquad (18) \]
From (13), either \( \lambda_1 = 0 \) or \( x_1 - 80 = 0 \).

Case 1: \( \lambda_1 = 0 \). From (11) and (12) we have \( x_1 = -30 - \lambda_2/2 \) and \( x_2 = -\lambda_2/2 \). Using these in (14) we get \( \lambda_2 (\lambda_2 + 150) = 0 \); so \( \lambda_2 = 0 \) or \( -150 \). Considering \( \lambda_2 = 0 \), X* = [\( -30 \), 0]. But this solution set violates (15) and (16). For \( \lambda_2 = -150 \), X* = [45, 75]. But this solution set violates (15).

Case 2: \( x_1 - 80 = 0 \). Using \( x_1 = 80 \) in (11) and (12), we have

\[ \lambda_2 = -2x_2, \qquad \lambda_1 = 2x_2 - 220 \qquad (19) \]

Substituting (19) in (14), we have

\[ -2x_2 (x_2 - 40) = 0. \]

For this to be true, either \( x_2 = 0 \) or \( x_2 - 40 = 0 \). For \( x_2 = 0 \), \( \lambda_1 = -220 \) and \( \lambda_2 = 0 \), giving X* = [80, 0]. This solution set violates (16).
Optimization Methods: Optimization using Calculus, Kuhn-Tucker Conditions

For \( x_2 - 40 = 0 \), i.e. \( x_2 = 40 \), we get \( \lambda_1 = -140 \) and \( \lambda_2 = -80 \). This solution set satisfies all of (15) to (18) and is hence the desired one. Therefore, the solution set for this optimization problem is X* = [80 40].
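As a closing check (an illustrative sketch, not from the lecture), the point X* = [80, 40] with \( \lambda_1 = -140 \), \( \lambda_2 = -80 \) can be verified numerically against (11)–(18), assuming the objective \( f = x_1^2 + x_2^2 + 60x_1 \) inferred from equations (11), (12) and (19):

```python
# Sketch verifying Example (2)'s KKT point.  The objective below is an
# assumption inferred from (11), (12) and (19): f = x1^2 + x2^2 + 60*x1,
# with constraints g1 = x1 - 80 >= 0 and g2 = x1 + x2 - 120 >= 0.
import random

f  = lambda x1, x2: x1**2 + x2**2 + 60*x1
g1 = lambda x1, x2: x1 - 80
g2 = lambda x1, x2: x1 + x2 - 120

x1s, x2s = 80, 40
l1, l2 = -140, -80

assert 2*x1s + 60 + l1 + l2 == 0                      # (11) stationarity
assert 2*x2s + l2 == 0                                # (12) stationarity
assert l1 * g1(x1s, x2s) == 0                         # (13) compl. slackness
assert l2 * g2(x1s, x2s) == 0                         # (14) compl. slackness
assert g1(x1s, x2s) >= 0 and g2(x1s, x2s) >= 0        # (15), (16) feasibility
assert l1 <= 0 and l2 <= 0                            # (17), (18) sign

# Spot-check optimality: random feasible points never beat f(X*)
random.seed(0)
best = f(x1s, x2s)
for _ in range(10000):
    a = 80 + random.uniform(0, 200)                   # enforce x1 >= 80
    b = (120 - a) + random.uniform(0, 200)            # enforce x1 + x2 >= 120
    assert f(a, b) >= best - 1e-9

print("X* = [80, 40] satisfies (11)-(18); f(X*) =", best)
```

The spot-check is consistent with the convexity argument from the start of the lecture: with a convex objective and a convex feasible region, the KKT point is a global minimum.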