
Uploaded by MahmoudAbdulGalil

Lecture notes of Optimization course in Zewail City


Classical Optimization Techniques

Inequality Constraints

Giza, Egypt, Spring 2018

aabdelsamea@zewailcity.edu.eg

Multivariable Optimization with Equality Constraints

Solution by the method of Lagrange Multipliers

Joseph-Louis Lagrange (1736–1813) was an Italian-born French mathematician and astronomer. He made significant contributions to the fields of analysis, number theory, variational calculus, mathematical physics, and both classical and celestial mechanics.

Problem with 2 variables and 1 constraint (n = 2, m = 1):

\min f(x_1, x_2)
\text{s.t. } g(x_1, x_2) = 0

Construct the Lagrange function L(x_1, x_2, \lambda), where \lambda is the Lagrange multiplier:

L(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda g(x_1, x_2)

The necessary conditions for its extreme point are:

\frac{\partial L}{\partial x_1}(x_1, x_2, \lambda) = \frac{\partial f}{\partial x_1}(x_1, x_2) + \lambda \frac{\partial g}{\partial x_1}(x_1, x_2) = 0

\frac{\partial L}{\partial x_2}(x_1, x_2, \lambda) = \frac{\partial f}{\partial x_2}(x_1, x_2) + \lambda \frac{\partial g}{\partial x_2}(x_1, x_2) = 0

\frac{\partial L}{\partial \lambda}(x_1, x_2, \lambda) = g(x_1, x_2) = 0

Example

\min f(x, y) = k x^{-1} y^{-2}
\text{s.t. } g(x, y) = x^2 + y^2 - a^2 = 0
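As a check, the necessary conditions for this example can be solved symbolically. A minimal sketch using sympy, treating k and a as positive parameters:

```python
import sympy as sp

x, y, lam = sp.symbols("x y lam", real=True)
k, a = sp.symbols("k a", positive=True)

f = k * x**-1 * y**-2
g = x**2 + y**2 - a**2
L = f + lam * g  # Lagrange function

# Necessary conditions: dL/dx = dL/dy = dL/dlam = 0
eqs = [sp.diff(L, v) for v in (x, y, lam)]
sols = sp.solve(eqs, [x, y, lam], dict=True)

# Keep the solution with x, y > 0
opt = [s for s in sols if s[x].is_positive and s[y].is_positive][0]
print(sp.simplify(opt[x] - a/sp.sqrt(3)))                  # 0, i.e. x* = a/sqrt(3)
print(sp.simplify(opt[y] - a*sp.sqrt(sp.Rational(2, 3))))  # 0, i.e. y* = a*sqrt(2/3)
print(sp.simplify(f.subs(opt)))   # minimum value, 3*sqrt(3)*k/(2*a**3)
```

Eliminating \lambda from the first two conditions gives y^2 = 2x^2; substituting into the constraint yields the stated point.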


Problem with 2 variables and 2 constraints (n = 2, m = 2):

\min f(x_1, x_2)
\text{s.t. } g_1(x_1, x_2) = 0, \quad g_2(x_1, x_2) = 0

The necessary condition for (x_1^*, x_2^*), lying on the curve C where both constraints are satisfied, to be an extreme point is:

-\nabla f = \lambda_1 \nabla g_1 + \lambda_2 \nabla g_2

Problem with n variables and m constraints:

\min_{\boldsymbol{x}} f(\boldsymbol{x})
\text{s.t. } g_j(\boldsymbol{x}) = 0, \quad j = 1, 2, \ldots, m

Construct the Lagrange function L with a Lagrange multiplier \lambda_j for each constraint g_j:

L(x_1, x_2, \ldots, x_n, \lambda_1, \lambda_2, \ldots, \lambda_m) = f(\boldsymbol{x}) + \sum_{j=1}^{m} \lambda_j g_j(\boldsymbol{x})

In vector form:

L(\boldsymbol{x}, \boldsymbol{\lambda}) = f(\boldsymbol{x}) + \boldsymbol{\lambda}^T \boldsymbol{g}(\boldsymbol{x}), \quad \boldsymbol{x} \in \mathbb{R}^n, \; \boldsymbol{\lambda}, \boldsymbol{g} \in \mathbb{R}^m

Necessary Conditions for a General Problem (n, m)

By treating L as a function of n + m unknowns:

\frac{\partial L}{\partial x_i} = \frac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n \quad (n \text{ equations})

\frac{\partial L}{\partial \lambda_j} = g_j(\boldsymbol{x}) = 0, \quad j = 1, 2, \ldots, m \quad (m \text{ equations})

Sufficient Conditions for a General Problem (n, m)

Q = d\boldsymbol{x}^T \, \nabla_x^2 L(\boldsymbol{x}^*, \boldsymbol{\lambda}^*) \, d\boldsymbol{x} = \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{\partial^2 L}{\partial x_i \partial x_j}(\boldsymbol{x}^*, \boldsymbol{\lambda}^*) \, dx_i \, dx_j > 0

for all admissible variations dx_i and dx_j (\nabla_x^2 L \succ 0).


Example

Find the dimensions of a cylindrical tin (with top and bottom made of sheet metal) that maximize its volume, given that the total surface area is 24\pi.

Solution

Let x_1 and x_2 denote the radius of the base and the height of the tin, respectively. The problem can be stated as

\max_{\boldsymbol{x}} f(x_1, x_2) = \pi x_1^2 x_2
\text{s.t. } 2\pi x_1^2 + 2\pi x_1 x_2 = 24\pi

The optimum is at x_1^* = 2, x_2^* = 4, \lambda^* = -1, and f^* = 16\pi.

(Check the sufficiency condition: z < 0 for a maximum.)
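The stationary point can be checked symbolically, along with the sign of the quadratic form Q restricted to directions tangent to the constraint. A minimal sketch using sympy:

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)

f = sp.pi * x1**2 * x2                          # volume of the tin
g = 2*sp.pi*x1**2 + 2*sp.pi*x1*x2 - 24*sp.pi    # surface area - 24*pi = 0
L = f + lam * g

# Stationarity of L plus feasibility
sols = sp.solve([sp.diff(L, x1), sp.diff(L, x2), g], [x1, x2, lam], dict=True)
opt = [s for s in sols if s[x1] > 0][0]
print(opt[x1], opt[x2], opt[lam])               # 2 4 -1

# Sufficiency: sign of dX^T (Hessian of L) dX for dX tangent to the constraint
H = sp.hessian(L, (x1, x2)).subs(opt)
grad_g = sp.Matrix([g.diff(x1), g.diff(x2)]).subs(opt)
d = sp.Matrix([-grad_g[1], grad_g[0]])          # tangent: orthogonal to grad g
Q = (d.T * H * d)[0]
print(sp.simplify(Q))                           # -192*pi**3 < 0 -> a maximum
```

The negative quadratic form on the admissible (tangent) direction confirms the point is a constrained maximum, consistent with the z < 0 check above.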

Interpretation of Lagrange Multipliers

Consider the problem with a single equality constraint

\min_{\boldsymbol{x}} f(\boldsymbol{x})
\text{s.t. } g(\boldsymbol{x}) = b, \text{ rewritten as } \tilde{g}(\boldsymbol{x}) = b - g(\boldsymbol{x}) = 0

It can be shown that

df^* = \lambda^* \, db

Thus \lambda^* denotes the sensitivity (rate of change) of f^* w.r.t. b, i.e. the marginal or incremental change in f^* w.r.t. b at \boldsymbol{x}^*. In other words, \lambda^* indicates how tightly the constraint is binding at the optimum point: it measures the effect of a small relaxation or tightening of the constraint on the optimum value of the objective.

Proof

It is required to find the effect of a small change in b on f^*. Since \boldsymbol{x}^* depends on b, the chain rule gives

\frac{d f(\boldsymbol{x}^*(b))}{db} = \nabla_x f^T(\boldsymbol{x}^*(b)) \cdot \nabla_b \boldsymbol{x}^*(b)

But by the Lagrange multiplier condition, \nabla_x f(\boldsymbol{x}^*(b)) = -\lambda^* \nabla_x \tilde{g}(\boldsymbol{x}^*(b)). However, \tilde{g}(\boldsymbol{x}) = b - g(\boldsymbol{x}), so \nabla_x \tilde{g}(\boldsymbol{x}^*(b)) = -\nabla_x g(\boldsymbol{x}^*(b)). Then

\frac{d f(\boldsymbol{x}^*(b))}{db} = \lambda^* \nabla_x g^T(\boldsymbol{x}^*(b)) \cdot \nabla_b \boldsymbol{x}^*(b) = \lambda^* \frac{d g(\boldsymbol{x}^*(b))}{db} = \lambda^* (1) = \lambda^*

since g(\boldsymbol{x}^*(b)) = b along the constraint. Hence

df^* = \lambda^* \, db

The sensitivity of the optimal value f^* w.r.t. the right-hand side b at \boldsymbol{x}^* is

df^* = \lambda^* \, db

where the following physical meaning can be attributed to \lambda^*:

1. \lambda^* > 0: a unit decrease in b is positively valued, since it yields a smaller minimum value of the objective function f. Hence \lambda^* may be interpreted as the marginal gain (further reduction) in f^* due to tightening the constraint. For a unit increase in b, \lambda^* may be thought of as the marginal cost (increase) in f^* due to relaxing the constraint.

2. \lambda^* < 0: the opposite interpretation holds.

3. \lambda^* = 0: a change in b has no effect on f^*; the constraint is not binding at the optimum.

Example

Solve

\max_{\boldsymbol{x}} f(x_1, x_2) = 2x_1 + x_2 + 10
\text{s.t. } x_1 + 2x_2^2 = 3

and find the effect of changing the right-hand side of the constraint on the optimum value f^*.

Solution

The optimum: x_1^* = 2.97, x_2^* = 0.13, \lambda^* = 2, and f^* = 16.07 (z = -6.29).

- If the original constraint is tightened by 1 unit, then db = -1 and df^* = \lambda^* db = -2, so f^* + df^* = 14.07.
- If the original constraint is relaxed by 2 units, then db = 2 and df^* = \lambda^* db = 4, so f^* + df^* = 20.07.
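The sensitivity result df^* = \lambda^* db can be verified by keeping the right-hand side b as a parameter and differentiating the optimal value. A minimal sketch using sympy:

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)
b = sp.symbols("b", positive=True)

f = 2*x1 + x2 + 10
g = x1 + 2*x2**2 - b             # constraint with parametric right-hand side
L = f - lam * g                  # L = f + lam*(b - g(x)), as in the proof

sol = sp.solve([sp.diff(L, x1), sp.diff(L, x2), g], [x1, x2, lam], dict=True)[0]
f_star = sp.expand(f.subs(sol))  # optimal value as a function of b

print(sol[lam])                  # 2  -> lambda*
print(sp.diff(f_star, b))        # 2  -> df*/db equals lambda*
print(f_star.subs(b, 3))         # 257/16 = 16.0625 (reported as 16.07)
```

The derivative of the optimal value with respect to b coincides with the multiplier, exactly as the proof predicts.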

Multivariable Optimization with Inequality Constraints

Karush-Kuhn-Tucker (KKT) conditions

The conditions are named after Harold W. Kuhn (American mathematician, 1925–2014) and Albert W. Tucker (Canadian mathematician, 1905–1995), who first published them in 1951. Later scholars discovered that the necessary conditions for this problem had been stated by William Karush (American mathematician, 1917–1997) in his master's thesis in 1939.

Consider the problem

\min_{\boldsymbol{x}} f(\boldsymbol{x})
\text{s.t. } g_j(\boldsymbol{x}) \le 0, \quad j \in J = \{1, 2, \ldots, m\}

The necessary conditions to be satisfied at a relative minimum of f are

\frac{\partial f}{\partial x_i} + \sum_{j \in J_1} \lambda_j \frac{\partial g_j}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n

\lambda_j > 0, \quad j \in J_1

where J_1 \subseteq J is the set of active constraints.

Feasible directions

A vector \boldsymbol{S} is called a feasible direction from a point \boldsymbol{x} if at least a small step can be taken along \boldsymbol{S} without immediately leaving the feasible region. Thus, for problems with sufficiently smooth constraint surfaces, a vector \boldsymbol{S} satisfying

\boldsymbol{S}^T \nabla g_j < 0

for every active constraint g_j is a feasible direction.

In the limiting case, the direction may be tangent to an active constraint surface, so a feasible direction need only satisfy

\boldsymbol{S}^T \nabla g_j \le 0

At the optimum, the KKT condition

-\nabla f = \sum_{j \in J_1} \lambda_j \nabla g_j

implies that, for positive \lambda's, we will not be able to find any direction in the feasible domain along which the function value can be decreased further.

Notes

o The KKT condition in vector form is

-\nabla f = \sum_{j \in J_1} \lambda_j \nabla g_j

It indicates that the negative gradient of the objective function can be expressed as a linear combination of the gradients of the active constraints.

o These conditions are, in general, not sufficient to ensure a relative (local) minimum.

o For convex programming problems, the KKT conditions are necessary and sufficient for a global minimum.

o For a relative maximum, \lambda_j < 0, \; j \in J_1.

Notes (continued)

o If the set of active constraints is not known, the KKT conditions are

\frac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n

\lambda_j g_j = 0, \quad j = 1, 2, \ldots, m

g_j(\boldsymbol{x}) \le 0, \quad j = 1, 2, \ldots, m

\lambda_j \ge 0, \quad j = 1, 2, \ldots, m

o If the problem is one of maximization or if the constraints are of the type g_j(\boldsymbol{x}) \ge 0, the \lambda_j have to be non-positive.

o If the problem is one of maximization with constraints of the type g_j(\boldsymbol{x}) \ge 0, the \lambda_j have to be non-negative.
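When the active set is not known, a standard approach is to enumerate the cases of the complementary-slackness conditions \lambda_j g_j = 0. A minimal sketch using sympy, on a hypothetical problem min x_1^2 + x_2^2 s.t. x_1 + x_2 \ge 4 (rewritten in the standard form g = 4 - x_1 - x_2 \le 0):

```python
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)

# Hypothetical illustration: min x1^2 + x2^2  s.t.  x1 + x2 >= 4
f = x1**2 + x2**2
g = 4 - x1 - x2                              # standard form g(x) <= 0

stationarity = [sp.diff(f + lam * g, v) for v in (x1, x2)]

candidates = []
# Case 1 (constraint inactive): lam = 0
for s in sp.solve(stationarity + [lam], [x1, x2, lam], dict=True):
    if g.subs(s) <= 0:                       # must still be feasible
        candidates.append(s)
# Case 2 (constraint active): g = 0
for s in sp.solve(stationarity + [g], [x1, x2, lam], dict=True):
    if s[lam] >= 0:                          # multiplier sign condition
        candidates.append(s)

best = min(candidates, key=lambda s: f.subs(s))
print(best[x1], best[x2], best[lam], f.subs(best))   # 2 2 4 8
```

Here the inactive case gives (0, 0), which violates the constraint, so only the active case survives; with m constraints this enumeration grows as 2^m, which is why it is only practical for small problems.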

Example

Consider the following optimization problem:

\min_{\boldsymbol{x}} f(x_1, x_2) = x_1^2 + x_2^2
\text{s.t. } x_1 + 2x_2 \le 15, \quad 1 \le x_i \le 10, \; i = 1, 2

Derive the conditions that a search direction \boldsymbol{S} must satisfy at the point \boldsymbol{x} = (1, 7) to be

a) a usable direction (\boldsymbol{S}^T \nabla f < 0)
b) a feasible direction (\boldsymbol{S}^T \nabla g_j < 0)

Solution

One such direction is \boldsymbol{S} = (1, -1). Are there others?
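At x = (1, 7) the active constraints are x_1 + 2x_2 \le 15 (since 1 + 14 = 15) and the lower bound x_1 \ge 1. A quick numeric check of S = (1, -1), as a sketch with numpy:

```python
import numpy as np

x = np.array([1.0, 7.0])
S = np.array([1.0, -1.0])

grad_f = 2 * x                      # gradient of f = x1^2 + x2^2 at x

# Active constraints at x, written in the standard form g(x) <= 0:
# g1 = x1 + 2*x2 - 15   (active: 1 + 14 = 15)
# g2 = 1 - x1           (active: x1 = 1)
grad_g1 = np.array([1.0, 2.0])
grad_g2 = np.array([-1.0, 0.0])

print(S @ grad_f)    # -12.0 < 0 -> usable (f decreases along S)
print(S @ grad_g1)   # -1.0  < 0 -> feasible w.r.t. g1
print(S @ grad_g2)   # -1.0  < 0 -> feasible w.r.t. g2
```

Any S with S^T(2, 14) < 0, S^T(1, 2) < 0, and S_1 > 0 qualifies, so S = (1, -1) is one of infinitely many such directions.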

Multivariable Optimization with Mixed Constraints

Consider the general problem with both equality and inequality constraints:

\min_{\boldsymbol{x}} f(\boldsymbol{x})
\text{s.t. } h_k(\boldsymbol{x}) = 0, \quad k = 1, 2, \ldots, p
\quad\;\; g_j(\boldsymbol{x}) \le 0, \quad j = 1, 2, \ldots, m

Karush-Kuhn-Tucker (KKT) conditions

The necessary conditions to be satisfied at a relative minimum of f are:

\frac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} - \sum_{k=1}^{p} \beta_k \frac{\partial h_k}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n

\lambda_j g_j = 0, \quad j = 1, 2, \ldots, m

g_j(\boldsymbol{x}) \le 0, \quad j = 1, 2, \ldots, m

h_k(\boldsymbol{x}) = 0, \quad k = 1, 2, \ldots, p

\lambda_j \ge 0, \quad j = 1, 2, \ldots, m

Constraint Qualification

Theorem

Let \boldsymbol{x}^* be a local minimum of the mixed-constraints problem. If \nabla g_j(\boldsymbol{x}^*), j \in J_1, and \nabla h_k(\boldsymbol{x}^*), k = 1, 2, \ldots, p, are linearly independent, then there exist \boldsymbol{\lambda}^* and \boldsymbol{\beta}^* such that (\boldsymbol{x}^*, \boldsymbol{\lambda}^*, \boldsymbol{\beta}^*) satisfies:

\frac{\partial f}{\partial x_i} + \sum_{j=1}^{m} \lambda_j \frac{\partial g_j}{\partial x_i} - \sum_{k=1}^{p} \beta_k \frac{\partial h_k}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n

\lambda_j g_j = 0, \quad j = 1, 2, \ldots, m

g_j(\boldsymbol{x}) \le 0, \quad j = 1, 2, \ldots, m

h_k(\boldsymbol{x}) = 0, \quad k = 1, 2, \ldots, p

\lambda_j \ge 0, \quad j = 1, 2, \ldots, m

Notes

• The requirement that \nabla g_j(\boldsymbol{x}^*), j \in J_1, and \nabla h_k(\boldsymbol{x}^*), k = 1, 2, \ldots, p, be linearly independent is called the constraint qualification.

• If the constraint qualification holds at a local minimum \boldsymbol{x}^*, then there exist (\boldsymbol{\lambda}^*, \boldsymbol{\beta}^*) such that (\boldsymbol{x}^*, \boldsymbol{\lambda}^*, \boldsymbol{\beta}^*) satisfies the KKT conditions; the converse, however, may not be true.

• The constraint qualification is always satisfied for problems in which:
  – all the inequality and equality constraint functions are linear, or
  – all the inequality constraint functions are convex and all the equality constraint functions are linear.

Example

Consider the following optimization problem:

\min_{\boldsymbol{x}} f(x_1, x_2) = (x_1 - 1)^2 + x_2^2
\text{s.t. } x_1^3 + 2x_2 \le 0
\quad\;\; x_1^3 - 2x_2 \le 0

Check whether the KKT conditions are satisfied at the optimum point.

Solution

The optimum is at (0, 0), yet the KKT conditions cannot be satisfied there: the constraint qualification is violated, since \nabla g_1(0, 0) = (0, 2) and \nabla g_2(0, 0) = (0, -2) are linearly dependent.
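The failure can be confirmed directly: at (0, 0) the stationarity equations have no solution in the multipliers, and the active-constraint gradients are dependent. A minimal sketch using sympy:

```python
import sympy as sp

x1, x2, l1, l2 = sp.symbols("x1 x2 l1 l2", real=True)

f = (x1 - 1)**2 + x2**2
g1 = x1**3 + 2*x2
g2 = x1**3 - 2*x2
L = f + l1*g1 + l2*g2

# KKT stationarity evaluated at the candidate point (0, 0)
stat = [sp.diff(L, v).subs({x1: 0, x2: 0}) for v in (x1, x2)]
print(stat)                      # [-2, 2*l1 - 2*l2]: first equation is -2 = 0
print(sp.solve(stat, [l1, l2]))  # [] -> no multipliers exist

# Constraint qualification: gradients of the active constraints at (0, 0)
G = sp.Matrix([[sp.diff(g, v).subs({x1: 0, x2: 0}) for v in (x1, x2)]
               for g in (g1, g2)])
print(G.rank())                  # 1 < 2 -> linearly dependent gradients
```

The first stationarity equation reduces to -2 = 0, which no choice of multipliers can fix, even though (0, 0) is the constrained minimum.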
