
Optimum Design with MATLAB

Cheng-Liang Chen
PSE Laboratory
Department of Chemical Engineering
National Taiwan University
Chen CL 1
Optimization Toolbox
[x, FunValue, ExitFlag, Output] = fminX(ObjFun,...,options)
Type of Problem            Formulation                                       Function
Unconstrained Min (1-var)  Find x in [x_l, x_u] to minimize f(x)             fminbnd
Unconstrained Min          Find x to minimize f(x)                           fminunc, fminsearch
Constrained Min            Find x to minimize f(x)                           fmincon
                           s.t. Nx = e, Ax <= b
                                h_j(x) = 0,  j = 1, ..., p
                                g_k(x) <= 0, k = 1, ..., m
                                x_il <= x_i <= x_iu
Linear Programming         Find x to minimize f(x) = c^T x                   linprog
                           s.t. Nx = e, Ax <= b
Quadratic Programming      Find x to minimize f(x) = c^T x + (1/2) x^T H x   quadprog
                           s.t. Nx = e, Ax <= b
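A small sketch (not from the original slides) of the last two table entries, using the standard toolbox call signatures linprog(f,A,b,Aeq,beq,lb,ub) and quadprog(H,c,A,b); the problem data here are made up for illustration:

```matlab
% LP: min -x1 - 2*x2   s.t. x1 + x2 <= 4, x1 <= 3, x >= 0
f = [-1; -2]; A = [1 1; 1 0]; b = [4; 3]; Lb = [0; 0];
xlp = linprog(f, A, b, [], [], Lb, [])   % optimum at x = [0; 4]

% QP: min (1/2) x'Hx + c'x   s.t. x1 + x2 <= 2
H = [2 0; 0 2]; c = [-2; -5]; A = [1 1]; b = 2;
xqp = quadprog(H, c, A, b)               % optimum at x = [0.25; 1.75]
```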
Explanation of Output
from Optimization Function

Argument  Description
x         The solution vector or matrix found by the optimization function.
          If ExitFlag > 0, then x is a solution; otherwise x is the latest value
          from the optimization routine.
FunValue  Value of the objective function, ObjFun, at the solution x.
ExitFlag  The exit condition for the optimization function. If ExitFlag is
          positive, the optimization routine converged to a solution x. If
          ExitFlag is zero, the maximum number of function evaluations
          was reached. If ExitFlag is negative, the optimization routine
          did not converge to a solution.
Output    The output structure contains several pieces of information about
          the optimization process, such as the number of iterations
          (Output.iterations), the number of function evaluations
          (Output.funcCount), and the name of the algorithm used to solve
          the problem (Output.algorithm).
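A minimal sketch (not on the original slides) of acting on these outputs, using an anonymous-function version of the single-variable example that follows:

```matlab
[x, FunVal, ExitFlag, Output] = fminbnd(@(x) 2 - 4*x + exp(x), -10, 10);
if ExitFlag > 0
    fprintf('Converged: x = %g after %d iterations\n', x, Output.iterations);
elseif ExitFlag == 0
    warning('Maximum number of function evaluations reached.');
else
    warning('Routine did not converge; x holds the latest iterate.');
end
```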
Single-Variable Unconstrained Minimization
Find x to minimize f(x) = 2 - 4x + e^x,   -10 <= x <= 10.
% File Name: Example12_1.m
% Problem: min f(x) = 2 - 4x + exp(x)
clear all
Lb = -10; Ub = 10;
[x, FunVal, ExitFlag, Output] ...
    = fminbnd(@ObjFunction12_1, Lb, Ub)

% File Name: ObjFunction12_1.m
% Objective: f(x) = 2 - 4x + exp(x)
function f = ObjFunction12_1(x)
f = 2 - 4*x + exp(x);
x =
1.3863
FunVal =
0.4548
ExitFlag =
1
Output =
iterations: 14
funcCount: 16
algorithm:[1x46 char]
Output.algorithm
ans =
golden section search,
parabolic interpolation
Multi-Variable Unconstrained Minimization
Find x to minimize
    f(x) = 100(x_2 - x_1^2)^2 + (1 - x_1)^2,   x^(0) = (-1.2, 1.0)
% File Name: ObjAndGrad12_2.m (Rosenbrock Valley Function)
function [f, df] = ObjAndGrad12_2(x)
x1 = x(1); x2 = x(2);
f = 100*(x2-x1^2)^2+(1-x1)^2;
df(1)= -400*(x2-x1^2)*x1-2*(1-x1);
df(2)= 200*(x2-x1^2);
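A quick sketch (not on the original slides) for sanity-checking the analytic gradient above against a forward-difference estimate before handing it to fminunc:

```matlab
% Compare analytic gradient with forward differences at x0 = (-1.2, 1.0)
x = [-1.2 1.0]; h = 1e-6;
[f, df] = ObjAndGrad12_2(x);
dfd = zeros(1,2);
for i = 1:2
    xp = x; xp(i) = xp(i) + h;
    dfd(i) = (ObjAndGrad12_2(xp) - f)/h;  % finite-difference estimate
end
disp([df; dfd])  % rows should agree to about four significant digits
```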
% File Name: Example12_2.m
% Problem: min Rosenbrock Valley Function
% f(x) = 100(x_2-x_1^2)^2 + (1-x_1)^2, x^{(0)} = (-1.2, 1.0)
clear all
x0 = [-1.2 1.0];

% 1. Nelder-Mead Simplex Method, fminsearch
options = optimset('LargeScale','off','MaxFunEvals',300);
[x1, FunValue1, ExitFlag1, Output1] = ...
    fminsearch(@ObjAndGrad12_2, x0, options)

% 2. BFGS Method, fminunc, default option
options = optimset('LargeScale','off','MaxFunEvals',300,...
    'GradObj','on');
[x2, FunValue2, ExitFlag2, Output2] = ...
    fminunc(@ObjAndGrad12_2, x0, options)
% File Name: Example12_2.m
% Problem: min Rosenbrock Valley Function
% f(x) = 100(x_2-x_1^2)^2 + (1-x_1)^2, x^{(0)} = (-1.2, 1.0)
% (continued)

% 3. DFP Method, fminunc, HessUpdate = dfp
options = optimset('LargeScale','off','MaxFunEvals',300,...
    'GradObj','on','HessUpdate','dfp');
[x3, FunValue3, ExitFlag3, Output3] = ...
    fminunc(@ObjAndGrad12_2, x0, options)

% 4. Steepest descent, fminunc, HessUpdate = steepdesc
options = optimset('LargeScale','off','MaxFunEvals',3000,...
    'GradObj','on','HessUpdate','steepdesc');
[x4, FunValue4, ExitFlag4, Output4] = ...
    fminunc(@ObjAndGrad12_2, x0, options)
x1 =
1.0000
1.0000
FunValue1 =
8.1777e-010
ExitFlag1 =
1
Output1 =
iterations: 85
funcCount: 159
algorithm: [1x33 char]
Output1.algorithm
ans =
Nelder-Mead simplex direct search
x2 =
1.0000
1.0000
FunValue2 =
5.4185e-011
ExitFlag2 =
1
Output2 =
iterations: 26
funcCount: 111
stepsize: 1.2994
firstorderopt: 2.8785e-004
algorithm: [1x38 char]
Output2.algorithm
ans =
medium-scale: Quasi-Newton line search
x3 =
1.0000
1.0000
FunValue3 =
3.8112e-011
ExitFlag3 =
1
Output3 =
iterations: 27
funcCount: 124
stepsize: 2.4667
firstorderopt: 3.4852e-004
algorithm: [1x38 char]
Output3.algorithm
ans =
medium-scale: Quasi-Newton line search
x4 =
0.8848
0.7819
FunValue4 =
0.0134
ExitFlag4 =
0
Output4 =
iterations: 401
funcCount: 1791
stepsize: 0.0103
firstorderopt: 1.6641
algorithm: [1x38 char]
Output4.algorithm
ans =
medium-scale: Quasi-Newton line search
Multi-Variable Constrained Minimization
min f(x) = (x_1 - 10)^3 + (x_2 - 20)^3,   x^(0) = (20.1, 5.84)
s.t. g_1(x) = 100 - (x_1 - 5)^2 - (x_2 - 5)^2 <= 0
     g_2(x) = -82.81 + (x_1 - 6)^2 + (x_2 - 5)^2 <= 0
     13 <= x_1 <= 100,   0 <= x_2 <= 100
% File Name: ObjAndGrad12_3.m
function [f, gf] = ObjAndGrad12_3(x)
x1 = x(1); x2 = x(2);
f = (x1-10)^3+(x2-20)^3;
if nargout > 1
gf(1,1) = 3*(x1-10)^2;
gf(2,1) = 3*(x2-20)^2;
end
% File Name: ConstAndGrad12_3.m
function [g, h, gg, gh] = ConstAndGrad12_3(x)
x1 = x(1); x2 = x(2);
g(1) = 100 - (x1-5)^2 - (x2-5)^2;
g(2) = -82.81 + (x1-6)^2 + (x2-5)^2;
h = [];                        % no equality constraints
if nargout > 2                 % column j of gg is the gradient of g(j)
    gg(1,1) = -2*(x1-5);
    gg(2,1) = -2*(x2-5);
    gg(1,2) = 2*(x1-6);
    gg(2,2) = 2*(x2-5);
    gh = [];
end
% File Name: Example12_3.m
% Constrained minimization with gradients available
clear all
options = optimset('LargeScale','off','GradObj','on',...
    'GradConstr','on','TolCon',1e-8,'TolX',1e-8);
Lb = [13; 0]; Ub = [100; 100]; x0 = [20.1; 5.84];
[x, FunValue, ExitFlag, Output] = ...
    fmincon(@ObjAndGrad12_3, x0, [],[],[],[],...
    Lb, Ub, @ConstAndGrad12_3, options)
x =
14.0950
0.8430
FunValue =
-6.9618e+003
ExitFlag =
1
Output =
iterations: 6
funcCount: 13
stepsize: 1
algorithm: [1x44 char]
firstorderopt: 4.3682e-012
cgiterations: []
Output.algorithm
ans =
medium-scale: SQP, Quasi-Newton, line-search
