
Optimization Methods

Lecture 4

L4.2

Overview
Optimization Definition
Performance Criterion
Constraints & Penalties
Numeric Techniques
Exploratory Optimization Techniques
Genetic Algorithms

Engineering Data Mining


L4.3

Optimization Definition
Optimize: "To make as perfect, effective, or functional as possible." (Webster's New Collegiate Dictionary)

In the most general terms, optimization theory is a body of mathematical results and numerical methods for finding and identifying the best candidate from a collection of alternatives, without having to explicitly enumerate and evaluate all possible alternatives. (Engineering Optimization Methods, Reklaitis, Ravindran & Ragsdell)

L4.4

Optimization Terminology

Design Variables (Independent variables)

Distinguish between variables that can change and those that are fixed:

Perturbed parameters are called design variables.

Fixed parameters can be considered potential variables.

Objective Function

The function of the design variables that is to be minimized (or maximized).

Usually consists of outputs, but can include inputs.

Constraints

Boundaries on the optimization problem:

Inequality Constraints - One-sided conditions that must be met for the design to be acceptable.

Equality Constraints - Precise conditions that must be satisfied for the design to be acceptable.

Side Constraints - Lower/upper bounds on the design variables that define the design space.

L4.5

Optimization Terminology

Unconstrained Optimization - Neither equality nor inequality constraints are present. (Usually side constraints are not counted as inequality constraints.)

Constrained Optimization - At least one equality or inequality constraint is present.

Feasible Design - A design that satisfies all constraints.

Infeasible Design - A design that violates one or more constraints.

Optimum Design - The minimum (maximum) objective satisfying all constraints.

Active Constraint - The design lies on a constraint boundary. (Some algorithms allow a tolerance.)

Violated Constraint - The design lies on the infeasible side of a constraint boundary.

L4.6

Optimization Design Space Example

Cantilevered beam with loads at the free end, sized by beam height and flange width.

Design Variables:
  10 mm <= Beam Height <= 80 mm
  10 mm <= Flange Width <= 50 mm

Constraint:
  Stress <= 16 MPa

Objective:
  Minimize Mass

Solution (marked X in the figure):
  Beam Height = 38.4 mm
  Flange Width = 22.7 mm
  Stress = 16 (constraint active)
  Mass = 233.4

[Figure: design space plotted as Flange Width (10-50 mm) vs. Beam Height (10-80 mm), showing the feasible design space bounded by the Stress = 16 curve, mass contours at Mass = 300 and Mass = 400, and the optimum X.]

Defining the Optimization Problem

L4.8

Optimization Problem Formulation


Elements of Optimization Problem Formulation:

1. System Model: What simulation predicts the behavior of the design?

2. Independent Variables: What design values can be modified?

3. System Boundaries: What limits or requirements must be satisfied or obeyed?

4. Performance Criterion: What performance are you looking to achieve?

Mathematical Formulation:

Objective: Minimize F(x)

Subject To:
  Equality Constraints:   h_k(x) = 0,  k = 1, ..., K
  Inequality Constraints: g_j(x) <= 0,  j = 1, ..., J
  Side Constraints:       x_i(L) <= x_i <= x_i(U),  i = 1, ..., N
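To make the formulation concrete, here is a minimal sketch of this standard form using SciPy's SLSQP solver (an SQP method in the same family as the NLPQL technique covered later, not Isight's own solver); the specific objective, constraints, and bounds are invented for illustration:

    from scipy.optimize import minimize

    def F(x):  # objective to minimize
        return (x[0] - 5.0)**2 + (x[1] - 6.0)**2

    # SciPy's "ineq" convention is fun(x) >= 0, so g(x) <= 0 is passed as -g(x) >= 0.
    constraints = [
        {"type": "eq",   "fun": lambda x: x[0] + x[1] - 10.0},    # h1(x) = 0
        {"type": "ineq", "fun": lambda x: -(x[0] - x[1] - 2.0)},  # g1(x) <= 0
    ]
    bounds = [(0.0, 8.0), (0.0, 8.0)]  # side constraints x_i(L) <= x_i <= x_i(U)

    result = minimize(F, x0=[1.0, 1.0], method="SLSQP",
                      bounds=bounds, constraints=constraints)
    print(result.x, result.fun)  # approx [4.5, 5.5], F = 0.5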


L4.9

Performance Criterion - Objective

The weighted sum of the current values of all parameters designated by the user as objectives, according to the formula:

Objective = Sum_i (W_i * X_i / SF_i)

where W_i is the weighting factor (numerator) and SF_i is the scale factor (denominator).

The optimization focuses on minimizing the value of the objective function.

If a parameter is to be maximized, the negative of the parameter is used in the objective function.

Example: Maximize Velocity

Velocity = 3; W = 1.0; SF = 1.0
Objective function = -1.0 * (3) / 1.0 = -3

Velocity = 55; W = 1.0; SF = 1.0
Objective function = -1.0 * (55) / 1.0 = -55
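This bookkeeping is easy to express in a few lines. The following is a hypothetical helper, not Isight's API; maximized parameters enter with a negated term:

    def weighted_objective(terms):
        """terms: (value, weight, scale_factor, direction) tuples,
        with direction +1 to minimize and -1 to maximize."""
        return sum(d * w * x / sf for x, w, sf, d in terms)

    # Maximize Velocity example from this slide (W = 1.0, SF = 1.0):
    print(weighted_objective([(3.0, 1.0, 1.0, -1)]))   # -3.0
    print(weighted_objective([(55.0, 1.0, 1.0, -1)]))  # -55.0 (lower objective = faster)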

L4.10

Why Both Scale Factor and Weight?

We cannot answer the following question: Is 1 kg bigger than 1 dollar?

Scale factors use the same units as their parameters and are usually set so that the scaled terms have similar magnitudes.

Weight is a dimensionless coefficient that defines the importance of an objective.

Example:

Goal: Minimize Mass and maximize Efficiency

Mass = 640 kg; Scale Factor = 1000 kg; Weight = 2.0

Efficiency = 0.83; Scale Factor = 1.0; Weight = 1.0

Dimensionless Objective = Sum_i (W_i * X_i / SF_i)
  = (2 * 640) / 1000 - (1 * 0.83) / 1.0
  = 0.45
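The same arithmetic can be checked in a couple of lines (values taken from the example above; the maximized term enters negated):

    mass_term = (2.0 * 640.0) / 1000.0     # +1.28 (Mass is minimized)
    efficiency_term = (1.0 * 0.83) / 1.0   # 0.83 (Efficiency is maximized, so negated)
    print(mass_term - efficiency_term)     # 0.45, matching the slide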

L4.11

System Boundaries - Penalties/Feasibilities

Constrained vs. unconstrained domains:

Design variables - define the region of interest.

Outputs - define the design requirements.

Feasible vs. infeasible:

Constraint violations - apply a penalty; the design is infeasible.

Failed designs.

[Figure: design space in (X1, X2) showing the initial design, the optimization constraint boundary, and the feasible (safe) vs. infeasible (failed) regions.]

L4.12

Isight Constraints and Violations


Isight constraint conditions:

Equality constraint: (h_k(x) - Target) * W_k / SF_k = 0
Lower-bound inequality constraint: (LB - g_j(x)) * W_j / SF_j <= 0
Upper-bound inequality constraint: (g_j(x) - UB) * W_j / SF_j <= 0

Constraint violations:

Equality constraint violation: (h_k(x) - Target) * W_k / SF_k
Lower-bound inequality constraint violation: (LB - g_j(x)) * W_j / SF_j
Upper-bound inequality constraint violation: (g_j(x) - UB) * W_j / SF_j
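These scaled measures translate directly into code. The helpers below are a sketch (hypothetical names, not Isight's API); values <= 0 mean the inequality is satisfied:

    def eq_violation(h, target, w=1.0, sf=1.0):
        return (h - target) * w / sf   # zero when the equality is met

    def lb_violation(g, lb, w=1.0, sf=1.0):
        return (lb - g) * w / sf       # <= 0 when g >= LB

    def ub_violation(g, ub, w=1.0, sf=1.0):
        return (g - ub) * w / sf       # <= 0 when g <= UB

    print(ub_violation(g=18.0, ub=16.0))  # 2.0: the upper bound is violated by 2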

L4.13

Isight Penalties
Isight calculates a penalty when a model's output violates a constraint.

The penalty function is:

Penalty = base + multiplier * (violation^exponent)

The default values are base = 10, multiplier = 1000, and exponent = 2, giving:

Penalty = 10 + 1000 * (violation^2)

APIs are available to change the penalty base values.
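As a one-function sketch with the default values quoted above (assuming, as the slide states, that the penalty applies only when a constraint is actually violated):

    def penalty(violation, base=10.0, multiplier=1000.0, exponent=2.0):
        return base + multiplier * violation**exponent if violation > 0 else 0.0

    print(penalty(0.0))  # 0.0 - feasible, no penalty
    print(penalty(0.6))  # 370.0 = 10 + 1000 * 0.36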

L4.14

Objective and Penalty


The sum of the objective and penalty function values for a particular design point. This parameter is used by Isight to compare designs; it is not used by the optimization techniques themselves.

ObjectiveAndPenalty = Objective + Penalty

Example:
Objective = 0.45
Penalty = 730.0
ObjectiveAndPenalty = 0.45 + 730.0 = 730.45

L4.15

Isight Internal Parameter: Feasibility

An integer indicating the feasibility of the current design point, evaluated with respect to previous points. Higher numbers are better results.

Feasibility | Meaning
1           | infeasible (constraint violations)
2           | hard infeasible tie
3           | hard infeasible better
4-6         | values not used for feasibility
7           | feasible (constraints met)
8           | feasible tie (as good as previous best)
9           | feasible better (best so far)

L4.16

Feasibility: How it works

Task: minimize Y, subject to Y > Ymin.

[Figure: objective Y plotted against run number, with the infeasible area Y < Ymin shaded. Each run is annotated with its feasibility value, e.g. 9 (feasible better) for runs that improve on the best feasible design so far, 7-8 (feasible/feasible tie) for other feasible runs, and 1-3 (infeasible) for runs below Ymin.]

Searching The Design Space

L4.18

Optimization Technique Classification


In general, optimization techniques can be grouped into three categories:

Gradient Techniques

Generally assume the design space is continuous and unimodal

Fast and efficient local search ("hill climbers")

Require accurate first derivatives of f(x); 2nd-order techniques also require second derivatives

Direct Methods

Use only function evaluations

Typically evaluate a performance index in some pattern around the base point

Require only one function value to proceed

Exploratory Techniques

Look for the global optimum

Computationally expensive

L4.19

Optimization: Handling of Constraints


Constrained methods

Deal with constraints directly during the numerical search process

Use two functions:

Objective function for performance

Penalty function for constraints

Many methods focus on attaining a feasible design, or minimizing the penalty function, before minimizing the objective function

Techniques (gradient): NLPQL, LSGRG, MMFD

Minimize: F(x)
Subject to:
  h_k(x) = 0,  k = 1, ..., K
  g_j(x) <= 0,  j = 1, ..., J
  x_i(L) <= x_i <= x_i(U),  i = 1, ..., N

L4.20

Optimization: Handling of Constraints

Unconstrained or Penalty Methods

Use one function, formed by adding a penalty term to the objective function

Convert a constrained optimization problem into an unconstrained problem

The resulting single function is minimized to find the best performance and feasibility

Techniques (exploratory):

Neighborhood Cultivation Genetic Algorithm (NCGA)

Non-dominated Sorting Genetic Algorithm (NSGA-II)

Multi-Island Genetic Algorithm (MIGA)

Adaptive Simulated Annealing (ASA)

Techniques (direct):

Downhill Simplex

Hooke-Jeeves

Numeric Techniques
Gradient Methodology

L4.22

Gradient Methods - Bouncing Ball Analogy

Drop a ball onto a continuous design space: the ball bounces a few times and finally settles at the bottom of a valley.

Gradient-based techniques behave like the ball:

The bouncing follows the steepest direction (based on the gradient at the contact point)

They tend to settle in a nearby valley (highly dependent on the starting point)

They will not explore other valleys

They may get stuck on a plateau or in a local valley

L4.23

Gradient Methodology
1 Iteration = 1 Gradient Calculation + 1-D Search

Main assumption: the local design space is continuous and convex.

Gradients are calculated around a baseline point.

The gradient defines the 1-D search direction: the direction of steepest descent.

Gradient and 1-D search calculations continue until a minimum is found, convergence is detected, or the maximum number of iterations is reached.

[Figure: contour plot showing successive search directions S1, S2, S3 stepping from the baseline point toward the minimum.]
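A minimal sketch of this loop, using forward-difference gradients and a crude backtracking 1-D search (illustrative only; Isight's gradient optimizers are far more sophisticated):

    import numpy as np

    def grad_fd(f, x, h=1e-3):
        """Forward-difference gradient (the 'relative step' of later slides)."""
        fx, g = f(x), np.zeros_like(x)
        for i in range(len(x)):
            xp = x.copy(); xp[i] += h
            g[i] = (f(xp) - fx) / h
        return g

    def steepest_descent(f, x, iters=20):
        for _ in range(iters):
            d = -grad_fd(f, x)        # steepest-descent search direction
            t = 1.0                   # crude backtracking 1-D search along d
            while f(x + t * d) >= f(x) and t > 1e-12:
                t *= 0.5
            x = x + t * d
        return x

    f = lambda x: (x[0] - 5)**2 + (x[1] - 6)**2  # paraboloid used two slides ahead
    print(steepest_descent(f, np.array([9.0, 2.0])))  # approx [5, 6]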

L4.24

Gradient-Based Algorithms in Isight

Sequential Quadratic Programming (NLPQL)

Large-Scale Generalized Reduced Gradient (LSGRG)

Modified Method of Feasible Directions (MMFD)

L4.25

Gradient Optimization Example - MMFD

Minimize Y = (X1 - 5)^2 + (X2 - 6)^2 using the MMFD gradient technique:

Dropped at (9, 2)

Settled at the bottom (5, 6)

MMFD finds the local minimum.

L4.26

Gradient Optimization - Rosenbrock Function

Z = 100*(Y - X^2)^2 + (1 - X)^2

Local value at [-1, 1]: Z = 4

Local value at [0, 0]: Z = 1

Global minimum: [1, 1], Z = 0

Two LSGRG attempts from different initial points:

Attempt 1: initial point X = -1, Y = 1

Attempt 2: initial point X = 2, Y = 3

L4.27

Optimization Exercise - Rosenbrock Function

Attempt 1 (X = -1.0, Y = 1.0):
Best design point: X = 0.81, Y = 0.66
Objective = 0.033
Rests in a local optimum

Attempt 2 (X = 2.0, Y = 3.0):
Best design point: X = 0.99, Y = 0.99
Objective = 1.2E-7
Finds the global optimum
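The two attempts can be approximated with SciPy; LSGRG is proprietary, so SLSQP stands in here, and the best points found depend on the optimizer and its termination settings (it may not reproduce the stall seen in attempt 1):

    from scipy.optimize import minimize

    def rosen(p):
        return 100.0 * (p[1] - p[0]**2)**2 + (1.0 - p[0])**2

    for x0 in ([-1.0, 1.0], [2.0, 3.0]):   # the slide's two starting points
        r = minimize(rosen, x0, method="SLSQP")
        print(x0, "->", r.x.round(3), "objective =", round(r.fun, 8))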

L4.28

Gradient-Based Algorithms: Advantages

Exploit the local area around a design point

If the landscape is continuous and unimodal, the algorithm efficiently moves in the direction of steepest descent

General applicability to engineering designs

Sound mathematical basis, i.e., proofs have been developed for the convergence of specific techniques under given conditions

Gradient calculations can be performed in parallel

L4.29

Gradient-Based Algorithms: Disadvantages

Do not take advantage of engineering expertise or domain knowledge

Very dependent on the starting point

If analytic gradients are not available, finite differencing must be used

Gradient computation effort increases with the dimensionality of the problem and the duration of each exact analysis

Likely to get stuck around local minima

L4.30

NLPQL: Basic Tuning Parameters

Max Number of Iterations

Must be >= 1

Default = 10

Relative Step Size

AKA the finite difference step

Must be > 0.0

Default = 0.001

Termination Accuracy

Changes smaller than this number are not considered improvements: |F(x_q) - F(x_q-1)| <= Tolerance

Default = 1.0E-6

Minimum Absolute Step Size

If the finite difference step is reduced below this value, the optimization step is terminated

Must be > 0.0

Default = 0.0001

L4.31

MMFD: Basic Tuning Parameters

Max Number of Iterations

Must be >= 1

Default = 40

Relative Gradient Step

AKA the finite difference step

Must be > 0.0

Default = 0.01 (1 percent)

Minimum Absolute Gradient Step

Minimum absolute finite difference step

Must be > 0.0

Default = 0.001

Absolute Objective Convergence: |F(x_q) - F(x_q-1)| <= Tolerance

Default = 0.001

L4.32

LSGRG: Basic Tuning Parameters

Max Number of Iterations

Must be >= 1

Default = 10

Relative Step Size

AKA the finite difference step

Must be > 0.0

Default = 0.001

Convergence Epsilon

If the finite difference step is reduced below this value, the optimization step is terminated

Default = 0.001

Numeric Techniques
Direct Methodology

L4.34

Direct Methodology

Use only direct function evaluations

Require only one function value to proceed (no gradients)

Direct pattern methods evaluate a performance index in some pattern around the base point:

Select a base point

Evaluate a performance index in some pattern around the base point

Isight has one direct penalty pattern-search numerical technique: the Hooke-Jeeves pattern search.

L4.35

Hooke-Jeeves Method - Pattern Move Phase

Minimize Y = (X1 - 5)^2 + (X2 - 6)^2, dropped at (9, 2) and settled at the bottom (5, 6).

Relative Step Size = 0.8, Reduction Factor = 0.8: more evaluations, but the search may be able to jump out of local minima before the step reduction occurs.

Relative Step Size = 0.5, Reduction Factor = 0.5: quicker, but will fall into local minima.

L4.36

Hooke-Jeeves Basic Tuning

Max Number of Iterations:

A Hooke-Jeeves iteration contains evaluations from both the technique's exploratory moves and its pattern move. A single iteration involves several task process evaluations, the number depending on the number of design variables; one iteration does not equal one task process run. The default value is 10.

Max Relative Step Size:

The step size taken in the exploratory phase.

Step Size Reduction Factor:

The percentage by which the relative step size is reduced. This occurs when exploratory steps fail to find a better design (see the sketch below).

Max Number of Evaluations:

Termination criterion: the maximum number of task process evaluations. The default is 100.
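A minimal Hooke-Jeeves sketch that ties these parameters together with exploratory moves, a pattern move, and step reduction (an illustrative toy, not Isight's implementation; the defaults echo the slides above):

    import numpy as np

    def hooke_jeeves(f, x0, step=0.8, reduction=0.8, min_step=1e-6, max_evals=100):
        evals = 0
        def fe(x):
            nonlocal evals; evals += 1
            return f(x)
        base = np.array(x0, float); fbase = fe(base)
        while step > min_step and evals < max_evals:
            x, fx = base.copy(), fbase           # exploratory moves around the base
            for i in range(len(x)):
                for d in (step, -step):
                    trial = x.copy(); trial[i] += d
                    ft = fe(trial)
                    if ft < fx:
                        x, fx = trial, ft
                        break
            if fx < fbase:
                pattern = x + (x - base)         # pattern move past the new point
                base, fbase = x, fx
                fp = fe(pattern)
                if fp < fbase:
                    base, fbase = pattern, fp
            else:
                step *= reduction                # exploration failed: reduce the step
        return base, fbase

    print(hooke_jeeves(lambda x: (x[0]-5)**2 + (x[1]-6)**2, [9.0, 2.0]))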

L4.37

Downhill Simplex Optimization Algorithm

An exploratory method, very popular in chemical engineering, fluids, and dynamics

The Nelder and Mead method, modified from Numerical Recipes

Geometric method: a mixture of DOE and optimization

A very good optimizer for a moderate number of design variables (10-50) and moderately non-linear problems

Starts with a group of points and moves the worst point in the direction of the center of the better points until the optimum is found
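Isight's implementation is not public, but SciPy's Nelder-Mead gives a quick feel for the method on the running example problem:

    from scipy.optimize import minimize

    f = lambda x: (x[0] - 5)**2 + (x[1] - 6)**2
    r = minimize(f, x0=[9.0, 2.0], method="Nelder-Mead")
    print(r.x, r.nfev)  # approx [5, 6]; the worst vertex is repeatedly moved toward the better ones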

L4.38

Nelder & Mead Downhill Simplex

[Figure: successive simplex reflections converging on the optimum.]

L4.39

HJ and Downhill Simplex - Advantages

Work with all parameter types: real, integer, and discrete

Efficiently exploit the local area around a design point

Large step sizes in the exploratory phase allow these algorithms to search a wider area than gradient techniques

General applicability to engineering designs

Sound mathematical basis, i.e., proofs have been developed for the convergence of specific techniques under given conditions

L4.40

HJ and Downhill Simplex - Disadvantages

Parallel execution is not available

Somewhat dependent on the starting point

Likely to get stuck around local minima

Usually require many function evaluations (executions), and are therefore not well suited to long-running codes

Do not take advantage of engineering expertise or domain knowledge

Exploratory Optimization Techniques

L4.42

Exploratory Optimization Techniques

Exploratory techniques

Look for global optimum

Computationally expensive

Exploratory algorithms in Isight include:

Genetic Algorithms

mimic the process of natural selection

select a population size and a number of generations

Adaptive Simulated Annealing

mimics the annealing of metals during the cooling process

Exploratory Techniques
Adaptive Simulated Annealing

L4.44

Simulated Annealing Technique

Simulated Annealing (SA) is a controlled random exploration technique.

To continue the bouncing ball analogy, imagine that the ball has a non-smooth surface, so that the bouncing becomes randomized (not following the steepest direction). The ball may bounce a few more times before it settles at the bottom of a valley. This random jumping process is very similar to the ASA algorithm.

Why the random jumping?

The random bouncing helps the ball jump to the other side of a hill and lets it explore other valleys. This makes SA less sensitive to the initial dropping point than gradient-based techniques are.

[Flowchart: C = initial design; Cnew = perturb C (within radius r); if Cnew is better than C, update the design (C = Cnew); if not, still accept it a small fraction of the time (roughly 5%, rejecting the other 95%); reduce r as the search proceeds.]

L4.45

Adaptive Simulated Annealing

The working principle of simulated annealing is borrowed from metallurgy:

A piece of metal is heated (the atoms are given thermal agitation), and then the metal is left to cool slowly.

The slow, regular cooling allows the atoms to slide progressively into their most stable ("minimal energy") positions. (Rapid cooling would have "frozen" them in whatever positions they happened to occupy at the time.)

The resulting structure of the metal is stronger and more stable.

In terms of optimization: just as slow cooling minimizes the energy of a block of metal (maximizing its strength), the program minimizes the objective function. The global minimum corresponds to the ground state of the substance.

L4.46

Adaptive Simulated Annealing

Example: Imagine a mountain range defined by our objective-and-penalty function. Find the lowest valley.

L4.47

Adaptive Simulated Annealing


Initially, ASA starts at a high energy "temperature," where the temperature is an ASA parameter that mimics the effect of fast-moving particles in a hot object such as molten metal. This permits the ball to make very high bounces, so that it can bounce over any mountain and access any valley, given enough bounces.

L4.48

Adaptive Simulated Annealing


As the temperature is lowered, the ball cannot bounce as high, and it can settle and become trapped in relatively smaller ranges of valleys. At each temperature, the simulation must proceed long enough for the system to reach a steady state or equilibrium; this is known as thermalization. The sequence of temperatures and the number of iterations applied to thermalize the system at each temperature comprise an annealing schedule.

L4.49

Adaptive Simulated Annealing


To apply simulated annealing, the system is initialized with a particular configuration. A new configuration is constructed by imposing a random displacement. If the energy of this new state is lower than that of the previous one, the change is accepted unconditionally and the system is updated. If the energy is greater, the new configuration is accepted probabilistically. This is the Metropolis step, the fundamental procedure of simulated annealing. It allows the system to move consistently toward lower-energy states, yet still "jump" out of local minima through the probabilistic acceptance of some upward moves.
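A minimal Metropolis-style annealing loop makes this concrete. This is a sketch only: real ASA adapts its schedule and step sizes, whereas this toy simply cools a temperature T, and the multi-valley test function is an arbitrary choice:

    import math, random

    def anneal(f, x, step=1.0, t=1.0, cooling=0.95, iters=2000):
        fx = f(x)
        for _ in range(iters):
            xn = [xi + random.uniform(-step, step) for xi in x]       # random displacement
            fn = f(xn)
            if fn < fx or random.random() < math.exp((fx - fn) / t):  # Metropolis step
                x, fx = xn, fn
            t *= cooling                                              # annealing schedule
        return x, fx

    rastrigin = lambda p: p[0]**2 + 10 - 10 * math.cos(2 * math.pi * p[0])
    print(anneal(rastrigin, [4.0]))  # typically settles near the global minimum at x = 0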

L4.50

Comparing Optimization Techniques


Minimize Y = (X1 - 5)^2 + (X2 - 6)^2, dropped at (9, 2) and settled at the bottom (5, 6).

[Figure: search histories on this problem for the pattern method (Hooke-Jeeves), the gradient method (MMFD), and the exploratory method (simulated annealing).]
L4.51

ASA Tuning Parameters

Max Number of Iterations: the maximum number of generated designs; 1 iteration = 1 design point. The default is 10000 (this is quite high).

Relative Gradient Step: the gradient step used when re-annealing.

Note: all design variables must have upper and lower bounds. If not, ASA will generate designs using inputs ranging from -infinity to +infinity.

L4.52

ASA: Advantages & Disadvantages

Advantages

Can deal with arbitrary systems and cost functions

Statistically improves the chances of finding a global optimal solution

Generally provides a good solution

Fewer tuning parameters, and less difficult to control, compared with genetic algorithms

No assumptions regarding the continuity or convexity of the design landscape

Can move uphill as well as downhill, and thus away from local optima

Disadvantages

Repeated annealing is slow

Not the simplest or fastest optimization algorithm for problems where the energy landscape is smooth or where there are few local optima

CPU intensive

Cannot be run in parallel

Exploratory Techniques
Genetic Algorithms

L4.54

Genetic Algorithms

Genetic Algorithms (GAs) were inspired by Darwin's theory of natural selection and survival of the fittest.

Genetic algorithms mimic the way large populations solve problems over a long period of time, through processes such as reproduction, mutation, and natural selection.

A GA creates a population of candidate solutions to a particular problem, and through a process of random selection and variation, each generation improves upon the quality of the solution.

Consequently, genetic algorithms promote the evolution of solutions by using genetically based processes.

No restrictive assumptions are made about the continuity or convexity of the parameter space.

L4.55

Genetic Algorithms
Key Concept:

Genetic Algorithms are another random exploration technique (more random than SA).

A GA simulates nature's evolutionary process to find the most-fit (optimum) designs.

GAs are good at dealing with very irregular terrain, but are usually very computationally expensive.

Basic Genetic Algorithm (a minimal code sketch follows the steps below):

1. [Start] Generate a random population of n chromosomes (suitable solutions for the problem)

2. [Fitness] Evaluate the fitness f(x) of each chromosome x in the population

3. [New population] Create a new population by repeating the following steps until the new population is complete

4. [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance of being selected)

5. [Crossover] With a crossover probability, cross over the parents to form new offspring (children); if no crossover is performed, the offspring is an exact copy of the parents

6. [Mutation] With a mutation probability, mutate the new offspring at each locus (position in the chromosome)

7. [Accepting] Place the new offspring in the new population

8. [Replace] Use the newly generated population for a further run of the algorithm

9. [Test] If the end condition is satisfied, stop and return the best solution in the current population

10. [Loop] Go to step 2
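A compact sketch of steps 1-10 for a minimization problem. Tournament selection stands in for fitness-proportional selection, and all parameter values are illustrative, not Isight defaults:

    import random

    def ga(fitness, n_vars, bounds, pop_size=10, generations=10,
           p_crossover=0.9, p_mutation=0.05):
        lo, hi = bounds
        pop = [[random.uniform(lo, hi) for _ in range(n_vars)]       # 1. random population
               for _ in range(pop_size)]
        for _ in range(generations):                                 # 10. loop
            new_pop = []
            while len(new_pop) < pop_size:                           # 3. new population
                p1, p2 = (min(random.sample(pop, 3), key=fitness)    # 2./4. fitness + selection
                          for _ in range(2))
                child = list(p1)
                if random.random() < p_crossover and n_vars > 1:     # 5. crossover
                    cut = random.randrange(1, n_vars)
                    child = list(p1[:cut]) + list(p2[cut:])
                for i in range(n_vars):                              # 6. mutation at each locus
                    if random.random() < p_mutation:
                        child[i] = random.uniform(lo, hi)
                new_pop.append(child)                                # 7. accept offspring
            pop = new_pop                                            # 8. replace population
        return min(pop, key=fitness)                                 # 9. best of current population

    print(ga(lambda x: (x[0] - 5)**2 + (x[1] - 6)**2, n_vars=2, bounds=(0.0, 10.0)))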

L4.56

Genetic Algorithms

Evolution of a population of designs: survival of the fittest.

The genetic operations of selection, crossover, and mutation are applied at each successive population evaluation.

Original population (design characteristics selected via a fitness function, i.e. survival of the fittest):

  Design 1:      A=7  B=3  C=2  D=0  E=2
  Design 2:      A=1  B=4  C=6  D=2  E=2

Crossover:

  New Design 1:  A=7  B=4  C=6  D=0  E=2
  New Design 2:  A=1  B=3  C=2  D=2  E=2

Mutation (new selected population):

  New Design 1:  A=7  B=4  C=6  D=0  E=6

L4.57

Genetic Algorithms Tuning Parameters

Size of SubPopulation (integer >= 1, default 10)

The total population is spread out equally between the islands. The size of the total population depends on the number of islands and the size of the sub-population.

Number of Generations (integer >= 1, default 10)

The number of generations that will be evaluated by the algorithm. Each generation includes the sub-populations on all islands.

L4.58

Genetic Algorithm Optimization Example

Minimize Y = (X1 - 5)^2 + (X2 - 6)^2

[Figure: population snapshots for runs 1-50, 50-100, 100-150, 150-200, 200-250, and 250-300, showing the population converging on the optimum.]

L4.59

GA Optimization Example (Cumulative)

[Figure: cumulative design points after 50, 100, 150, 200, 250, and 300 runs.]

L4.60

GA Advantages & Disadvantages

Advantages

Search from a set of designs, not from a single design

Not derivative-based

The number of function evaluations depends on the population size and does not increase with the dimensionality of the problem

Use probabilistic rules

Random selection of parameters to exchange between designs

Allow non-biased exploration of the parameter space

Work with discrete and continuous parameters

Both explore and exploit the parameter design space

Calculations can be performed in parallel

Disadvantages

Inefficient: a large population size and many population evaluations are required

Tend toward premature convergence

Difficulty in determining correct penalty function parameters

Penalty parameters are used to identify good features

Expert System Optimization Technique

L4.62

Expert System Optimization Techniques

Expert system techniques might:

Be knowledge-based expert systems

Use engineers' knowledge

Learn as the optimization progresses

Pointer learns as the optimization progresses (on-the-job training!)

L4.63

Pointer: Automatic Optimization

Is like an automatic transmission for your optimizers

Is a 3rd-generation technique for adaptive optimization

Efficiently solves a wide range of problems

Lowers the technical barriers to using optimization

Uses a proprietary SIMULIA algorithm to control the selection and tuning of 4 complementary optimization techniques

Pointer trains itself:

based on the design problem's topography

stores successful approaches to finding the optimum

L4.64

Pointer: Automatic Optimization


Inputs:

Design variables

Constraints

Objective

Total time available for the search

Time required to evaluate one design

Topography

Pointer's Core Algorithms:

Linear simplex

Sequential quadratic programming (NLPQL)

Downhill simplex

Genetic algorithm

L4.65

Pointer: Automatic Optimization


Topography type

Nonlinear (default), Linear, Smooth, Discontinuous, or Unknown

Maximum allowable time

Uses all the time given

Tries more radical changes when no improvement is found

Default value is 1 hour; must be > 0.0

Average analysis time

Average wall-clock time for one complete analysis

Default value is 1 second; must be > 0.0

L4.66

Pointer: Advantages & Disadvantages


Advantages

Can handle highly non-linear problems

Requires no advance knowledge of optimization or of the design space

Can handle many input parameters (> 20)

Automatically switches techniques internally

Never assumes convergence, continuing to try to find a better solution as long as it is running

Disadvantages

Requires many iterations, so it may be more difficult to apply to long-running simulation codes

Multi-Objective Optimization

L4.68

Multi-Objective Optimization Problem (MOOP)

MOOP:

Find a vector of decision variables x that optimizes the vector function

  f_m(x),  m = 1, 2, ..., M

while satisfying the inequality and equality constraints

  g_j(x) <= 0,  j = 1, 2, ..., J
  h_k(x) = 0,  k = 1, 2, ..., K

The elements of the vector function are the objective functions. They form a mathematical description of performance criteria that are usually in conflict with each other. Hence, the term "optimize" means finding a solution that gives values of all the objective functions acceptable to the decision maker.

L4.69

Concept of Domination
Most multi-objective optimization algorithms use the concept of domination. In these algorithms, two solutions are compared on the basis of whether one dominates the other or not.

A solution x(1) is said to dominate x(2) (equivalently, x(1) is non-dominated by x(2)) if both of the following conditions are true:

1. x(1) is no worse than x(2) in all objectives

2. x(1) is strictly better than x(2) in at least one objective
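These two conditions translate directly into code. A sketch for objective vectors in which every objective is minimized:

    def dominates(a, b):
        """True if a dominates b: no worse in all objectives, strictly better in one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated(points):
        """The Pareto (non-dominated) subset of a set of objective vectors."""
        return [p for p in points if not any(dominates(q, p) for q in points)]

    pts = [(4, 4), (3, 5), (5, 2), (2, 3)]  # hypothetical (f1, f2) pairs
    print(non_dominated(pts))               # [(5, 2), (2, 3)] - the trade-off set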

L4.70

Concept of Domination - Example

With f1 and f2 both minimized:

Solution 2 dominates solution 1.

Solution 3 dominates solution 1.

Neither solution 2 nor solution 3 dominates the other.

Solution 4 dominates solutions 1, 2, and 3.

[Figure: the four solutions plotted in the (f1, f2) objective space.]

L4.71

Pareto Optimality

Among a set of solutions P, the non-dominated set of solutions P' consists of those that are not dominated by any member of the set P.

Pareto optimality can be defined as the best that can be achieved without disadvantaging at least one group.

The solution to a MOOP is, as a rule, not a particular value but a set of values of the decision variables, such that for each element of this set, none of the objective functions can be further improved without worsening some of the remaining objective functions. (Every such value of a decision variable is referred to as Pareto-optimal (PO).)

[Figures: Pareto fronts in the (f1, f2) objective space, for f1 and f2 both minimized and for f1 minimized with f2 maximized.]

L4.72

Global/Local Pareto-Optimal Set


The global PO set is the non-dominated set of the entire feasible search space S. Since the solutions of this set are not dominated by any feasible member of the search space, they are the optimal solutions of the MOOP.

[Figure: objective space with f1 and f2 both minimized, showing a local Pareto-optimal set and the global Pareto-optimal set.]

L4.73

Multi-Objective Optimization: 2 Approaches

The fundamental difference between a single- and a multi-objective optimization task: all of the solutions on the PO front are optimal.

Depending on the stage of optimization at which this higher-level information is used, there are two possible approaches to a MOOP:

Preference-Based Approach, or a priori:

The decision maker combines the differing objectives into a scalar cost function, converting the MOOP into a single-objective optimization problem. This procedure can be repeated to find multiple trade-off solutions by using different cost functions.

Ideal Multi-Objective Optimization, or a posteriori:

First, a multi-objective optimizer is used to find multiple trade-off optimal solutions with a wide range of values for the objectives; then one solution is chosen from among them using higher-level information.

L4.74

Multi-Objective Optimization: 2 Approaches

Preference-Based Approach:

MOOP (minimize f1, ..., minimize fM, subject to constraints)
  -> use higher-level information to estimate a relative importance vector (w1, w2, ..., wM)
  -> form the single-objective problem F = w1*f1 + w2*f2 + ... + wM*fM
  -> solve with a single-objective optimizer (e.g., gradient-based methods)
  -> one optimum solution

Ideal Multi-Objective Optimization:

MOOP
  -> ideal multi-objective optimizer (e.g., genetic algorithms)
  -> multiple trade-off solutions found
  -> one solution chosen using higher-level information

L4.75

Desirable Features in Multi-Objective GA

1. Approaching the Pareto front

2. Covering a wide area of the Pareto front

3. Achieving a uniform distribution along the Pareto front

L4.76

MOGA in Isight

NSGA-II (Deb and Agrawal, 2001):

SPEA2 improvements continued (improved definition of fitness, increased size of the archive population, selection), but NSGA-II is generally considered slightly better

NCGA (Neighborhood Cultivation Genetic Algorithm):

Based on SPEA2, but introduces the new concept of neighborhood cultivation

Limits crossover to a certain range of the design space

Derived from the concept of sub-populations in distributed GAs

Crossover is more effective between individuals that have similar characteristics (neighbors)

The research group concluded, based on testing, that NCGA gives better results than NSGA-II when:

Objective functions have multi-peak landscapes

The number of design variables is large (> 100)

However, NSGA-II is often used as the standard benchmark multi-objective algorithm

L4.77

Custom Optimization Techniques


Custom techniques can be added using the Isight SDK

Once added, custom techniques are available as plug-ins and can be used by anyone in your company

Engineous (DS SIMULIA) consulting can be provided to assist in this process

L4.78

Engineering Data Mining


EDM is a tool that allows easy visualization of data, especially the Pareto data associated with MOGA techniques

Can be used with any set of data generated by Isight

Scatter plots are interactive

L4.79

Engineering Data Mining

Interactive: moving the mouse over graphs selects the data for a given RunCounter, in both line graphs and scatter plots

Data are displayed in tables

Can view only the Pareto points if a Pareto file was specified (the Pareto file comes from the new multi-objective genetic algorithms, NSGA-II and NCGA)

Sorting and filtering capabilities help in viewing the data

L4.80

Filtering for Trade-Off and Decision Making

Task: find design variable values that result in a design with Mass less than 700.

L4.81

Troubleshooting
Check the problem formulation:

Is my objective well defined?

Have I included all critical independent variables? All constraints?

Do the math models capture the physics?

Is my starting-point design feasible or infeasible?

Is my design space continuous? Nonlinear?

Do I have discrete variables?

Do I have a large number of variables and/or constraints?