
Particle Swarm optimisation
2002-04-24
Maurice.Clerc@WriteMe.com
The inventors (1)
Russell Eberhart
eberhart@engr.iupui.edu
The inventors (2)
James Kennedy
Kennedy_Jim@bls.gov
Cooperation example
The basic idea
Each particle is searching for the optimum.
Each particle is moving (it can't search otherwise!), and hence has a velocity.
Each particle remembers the position where it obtained its best result so far (its personal best).
But this alone would not be much good; particles need help in figuring out where to search.
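The state each particle carries can be sketched as a small record (a sketch only; the field names are mine, not from the slides):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Particle:
    position: np.ndarray       # where the particle currently is
    velocity: np.ndarray       # how it is moving (it can't search otherwise!)
    best_position: np.ndarray  # personal best: where its best result occurred
    best_value: float          # fitness achieved at the personal best
```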
The basic idea II
The particles in the swarm co-operate: they exchange information about what they've discovered in the places they have visited.
The co-operation need only be very simple. In basic PSO (which is pretty good!) it works like this:
A particle has a neighbourhood associated with it.
A particle knows the fitnesses of those in its neighbourhood, and uses the position of the one with the best fitness.
This position is simply used to adjust the particle's velocity.
Initialization: positions and velocities
What a particle does
In each timestep, a particle has to move to a
new position. It does this by adjusting its
velocity.
The adjustment is essentially this:
The current velocity PLUS
A weighted random portion in the direction of its personal
best PLUS
A weighted random portion in the direction of the
neighbourhood best.
Having worked out a new velocity, the particle's new position is simply its old position plus the new velocity.
Neighbourhoods
geographical
social
Neighbourhoods
Global
The circular neighbourhood
[Figure: eight particles (1-8) arranged on a virtual circle; particle 1's 3-neighbourhood is highlighted]
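Looking up the best neighbour on the ring can be sketched as follows (a minimal sketch; the function and variable names are mine, and fitness is assumed to be minimised):

```python
def ring_best(pbest_val, i, k=3):
    """Index of the best personal best within particle i's k-neighbourhood
    on the virtual circle (the particle itself plus its nearest ring mates)."""
    n = len(pbest_val)
    half = k // 2
    neighbours = [(i + d) % n for d in range(-half, half + 1)]  # wrap around the ring
    return min(neighbours, key=lambda j: pbest_val[j])          # best = lowest value
```

For k = 3, particle 1's neighbourhood on a ring of eight is particles 8, 1 and 2, regardless of where those particles sit in the search space.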
Particles adjust their positions according to a "psychosocial compromise" between what an individual is comfortable with and what society reckons.
[Figure: a particle at position x with velocity v ("Here I am!") is pulled both towards its personal best p_i ("My best perf.") and towards its best neighbour's position p_g ("The best perf. of my neighbours")]
Pseudocode
http://www.swarmintelligence.org/tutorials.php
Equation (a)
v[] = c0 * v[]
    + c1 * rand() * (pbest[] - present[])
    + c2 * rand() * (gbest[] - present[])
(in the original method, c0 = 1, but many researchers now play with this parameter)
Equation (b)
present[] = present[] + v[]
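Equations (a) and (b) translate almost directly into code. A minimal NumPy sketch (the function name is mine; rand() is drawn independently per dimension, as is standard):

```python
import numpy as np

rng = np.random.default_rng()

def step(present, v, pbest, gbest, c0=1.0, c1=2.0, c2=2.0):
    """One application of equations (a) and (b) for a single particle."""
    d = len(present)
    v = (c0 * v                                     # keep part of the current velocity
         + c1 * rng.random(d) * (pbest - present)   # pull towards the personal best
         + c2 * rng.random(d) * (gbest - present))  # pull towards the neighbourhood best
    present = present + v                           # equation (b): move
    return present, v
```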
Pseudocode
For each particle
    Initialize particle
End
Do
    For each particle
        Calculate fitness value
        If the fitness value is better than its personal best
            Set current value as the new pBest
    End
    Choose the particle with the best fitness value of all as gBest
    For each particle
        Calculate particle velocity according to equation (a)
        Update particle position according to equation (b)
    End
While maximum iterations or minimum error criterion is not attained
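The loop above can be sketched end-to-end (a sketch, not the original code: the swarm size, bounds and Vmax are illustrative choices, fitness is minimised, and the inertia weight c0 is set below 1, whereas the original method uses c0 = 1):

```python
import numpy as np

def pso(f, dim, n_particles=20, iters=200, c0=0.7, c1=2.0, c2=2.0,
        vmax=1.0, lo=-5.0, hi=5.0, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))        # initialize positions...
    v = rng.uniform(-vmax, vmax, (n_particles, dim))   # ...and velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    for _ in range(iters):
        gbest = pbest[pbest_val.argmin()]              # best pBest of all
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = c0 * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # equation (a)
        v = np.clip(v, -vmax, vmax)                    # clamp each dimension to Vmax
        x = x + v                                      # equation (b)
        val = np.array([f(p) for p in x])              # fitness of each particle
        better = val < pbest_val                       # improved personal bests
        pbest[better], pbest_val[better] = x[better], val[better]
    return pbest[pbest_val.argmin()], pbest_val.min()
```

With c0 = 1 the swarm relies entirely on Vmax clamping (next slide) to stay stable; an inertia weight below 1, as here, damps the oscillations.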
Pseudocode
Particles' velocities on each dimension are clamped to a maximum velocity Vmax, a parameter specified by the user. If the sum of accelerations would cause the velocity on a dimension to exceed Vmax, the velocity on that dimension is limited to Vmax.
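Per-dimension clamping is a one-liner (the values here are illustrative):

```python
import numpy as np

vmax = 0.5                            # user-specified maximum velocity
v = np.array([0.2, -1.3, 0.9])        # velocity after applying equation (a)
v = np.clip(v, -vmax, vmax)           # each dimension limited to [-Vmax, Vmax]
# v is now [0.2, -0.5, 0.5]
```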
The basic algorithm: again
At each time step t
    for each particle
        for each component d
            update the velocity:
                v_d(t+1) = c0 * v_d(t)
                         + c1 * rand() * (p_{i,d} - x_d(t))
                         + c2 * rand() * (p_{g,d} - x_d(t))
            then move:
                x_d(t+1) = x_d(t) + v_d(t+1)
Animated illustration
[Animation: the swarm converging towards the global optimum]
Parameters
Number of particles
C1 (importance of personal best)
C2 (importance of neighbourhood best)
How to choose parameters
Parameters
Number of particles: (10-50) are reported as usually sufficient.
C1 (importance of personal best)
C2 (importance of neighbourhood best)
Usually C1 + C2 = 4. No good reason other than empiricism.
Vmax: too low, too slow; too high, too unstable.
Some functions ...
Rosenbrock
Griewank
Rastrigin
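These three benchmarks have standard closed forms (the definitions below are the usual ones, not taken from the slides; each has optimum 0, at x = (1, ..., 1) for Rosenbrock and at the origin for the other two):

```python
import numpy as np

def rosenbrock(x):
    # sum over i of 100*(x_{i+1} - x_i^2)^2 + (1 - x_i)^2
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def griewank(x):
    # sum x_i^2 / 4000 - prod cos(x_i / sqrt(i)) + 1
    i = np.arange(1, len(x) + 1)
    return float(np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)

def rastrigin(x):
    # 10*n + sum (x_i^2 - 10*cos(2*pi*x_i))
    return float(10.0 * len(x) + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)))
```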
... and some results

30D function      PSO Type 1"    Evolutionary algo. (Angeline 98)
Griewank [300]    0.003944       0.4033
Rastrigin [5]     82.95618       46.4689
Rosenbrock [10]   50.193877      1610.359

Optimum = 0, dimension = 30
Best result after 40 000 evaluations
Some variants
Adaptive swarm size
"There has been enough improvement, although I'm the worst: I try to kill myself."
"There has been not enough improvement, but I'm the best: I try to generate a new particle."
Adaptive coefficients
The better I am, the more I follow my own way.
The better my best neighbour is, the more I tend to go towards him.
(each attraction term has the form rand(0..b)(p - x), with the coefficient b adapted accordingly)
Is this just a variation on evolutionary algorithms?
An applet
Next: Cellular Automata