(2014) 7:17–28
DOI 10.1007/s12065-013-0102-2
SPECIAL ISSUE
Received: 31 July 2013 / Revised: 28 November 2013 / Accepted: 5 December 2013 / Published online: 17 December 2013
© Springer-Verlag Berlin Heidelberg 2013
1 Introduction
Swarm intelligence (SI) has received great interest and attention in the literature. In the communities of optimization, computational intelligence and computer science, bio-inspired algorithms, especially swarm-intelligence-based algorithms, have become very popular. In fact, these nature-inspired metaheuristic algorithms are now among the most widely used algorithms for optimization and computational intelligence [16, 18, 25, 26, 49]. SI-based algorithms such as ant and bee algorithms, particle swarm optimization, cuckoo search and firefly algorithm
X.-S. Yang (✉)
School of Science and Technology, Middlesex University,
London NW4 4BT, UK
e-mail: xy227@cam.ac.uk
Many classical algorithms can be viewed as dynamical systems. For example, Newton's method updates the current solution x_t by

    x_{t+1} = x_t − f'(x_t)/f''(x_t),

which can be written compactly as x_{t+1} = A(x_t) for an algorithm operator A. More generally, a population-based algorithm acting on a solution vector (x_1, x_2, ..., x_n), with parameters (p_1, p_2, ..., p_k) and random variables (ε_1, ε_2, ..., ε_m), takes the form

    (x_1, x_2, ..., x_n)^{t+1} = A[(x_1^t, x_2^t, ..., x_n^t); (p_1, p_2, ..., p_k); (ε_1, ε_2, ..., ε_m)] (x_1, x_2, ..., x_n)^t.
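As a minimal sketch of this iteration (the quadratic test function below is my own choice for illustration, not from the paper), Newton's method in one dimension repeatedly applies the update x_{t+1} = x_t − f'(x_t)/f''(x_t):

```python
def newton_minimize(df, d2f, x0, tol=1e-9, max_iter=100):
    """Minimize a 1-D function via x_{t+1} = x_t - f'(x_t)/f''(x_t)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = (x - 2)**2 + 1, so f'(x) = 2(x - 2) and f''(x) = 2.
x_min = newton_minimize(lambda x: 2 * (x - 2), lambda x: 2.0, x0=10.0)
print(x_min)  # converges to 2.0
```

For this quadratic the iteration lands on the minimizer in a single step, which illustrates why gradient-based methods converge quickly when the local landscape is well behaved.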
In this framework, an algorithm acts as a mapping on the set of states, S → S, driving the objective values toward the optimum:

    f(x^t) → f_min(x*).
Table 1 Self-organization and algorithms

Self-organization       Features            Algorithm        Characteristics
Noise, perturbations    Diversity           Randomization    Escape local optima
Selection mechanism     Structure           Selection        Convergence
Re-organization         Change of states    Evolution        Solutions
be reached. In addition, genetic algorithms are population-based with multiple chromosomes, and thus can be implemented in a parallel manner [48, 49].
The three key evolutionary operators in genetic algorithms can be summarized as follows:

Crossover: the recombination of two parent chromosomes (solutions) by exchanging part of one chromosome with a corresponding part of another so as to produce offspring (new solutions).

Mutation: the change of part of a chromosome (a bit or several bits) to generate new genetic characteristics. In binary string encoding, mutation can simply be achieved by flipping between 0 and 1. Mutation can occur at a single site or at multiple sites simultaneously.

Selection: the survival of the fittest, which means the highest-quality chromosomes and/or characteristics will stay within the population. This often takes some form of elitism, and the simplest form is to let the best genes pass on to the next generation.
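The three operators above can be sketched in a toy genetic algorithm (the OneMax objective, population size, mutation rate and tournament size below are illustrative choices, not values from the paper):

```python
import random

random.seed(42)

def fitness(bits):
    # OneMax: count the 1-bits (a toy objective for illustration)
    return sum(bits)

def crossover(p1, p2):
    # One-point crossover: exchange the tails of two parent bitstrings
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(bits, rate=0.05):
    # Flip each bit with a small probability
    return [b ^ 1 if random.random() < rate else b for b in bits]

def select(pop, k=3):
    # Tournament selection: the fittest of k randomly chosen individuals
    return max(random.sample(pop, k), key=fitness)

def genetic_algorithm(n_bits=20, pop_size=30, generations=50):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = max(pop, key=fitness)      # elitism: the best chromosome survives
        children = [elite]
        while len(children) < pop_size:
            c1, c2 = crossover(select(pop), select(pop))
            children += [mutate(c1), mutate(c2)]
        pop = children[:pop_size]
    return max(pop, key=fitness)

best = genetic_algorithm()
print(fitness(best))
```

Elitism here is the simplest form mentioned in the text: the best chromosome is copied unchanged into the next generation, so the best fitness never decreases.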
    x_new = x + dx,   i = 1, 2, ..., d,    (10)

where dx is a random move applied to the current solution x.
    p_ij = φ_ij^α d_ij^β / Σ_{i,j=1}^n φ_ij^α d_ij^β,    (12)
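The edge-selection probability in (12) can be sketched as follows (the pheromone and desirability values are made-up numbers for illustration; in ant algorithms d_ij is typically a heuristic desirability such as an inverse distance):

```python
def edge_probabilities(pheromone, desirability, alpha=1.0, beta=2.0):
    """p_ij proportional to phi_ij^alpha * d_ij^beta, normalized over all candidate edges."""
    weights = [(phi ** alpha) * (d ** beta)
               for phi, d in zip(pheromone, desirability)]
    total = sum(weights)
    return [w / total for w in weights]

# Three candidate edges with illustrative pheromone and desirability values
probs = edge_probabilities([0.5, 1.0, 2.0], [1.0, 0.5, 0.25])
print(probs)  # the probabilities sum to 1
```

An ant would then sample its next edge from this distribution, so edges with stronger pheromone trails and higher desirability are visited more often.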
    v_i^{t+1} = v_i^t + α ε_1 [g* − x_i^t] + β ε_2 [x_i* − x_i^t],    (13)

    x_i^{t+1} = x_i^t + v_i^{t+1},    (14)
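A single particle-swarm iteration following (13) and (14) can be sketched as below (the positions, attractors and coefficient values are illustrative):

```python
import random

random.seed(1)

def pso_step(x, v, x_best, g_best, alpha=2.0, beta=2.0):
    """One PSO update: new velocity toward g* and x_i*, then new position."""
    v_new = [vi + alpha * random.random() * (g - xi)
                + beta * random.random() * (b - xi)
             for xi, vi, g, b in zip(x, v, g_best, x_best)]
    x_new = [xi + vi for xi, vi in zip(x, v_new)]
    return x_new, v_new

x, v = [5.0, -3.0], [0.0, 0.0]
x, v = pso_step(x, v, x_best=[1.0, 1.0], g_best=[0.0, 0.0])
print(x)  # the particle has moved toward the best-known positions
```

The two random factors ε_1 and ε_2 are redrawn per dimension, which is what gives the swarm its stochastic exploration.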
    x_i^{t+1} = x_i^t + β_0 e^{−γ r_ij^2} (x_j^t − x_i^t) + α ε_i^t,    (15)
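A single firefly move following (15) might look like this (β_0, γ, α and the positions are illustrative values; the random term ε_i^t is drawn here from a centred uniform distribution as one common choice):

```python
import math
import random

random.seed(7)

def firefly_move(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.1):
    """Move firefly i toward the brighter firefly j."""
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))   # squared distance r_ij^2
    beta = beta0 * math.exp(-gamma * r2)                # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(x_i, x_j)]

moved = firefly_move([2.0, 2.0], [0.0, 0.0])
print(moved)
```

Because the attraction decays exponentially with the squared distance, distant fireflies barely influence each other, which partitions the swarm into groups that explore different subspaces.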
    x_i^{t+1} = x_i^t + s ⊗ H(p_a − ε) ⊗ (x_j^t − x_k^t),    (16)

where x_j^t and x_k^t are two different solutions selected randomly by random permutation, H(u) is a Heaviside function, ε is a random number drawn from a uniform distribution, and s is the step size. Here, ⊗ is an entry-wise multiplication.
On the other hand, the global random walk is carried out by using Lévy flights:

    x_i^{t+1} = x_i^t + α L(s, λ),    (17)

where

    L(s, λ) = [λ Γ(λ) sin(πλ/2) / π] · 1/s^{1+λ},   (s ≫ s_0 > 0).    (18)
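In practice, steps with the power-law tail of (18) are often generated with Mantegna's algorithm, a standard recipe that is not spelled out in the text; a sketch:

```python
import math
import random

random.seed(3)

def levy_step(lam=1.5):
    """Draw a heavy-tailed step with tail exponent 1 + lam (Mantegna's algorithm)."""
    sigma_u = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
               / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = random.gauss(0.0, sigma_u)   # numerator sample
    v = random.gauss(0.0, 1.0)       # denominator sample
    return u / abs(v) ** (1 / lam)

steps = [levy_step() for _ in range(5)]
print(steps)
```

Most draws are small, but the heavy tail occasionally produces a very large jump, which is exactly what makes Lévy flights effective for global exploration.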
    f_i = f_min + (f_max − f_min) β,    (19)

    v_i^t = v_i^{t−1} + (x_i^{t−1} − x*) f_i,    (20)

    x_i^t = x_i^{t−1} + v_i^t,    (21)

together with the updates of the loudness A_i and the pulse emission rate r_i:

    A_i^{t+1} = α A_i^t,    (22)

and

    r_i^{t+1} = r_i^0 [1 − exp(−γ t)].    (23)
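One update of the bat equations (19)-(21) can be sketched as follows (the frequency range and starting values are illustrative; β is a uniform random number in [0, 1]):

```python
import random

random.seed(5)

def bat_step(x, v, x_star, f_min=0.0, f_max=2.0):
    """One bat-algorithm update: frequency, then velocity, then position."""
    beta = random.random()                      # beta drawn uniformly from [0, 1]
    f = f_min + (f_max - f_min) * beta          # frequency, as in (19)
    v_new = [vi + (xi - xs) * f                 # velocity update, as in (20)
             for xi, vi, xs in zip(x, v, x_star)]
    x_new = [xi + vi for xi, vi in zip(x, v_new)]   # position update, as in (21)
    return x_new, v_new

x, v = bat_step([3.0, -1.0], [0.0, 0.0], x_star=[0.0, 0.0])
print(x)
```

In the full algorithm this update is combined with the loudness and pulse-rate schedules (22)-(23), which gradually shift the search from exploration to exploitation.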
4 Performance measures
Despite the huge number of studies of various metaheuristic algorithms, a good performance measure for comparing different algorithms is still lacking. In general, there are two major ways to compare any two algorithms [49].

One way is to compare the accuracy of two algorithms solving the same problem for a fixed number of function evaluations. In most cases, the problem to be solved is a test function whose minimum value is often zero by proper formulation. For example, if algorithm A obtains 0.001 for, say, N = 1000 function evaluations, while algorithm B obtains 0.002 in a run, one tends to say that A obtained a better value than B. However, care should be taken when making such a statement. Due to the stochastic nature of these algorithms, multiple runs should be carried out so that meaningful statistical measures can be obtained. Suppose we run each algorithm 100 times; we can then compute a mean function value and its standard deviation. For example, if we run algorithm A for 100 independent runs, we may get a mean μ_A = 0.001 and a standard deviation σ_A = 0.01. We can write the results as
    f_A = μ_A ± σ_A = 0.001 ± 0.01.    (24)

Similarly, suppose algorithm B gives

    f_B = μ_B ± σ_B = 0.001 ± 0.02.    (25)
    r_{A/B} = sqrt( (σ_A² + σ_B²) / |μ_A μ_B| ).    (29)

For the above example, we have

    μ_A = μ_B = 0.001,   σ_A = 0.01,   σ_B = 0.02,    (30)

which gives

    r_{A/B} = sqrt( (0.01² + 0.02²) / (0.001 × 0.001) ) ≈ 22.3607.    (31)
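The multi-run protocol described above can be sketched as follows (the random-search optimizer and the sphere test function are illustrative stand-ins, not algorithms from the paper):

```python
import random
import statistics

random.seed(0)

def sphere(x):
    # Test function whose minimum value is 0 at the origin
    return sum(xi ** 2 for xi in x)

def random_search(n_evals=1000, dim=2):
    """A deliberately simple optimizer: best of n random points in [-1, 1]^dim."""
    best = float("inf")
    for _ in range(n_evals):
        x = [random.uniform(-1.0, 1.0) for _ in range(dim)]
        best = min(best, sphere(x))
    return best

# Run the algorithm 100 times and report mean +/- standard deviation,
# as in the protocol for algorithms A and B above.
results = [random_search() for _ in range(100)]
mu, sigma = statistics.mean(results), statistics.stdev(results)
print(f"f = {mu:.4f} +/- {sigma:.4f}")
```

Reporting mean and standard deviation over independent runs, rather than a single best value, is what makes the comparison between two stochastic algorithms statistically meaningful.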
5 Discussions
Many optimization algorithms are based on swarm intelligence and use population-based approaches. Most use some combination of the three key evolutionary operators: crossover, mutation and selection. Almost all algorithms use mutation and selection, while crossover may appear in some subtle way in some algorithms. Crossover is efficient in exploitation and can often provide good convergence in a subspace. If this subspace is where the global optimality lies, then crossover with elitism can almost guarantee the achievement of global optimality. However, if the subspace of crossover is not in the region where the global optimality lies, there is a danger of premature convergence.

The extensive use of mutation and selection can typically enable a stochastic algorithm to have a strong ability for exploration. As the exploitation is relatively low, the convergence rate is usually low, compared with that of traditional methods such as Newton-Raphson methods. As a result, most metaheuristic algorithms can usually perform well for nonlinear problems, including relatively tough
6 Conclusion
Optimization algorithms based on swarm intelligence can
have some distinct advantages over traditional methods. By
using theories of dynamical systems and self-organization
as well as the framework of Markov chains, we have
provided a critical analysis of some recent SI-based algorithms. The analysis has focused on the ways of achieving exploration and exploitation, and the basic components of
evolutionary operators such as crossover, mutation and
selection of the fittest.
Through analysis, it has been found that most SI-based
algorithms use mutation and selection to achieve exploration and exploitation. Some algorithms use crossover as
well, while most do not. Mutation enables an algorithm to escape any local modes, while crossover provides good mixing to explore a subspace more effectively, and thus is more likely to lead to convergence. Selection provides a
driving mechanism to select the promising states or
solutions.
The analysis also implies that there is room for
improvement. Some algorithms such as PSO may lack mixing and crossover, and therefore hybridization may be useful to enhance their performance.
It is worth pointing out that the above analysis is based
on the system behaviour for continuous optimization
problems, and it can be expected that these results are still
valid for combinatorial optimization problems. However,
care should be taken for combinatorial problems, where neighbourhood may have a different meaning and, therefore, the subspace concept may also be different. Further analysis and future studies may help to provide more elaborate
insight.
References
1. Ashby WR (1962) Principles of the self-organizing system. In: Von Foerster H, Zopf GW Jr (eds) Principles of self-organization: transactions of the University of Illinois symposium. Pergamon Press, London, pp 255–278
2. Azad SK, Azad SK (2011) Optimum design of structures using an improved firefly algorithm. Int J Optim Civil Eng 1(2):327–340
3. Belavkin RV (2013) Optimal measures and Markov transition kernels. J Glob Optim 55(2):387–416
4. Belavkin RV (2012) On evolution of an information dynamic system and its generating operator. Optim Lett 6(5):827–840
5. Blum C, Roli A (2003) Metaheuristics in combinatorial optimization: overview and conceptual comparison. ACM Comput Surv 35(2):268–308
30. Nandy S, Sarkar PP, Das A (2012) Analysis of nature-inspired firefly algorithm based back-propagation neural network training. Int J Comput Appl 43(22):8–16
31. Noghrehabadi A, Ghalambaz M, Vosough A (2011) A hybrid power series–cuckoo search optimization algorithm to electrostatic deflection of micro fixed-fixed actuators. Int J Multidiscip Sci Eng 2(4):22–26
32. Palit S, Sinha S, Molla M, Khanra A, Kule M (2011) A cryptanalytic attack on the knapsack cryptosystem using binary firefly algorithm. In: 2nd international conference on computer and communication technology (ICCCT), 15–17 Sept 2011, India, pp 428–432
33. Pavlyukevich I (2007) Lévy flights, non-local search and simulated annealing. J Comput Phys 226:1830–1844
34. Price K, Storn R, Lampinen J (2005) Differential evolution: a practical approach to global optimization. Springer, Berlin
35. Sayadi MK, Ramezanian R, Ghaffari-Nasab N (2010) A discrete firefly meta-heuristic with local search for makespan minimization in permutation flow shop scheduling problems. Int J Ind Eng Comput 1:1–10
36. Senthilnath J, Omkar SN, Mani V (2011) Clustering using firefly algorithm: performance study. Swarm Evolut Comput 1(3):164–171
37. Storn R (1996) On the usage of differential evolution for function optimization. In: Biennial conference of the North American fuzzy information processing society (NAFIPS), Berkeley, CA, pp 519–523
38. Storn R, Price K (1997) Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359
39. Srivastava PR, Chis M, Deb S, Yang XS (2012) An efficient optimization algorithm for structural software testing. Int J Artif Intell 9(S12):68–77
40. Suli E, Mayer D (2003) An introduction to numerical analysis. Cambridge University Press, Cambridge
41. Valian E, Mohanna S, Tavakoli S (2011) Improved cuckoo search algorithm for feedforward neural network training. Int J Artif Intell Appl 2(3):36–43
42. Walton S, Hassan O, Morgan K, Brown MR (2011) Modified cuckoo search: a new gradient free optimization algorithm. Chaos Solitons Fractals 44(9):710–718
43. Wang F, He X-S, Wang Y, Yang S-M (2012) Markov model and convergence analysis based on cuckoo search algorithm. Jisuanji Gongcheng/Comput Eng 38(11):181–185
44. Wang GG, Guo L, Gandomi AH, Cao L, Alavi AH, Duan H, Li J (2013) Lévy-flight krill herd algorithm. Math Probl Eng 2013(682073):14. doi:10.1155/2013/682073
45. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evolut Comput 1(1):67–82
46. Wolpert DH, Macready WG (2005) Coevolutionary free lunches. IEEE Trans Evolut Comput 9(6):721–735
47. Yang XS (2008) Introduction to computational mathematics. World Scientific Publishing Ltd, Singapore
48. Yang XS (2008) Nature-inspired metaheuristic algorithms, 1st edn. Luniver Press, UK
49. Yang XS (2010) Engineering optimisation: an introduction with metaheuristic applications. Wiley, London
50. Yang XS (2009) Firefly algorithms for multimodal optimization. In: Stochastic algorithms: foundations and applications, SAGA 2009, Lecture Notes in Computer Science, vol 5792, pp 169–178
51. Yang X-S (2010) Firefly algorithm, stochastic test functions and design optimisation. Int J Bioinspired Comput 2(2):78–84
52. Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: Cruz C, Gonzalez JR, Pelta DA, Terrazas G (eds) Nature inspired cooperative strategies for optimization (NISCO 2010), Studies in computational intelligence, vol 284. Springer, Berlin, pp 65–74
53. Yang XS (2011) Bat algorithm for multi-objective optimisation. Int J Bioinspired Comput 3(5):267–274
54. Yang XS, Deb S, Fong S (2011) Accelerated particle swarm optimization and support vector machine for business optimization and applications. In: Networked digital technologies 2011, Communications in computer and information science, vol 136, pp 53–66
55. Yang XS, Gandomi AH (2012) Bat algorithm: a novel approach for global engineering optimization. Eng Comput 29(5):1–18
56. Yang XS, Deb S (2009) Cuckoo search via Lévy flights. In: Proceedings of world congress on nature and biologically inspired computing (NaBIC 2009). IEEE Publications, USA, pp 210–214
57. Yang X-S, Deb S (2010) Engineering optimization by cuckoo search. Int J Math Modell Num Optim 1(4):330–343
58. Yang XS, Deb S (2013) Multiobjective cuckoo search for design optimization. Comput Oper Res 40(6):1616–1624
59. Yang XS, Karamanoglu M, He XS (2013) Flower pollination algorithm: a novel approach for multiobjective optimization. Eng
Optim. http://www.tandfonline.com/doi/abs/10.1080/0305215X.2013.832237
60. Yang XS, Deb S, Loomes M, Karamanoglu M (2013) A framework for self-tuning optimization algorithms. Neural Comput Appl 23(7/8):2051–2057
61. Yildiz AR (2012) Cuckoo search algorithm for the selection of optimal machine parameters in milling operations. Int J Adv Manuf Technol. doi:10.1007/s00170-012-4013-7
62. Yousif A, Abdullah AH, Nor SM, Abdelaziz AA (2011) Scheduling jobs on grid computing using firefly algorithm. J Theor Appl Inf Technol 33(2):155–164
63. Zaman MA, Matin MA (2012) Nonuniformly spaced linear antenna array design using firefly algorithm. Int J Microwave Sci Technol 2012(256759):8. doi:10.1155/2012/256759