
OPTIMIZATION TECHNIQUE - HARMONY SEARCH

1. INTRODUCTION
1.1 Optimization
Optimization is the process of finding the alternative with the most cost-effective or highest achievable performance under the given constraints, by maximizing desired factors and minimizing undesired ones. It also means making the best use of a situation or resource. In comparison, maximization means trying to attain the highest or maximum result or outcome without regard to cost or expense. The practice of optimization is restricted by the lack of full information and the lack of time to evaluate what information is available (see bounded rationality for details). In computer simulation (modeling) of business problems, optimization is usually achieved by using linear programming techniques from operations research.

1.2 About Harmony Search


HS mimics the improvisation process of musicians, in which each musician plays a note in search of the best harmony all together.
When applied to optimization problems, the musicians typically represent the decision variables of the cost function, and HS acts as a meta-heuristic algorithm that attempts to find a solution vector that optimizes this function.

The Harmony Search (HS) is a meta-heuristic algorithm developed by Geem et al. for continuous optimization. Imitating the process of music improvisation, HS was conceptualized using the musical process of searching for a perfect state of harmony. The harmony in music is analogous to the optimization solution vector, and the musicians' improvisations of their instruments' pitches are analogous to the local and global search schemes in optimization techniques. Due to its simple structure and effectiveness, HS has been successfully applied to numerous real-world optimization problems, particularly in the constrained engineering optimization domain. However, like most other meta-heuristics, its capabilities are sensitive to parameter setting. Some research works have recently been undertaken to improve the performance of HS by tuning its parameters or blending it with other powerful optimization techniques. A few improved variants of HS have been proposed, but some of the supporting experiments are less rigorous or were obtained from low-dimensional problems. In this work, a new adaptive pitch adjustment scheme is introduced into the harmony improvisation of HS. The performance of the proposed HS is evaluated using thirteen widely used benchmark functions for continuous optimization in 30 and 100 dimensions. The remainder of this report is organized as follows: Sections 2 and 3 review the harmony search algorithm and its variants; Section 4 describes the proposed HS variant, which is evaluated on benchmark functions in Section 5; finally, Section 6 concludes this work.

1.2.1 The Harmony Method


Harmony search tries to find a vector x that optimizes (minimizes or maximizes) a certain objective function.
The algorithm has the following steps:
Step 1: Generate initial random vectors (solutions) and store them in the harmony memory (HM).
Step 2: Generate a new vector x'. For each component x'i:
with probability hmcr (harmony memory considering rate), pick a stored value from HM;
with probability 1 − hmcr, pick a random value within the allowed range.
Step 3: Perform additional work if the value in Step 2 came from HM:
with probability par (pitch adjusting rate), change x'i by a small amount;
with probability 1 − par, do nothing.
Step 4: If the new vector x' is better than the worst vector in HM, replace the worst.
Step 5: Repeat from Step 2 to Step 4 until the termination criterion (e.g., a maximum number of iterations) is satisfied.

1.2.2 Harmony Method - Parameters


hms = the size of the harmony memory. A typical value ranges from 1 to 100.
hmcr = the rate of choosing a value from the harmony memory. A typical value ranges from 0.7 to 0.99.
par = the rate of choosing a neighboring value. A typical value ranges from 0.1 to 0.5.
δ = the amount between two neighboring values in a discrete candidate set.
fw (fret width, formerly bandwidth) = the amount of maximum change in pitch adjustment, used for continuous variables.
It is possible to vary the parameter values as the search progresses, which gives an effect similar to simulated annealing.
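
The steps and parameters above map almost directly onto code. The following is a minimal Python sketch of the basic method for a continuous minimization problem; the sphere objective, the chosen parameter values, and all identifiers are illustrative assumptions rather than part of the original description.

import random

def harmony_search(f, lb, ub, hms=30, hmcr=0.9, par=0.3, fw=0.01, iters=20000):
    dim = len(lb)
    # Step 1: fill the harmony memory (HM) with random solution vectors
    hm = [[random.uniform(lb[d], ub[d]) for d in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:            # Step 2: memory consideration
                value = random.choice(hm)[d]
                if random.random() < par:         # Step 3: pitch adjustment
                    value += random.uniform(-1.0, 1.0) * fw
                    value = min(max(value, lb[d]), ub[d])
            else:                                 # Step 2: random selection
                value = random.uniform(lb[d], ub[d])
            new.append(value)
        worst = max(range(hms), key=lambda i: f(hm[i]))
        if f(new) < f(hm[worst]):                 # Step 4: replace the worst vector
            hm[worst] = new
    return min(hm, key=f)                         # best harmony found

# Example usage: minimize the 10-dimensional sphere function
sphere = lambda x: sum(v * v for v in x)
print(sphere(harmony_search(sphere, lb=[-5.0] * 10, ub=[5.0] * 10)))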

3|Page
Optimization Algorithm Harmony Search

Fig 1.1 Algorithm From Music Phenomenon


1.2.3 Application For Real-World Problem
Sudoku Puzzle
Music Composition - Medieval Organum
Project Scheduling (TCTP)
University Time-tabling
Internet Routing
Web-Based Parameter Calibration
Truss Structure Design
School Bus Routing Problem
Generalized Orienteering Problem
Water Distribution Network Design
Large-Scale Water Network Design
Multiple Dam Operation
Hydrologic Parameter Calibration
Ecological Conservation
Satellite Heat Pipe Design
Oceanic Oil Structure Mooring
RNA Structure Prediction
Medical Imaging
Radiation Oncology
Astronomical Data Analysis


Fig 1.2 Project Scheduling
Fig 1.3 Web-Based Parameter Calibration

Fig 1.4 Multiple Dam Operation
Fig 1.5 Ecological Conservation

Fig 1.6 RNA Structure Prediction
Fig 1.7 Radiation Oncology


2. LITERATURE REVIEW

Since the 1970s, many meta-heuristic algorithms that combine rules and randomness in imitation of natural phenomena have been developed to overcome the computational drawbacks of existing numerical algorithms (i.e., complex derivatives, sensitivity to initial values, and the large amount of enumeration memory) when solving difficult and complex engineering optimization problems. These algorithms include simulated annealing, tabu search, the particle swarm algorithm, bee colony search, the ant colony search algorithm, the firefly algorithm, the genetic algorithm, and other evolutionary computation methods. Among the above-mentioned problem-solving techniques, the genetic algorithm is a widely used technique for optimization problems, and it gives a global solution to the problem.
Z. W. Geem et al. in 2001 developed the Harmony Search (HS) meta-heuristic algorithm with the purpose of producing better solutions than other existing algorithms in fewer iterations; it is explained in Section 2.1. In the current international literature one can find a variety of applications of HSA and a large number of publications on HSA. After an extensive search we found interesting results, which are shown in Figure 2.1: after 2004 the number of publications on HSA increased, and the interest of researchers is growing day by day.

Fig 2.1 Growth of HSA Publications (Year-Wise)


2.1 Harmony Search Algorithm (HSA)


Harmony Search was inspired by the improvisation of jazz musicians; specifically, the process by which musicians (who may never have played together before) rapidly refine their individual improvisations through variation, resulting in an aesthetic harmony.
The steps of the harmony search algorithm are shown as a flowchart in Figure 2.2. They are as follows:
Step 1. Initialize the problem and algorithm parameters.
Step 2. Initialize the harmony memory (HM).
Step 3. Improvise a new harmony.
Step 4. Update the harmony memory.
Step 5. Check the stopping criterion.

2.1.1 Initialize the problem and algorithm parameters

In Step 1, the optimization problem is specified as follows:

Minimize f(x) subject to xi ∈ Xi, i = 1, 2, ..., N,

where f(x) is an objective function; x is the set of decision variables xi; N is the number of decision variables; and Xi is the set of possible values for each decision variable, with Lxi and Uxi the lower and upper bounds for each decision variable (Lxi ≤ xi ≤ Uxi). The HS algorithm parameters are also specified in this step. These parameters are:
1. Harmony Memory Size (HMS), or the number of solution vectors in the harmony memory;
2. Harmony Memory Considering Rate (HMCR), 0 ≤ HMCR ≤ 1;
3. Pitch Adjusting Rate (PAR), 0 ≤ PAR ≤ 1;
4. Number of Improvisations (NI), or stopping criterion;
5. the harmony memory (HM), a memory location where all the solution vectors are stored.
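
Written compactly, the problem in Step 1 is the following (a LaTeX sketch of the formulation, using only the symbols defined above):

\min_{x} f(x) \quad \text{subject to} \quad x_i \in X_i, \quad Lx_i \le x_i \le Ux_i, \quad i = 1, 2, \ldots, N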

2.1.2 Initialize the harmony memory

In this step, the HM matrix is filled with as many randomly generated solution vectors as the HMS.


Fig 2.2 Flowchart Of HSA

2.1.3 Improvise a new harmony

A new harmony vector is generated based on three rules:


1. memory consideration,
2. pitch adjustment and
3. random selection
Generating a new harmony is called "improvisation". In memory consideration, the value of the first decision variable for the new vector is chosen from any of the values already stored in the HM; the values of the other decision variables are chosen in the same manner. This is similar to the step where the musician uses his or her memory to generate a tune. The HMCR is the rate of choosing one value from the historical values stored in the HM, while (1 − HMCR) is the rate of randomly selecting one value from the possible range of values. For example, an HMCR of 0.90 indicates that the HS algorithm will choose the decision variable value from the historically stored values in the HM with a 90% probability, or from the entire possible range with a (100 − 90)% probability. Every component obtained by memory consideration is examined to determine whether it should be pitch-adjusted. This operation uses the PAR parameter, which is the rate of pitch adjustment; the value of (1 − PAR) sets the rate of doing nothing. If the pitch adjustment decision for x'i is yes, x'i is replaced as follows:

x'i ← x'i ± rand() × bw,

where bw is an arbitrary distance bandwidth and rand() is a random number between 0 and 1.
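
For one decision variable, the decision logic described above can be sketched in Python as follows (an illustrative fragment; the function name and arguments are assumptions consistent with the description, and the pitch adjustment uses the formula above):

import random

def improvise_component(hm, d, lb, ub, hmcr=0.90, par=0.30, bw=0.01):
    # Improvise the d-th component of a new harmony vector.
    if random.random() < hmcr:
        # Memory consideration: 90% chance for HMCR = 0.90
        value = random.choice(hm)[d]
        if random.random() < par:
            # Pitch adjustment: x'_i <- x'_i +/- rand() * bw
            value += random.choice((-1, 1)) * random.random() * bw
    else:
        # Random selection from the entire possible range: (100 - 90)% chance
        value = random.uniform(lb[d], ub[d])
    return value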

2.1.4 Update harmony memory

For each new harmony vector, the value of the objective function, f(x'), is calculated. If the new harmony vector is better than the worst harmony in the HM, the new harmony is included in the HM and the existing worst harmony is excluded from the HM.

2.1.5 Check stopping criterion

If the stopping criterion (the maximum number of improvisations) is satisfied, computation is terminated. Otherwise, Steps 3 and 4 are repeated. Finally, the best harmony memory vector is selected and is considered the best solution to the problem.

2.2. Improved Harmony Search Algorithm (IHSA)

This variant of HSA was developed by M. Mahdavi et al. in 2007 [5]. HSA has a few important parameters, HMCR, PAR, and bw, of which PAR and bw are very important in the fine-tuning of optimized solution vectors. The traditional HS algorithm uses fixed values for both PAR and bw: they are set in Step 1 and cannot be changed during new generations. The main drawback of this approach is that the number of iterations needed to find an optimal solution increases. To improve the performance of the HS algorithm and eliminate the drawbacks of fixed PAR and bw values, IHSA uses variable PAR and bw in the improvisation step (Step 3). PAR and bw change dynamically with the generation number, as expressed below; pseudo code for IHSA is shown in Figure 2.3:

PAR(gn) = PARmin + (PARmax − PARmin) / NI × gn

Where,
PAR(gn) = Pitch Adjusting Rate for each generation
PARmin = Minimum Pitch Adjusting Rate
PARmax = Maximum Pitch Adjusting Rate

NI = Number of solution vector generations
gn = Generation number

bw(gn) = bwmax × exp(c × gn), where c = ln(bwmin / bwmax) / NI

Where,
bw(gn) = Bandwidth for each generation
bwmin = Minimum bandwidth
bwmax = Maximum bandwidth

Fig 2.3 Pseudo code of IHSA
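
The two schedules can be checked numerically with a short Python sketch (the values of PARmin, PARmax, bwmin, bwmax, and NI below are illustrative assumptions, not values from [5]):

import math

def ihsa_par(gn, par_min=0.1, par_max=0.99, ni=10000):
    # PAR grows linearly from PARmin to PARmax over NI generations
    return par_min + (par_max - par_min) / ni * gn

def ihsa_bw(gn, bw_min=1e-4, bw_max=1.0, ni=10000):
    # bw decays exponentially from bwmax down to bwmin over NI generations
    c = math.log(bw_min / bw_max) / ni
    return bw_max * math.exp(c * gn)

for gn in (0, 5000, 10000):
    # the endpoints recover PARmin/PARmax and bwmax/bwmin exactly
    print(gn, round(ihsa_par(gn), 3), ihsa_bw(gn))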


2.3 Other Related Research on HSA


HSA is a very effective optimization method, and so many researchers have been attracted to it. Nowadays it is used to solve many optimization problems in engineering and computer science.
Consequently, interest in HSA has led researchers to improve its performance in line with the requirements of the problems being solved. These improvements can basically be classified under two aspects:
(1) improvements of HS in terms of parameter setting, and
(2) improvements in terms of hybridizing HS components with other meta-heuristic algorithms.
In terms of parameter setting, much research has been done up to now. In HS there are three important parameters: HMCR, PAR, and bw. First, M. Mahdavi et al. (2007) developed a new method known as the Improved Harmony Search algorithm, in which PAR and bw are modified. Thereafter, Omran and Mahdavi (2008) developed the Global Harmony Search (GHS) algorithm, in which bw is modified using the PSO concept. Mukhopadhyay et al. (2008) proposed to modify bw by a standard deviation method when HMCR is close to 1. Chakraborty et al. (2009) proposed new improvements to HS inspired by the differential evolution (DE) mutation operator, replacing PAR with a mutation strategy. Hasancebi et al. (2009) and Saka and Hasancebi (2009) proposed a new adaptation for HS that modifies both HMCR and PAR dynamically during the improvisation process. Wang and Huang (2010) proposed dynamic selection of the bw and PAR parameters: bw is replaced by the maximal and minimal values in the HM, and the PAR value is linearly decreased. Kattan et al. (2010) addressed the stopping criterion, which is replaced by a best-to-worst harmony ratio in the current harmony memory. Hadi Sarvari and Kamran Zamanifar (2012) proposed an improvement to HS based on statistical analysis, in which the new harmony and bw are modified.
Hybridizing HS with other meta-heuristic algorithms is also a well-known technique to increase the performance of HSA. Omran and Mahdavi (2008) use the PSO concept: the global best particle is incorporated by replacing the bw parameter and adding randomly selected decision variables from the best harmony vector in the HM. Fesanghary et al. (2008) proposed a new framework that combines HS with Sequential Quadratic Programming (SQP) to solve engineering optimization problems. Taherinejad (2009) modified HS by taking inspiration from SA's (simulated annealing's) cooling strategy and modifying the PAR value.


3. REVIEWS OF HARMONY SEARCH

3.1 Original Harmony Search


In 2001, Geem et al. developed and proposed the Harmony Search (HS) meta-heuristic algorithm, which was conceptualized using the musical process of searching for a perfect state of harmony. The original HS algorithm consists of a harmony memory (HM) of size HMS. HM stores the candidate solution vectors (of D decision variables), all of which are initialized randomly within the search space as follows:

HMi(d) = LB(d) + (UB(d) − LB(d)) × rand(), for i = 0 to HMS − 1 and d = 0 to D − 1,

where LB(d) and UB(d) are the lower and upper bounds of the search space for decision variable d, and rand() is a random function returning a number between 0 and 1 with uniform distribution.
Then an improvisation process is repeated as shown in Figure 3.1 to adjust the pitch in order to search for the perfect harmony, or the optimal solutions. HS has three parameters controlling the improvisation: the harmony memory considering rate (HMCR), the pitch adjustment rate (PAR), and the pitch adjustment bandwidth (bw). In each improvisation, or iteration, if rand() is less than HMCR, a trial vector is generated with memory consideration; otherwise the trial vector is obtained by random selection. A trial vector from memory consideration is further adjusted with a small pitch adjustment bandwidth (bw) if another rand() returns a number less than PAR. If the obtained trial vector is better than the worst harmony in the HM, the worst harmony is replaced with the trial vector. This procedure is repeated until a stopping criterion is satisfied.


Set parameters: HMCR, HMS, PAR, bw
Initialize HM = {HM0, HM1, ..., HMHMS-1}
i = 0
while no stopping criterion is met do
    for d = 0 to D − 1 do
        if rand() < HMCR then // memory consideration
            trial(d) = HMR(d), where R = IntRand(0, HMS − 1)
            if rand() < PAR then // pitch adjustment
                trial(d) = trial(d) ± rand() × bw
            end if
        else // random selection
            trial(d) = LB(d) + (UB(d) − LB(d)) × rand()
        end if
    end for
    if fitness(trial) is better than fitness(HMworst) then
        replace HMworst with trial
    end if
    i = i + 1
end while

Fig 3.1 The Original HS Algorithm

HMCR, ranging from 0 to 1, controls the balance of exploration and exploitation. A low
HMCR makes the HS behave toward a random search, whereas a high HMCR tends to
choose a vector from HM. Generally, the value of HMCR is preset as a constant close to 1.0
to perform more improvisation. Pitch adjustment rate (PAR) determines the chance that the
chosen harmony is further adjusted with a small distance bandwidth (bw). This adjustment is
analogous to the local search with a small step size.


3.2 Improved HS Variants


A few variants of the HS algorithm have been proposed since its introduction. Mahdavi et al. proposed the Improved Harmony Search (IHS), which dynamically increases PAR and decreases bw. The idea of decreasing bw is analogous to decreasing the inertia weight in Particle Swarm Optimization or the learning rate in neural networks. But, as pointed out by Wang and Huang, the increasing PAR is contradictory to the general strategy of continuously increasing local search capability.
Omran & Mahdavi recently enhanced the performance of HS by using the concept of the global best from Particle Swarm Optimization. In their proposed Global Harmony Search (GHS), the pitch adjustment with bw is replaced with the global best harmony, beneficially eliminating one parameter.
Wang & Huang proposed the Self-adaptive Harmony Search, hereafter called SHS, with automatically adjusted parameters. Their experiments and results are promising as well as trustworthy, with a full-factorial design, asymmetric initialization, and tests in both 30 and 100 dimensions. Unfortunately, their set of test functions is too small, and the intrinsic strength of the algorithm partially arises from the low-discrepancy sequences they use for initialization. More recently, Pan et al. proposed an improved global harmony search algorithm whose improvisation scheme attempts to retain good information in the current global best solution.


4. THE PROPOSED HARMONY SEARCH


This paper proposes an improved HS variant called Harmony Search with Adaptive Pitch Adjustment, or HSAPA for short. A large HMCR value favors memory consideration in harmony improvisation, thus increasing the convergence rate of the algorithm in most cases, while a small value of HMCR increases the diversity of the harmony memory. My preliminary tests found that a higher HMCR generally provides even better performance. Therefore, the proposed HSAPA uses HMCR = 0.995. The parameter PAR is very important in the fine-tuning of optimized solution vectors. Its constant value in traditional HS algorithms prolongs convergence due to the imbalance of global and local search capabilities during the optimization run. Furthermore, simultaneously dynamic HMCR and dynamic PAR can cause a twist of global and local search. This rationale confirms the prior discussion of using a constant HMCR and a dynamically decreasing PAR. Based on the results of prior experiments, linearly decreasing PAR from 1.0 to 0.0 during the optimization run outperforms other configurations in most cases.
The parameter bw is also very important. The original HS sets it at a constant bw = 0.01. IHS, on the other hand, uses an exponentially decreasing value, but the beginning and ending values are constants. GHS replaces this feature with the global best harmony, as mentioned in the previous section. SHS also replaces bw, as follows: the new harmony is updated according to the maximal and minimal values in the HM at the current iteration from either eq. (1) or eq. (2), with an equal chance.

trial(d) ← trial(d) + [max(HM(d)) − trial(d)] × rand() ........ (1)
trial(d) ← trial(d) − [trial(d) − min(HM(d))] × rand() ......... (2)

Subscript d denotes the dimension; max(HM(d)) and min(HM(d)) are the largest and smallest values of the d-th variable in the HM. In this way, the new harmony is guaranteed to stay within the search boundary, since the updated values are kept within min(HM(d)) and max(HM(d)). However, this method eliminates the benefit of exploring the search space outside the current boundary, and the equal chance of using eq. (1) or (2) results in a nonuniform distribution of the position of the new harmony.
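
A one-dimension sketch of this SHS-style update in Python (illustrative; the function and argument names are assumptions):

import random

def shs_update(trial_d, hm_d_values):
    # Move toward the maximum or the minimum of the d-th HM column with equal chance
    if random.random() < 0.5:
        return trial_d + (max(hm_d_values) - trial_d) * random.random()   # eq. (1)
    return trial_d - (trial_d - min(hm_d_values)) * random.random()       # eq. (2)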
In this work, the pitch adjustment approach is simple and partially inspired by the velocity clamping in classical variants of Particle Swarm Optimization. The pitch adjustment in this work is as follows:

trial(d) ← trial(d) ± η × range(d) × rand(d), where range(d) = max(HM(d)) − min(HM(d)) (3)
if trial(d) < LB(d) then trial(d) = LB(d) (4)
if trial(d) > UB(d) then trial(d) = UB(d) (5)

Here η denotes the scaling parameter introduced by HSAPA.


Set parameters: HMCR, HMS, η
Initialize HM = {HM0, HM1, ..., HMHMS-1}
i = 0
while no stopping criterion is met do
    PAR = 1.0 − i / maxiteration
    for d = 0 to D − 1 do
        calculate max(HM(d)), min(HM(d)), and range(d)
        if rand() < HMCR then // memory consideration
            trial(d) = HMR(d), where R = IntRand(0, HMS − 1)
            if rand() < PAR then // pitch adjustment
                apply pitch adjustment eqs. (3)-(5)
            end if
        else // random selection
            trial(d) = LB(d) + (UB(d) − LB(d)) × rand()
        end if
    end for
    if fitness(trial) is better than fitness(HMworst) then
        replace HMworst with trial
    end if
    i = i + 1
end while

Fig 4.1 The Proposed HSAPA Algorithm
In each iteration, the parameter bw in the original HS is replaced with η × range(d), making the pitch adjustment in eq. (3) dynamic. The stochastic oscillation in each dimension is restricted to the current range of harmony positions and is regulated by the parameter η. If the value of η is too high, the pitch may move erratically, going beyond a good solution; but if it is too small, the target may not be reached. Equations (4) and (5) are necessary to keep the harmony within the search boundary. The experimental results of using 13 widely accepted benchmark functions in Section 5 demonstrate that the performance is not sensitive to η, and its suggested value is between 0.4 and 0.5. Figure 4.1 outlines the proposed HSAPA algorithm as described above, where maxiteration equals the maximum number of iterations allowed to improvise or optimize. HMS is set at 50 instead of just 5 as in most former variants, as suggested by the full-factorial tests in SHS.
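
The adaptive pitch adjustment of eqs. (3)-(5) can be sketched in Python as follows (a minimal illustration; the symbol η and all identifiers are as assumed above):

import random

def hsapa_pitch_adjust(trial_d, hm_d_values, lb_d, ub_d, eta=0.45):
    # Eq. (3): oscillate within eta times the current HM range for this dimension
    rng = max(hm_d_values) - min(hm_d_values)
    trial_d += random.choice((-1, 1)) * eta * rng * random.random()
    # Eqs. (4) and (5): clamp the result to the search boundary
    return min(max(trial_d, lb_d), ub_d)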


5. PERFORMANCE EVALUATION
This section describes the experiments conducted to evaluate the performance of the proposed HSAPA and analyzes the results. Thirteen nonlinear optimization functions widely used in the literature are selected to evaluate HSAPA. Table 1 gives descriptions and the search ranges, for each dimension, of all 13 benchmark functions. They are minimization functions with optimal values of 0 and have different characteristics: the first 6 functions are unimodal, while the remaining 7 functions are more difficult multimodal functions. Since HSAPA introduces a new parameter η, its effect is investigated by varying η from 0.2 to 0.8 with a step size of 0.1. The results are compared against the original HS, IHS, GHS, and SHS, described earlier, as well as Opposition-based Differential Evolution (ODE), a state-of-the-art evolutionary algorithm. ODE is a recent variant of Differential Evolution (DE) enhanced by utilizing opposition-based learning in population initialization, generation jumping, and local improvement. ODE's performance was comprehensively proven using a large set of complex benchmark functions.
Recall that SHS was proposed with low-discrepancy sequences for initialization. From my own prior experiments, this special initialization improves the results to some extent. However, it has not been applied to all tested algorithms, and the goal of this experiment is to focus on the evolution design. Therefore, I decided not to test SHS with such low-discrepancy initialization; for a fair comparison, I instead use a common uniform random initialization for every algorithm.
5.1 Experimental Setting
Two sets of experiments are conducted on each test function, in 30 and 100 dimensions. For each function, 50 runs with independent seeds for the random number generator are performed for each algorithm. The average results are presented in Table 2 and Table 3 for the 30- and 100-dimensional cases, respectively. Table 4 and Table 5 summarize the results by ranking all algorithms for each function in 30 and 100 dimensions, respectively.
According to the no-free-lunch theorem, for any algorithm, any elevated performance over one class of problems is exactly paid for in performance over another class. Therefore, the major criterion for performance comparison in this work is the mean rank (R) achieved by each algorithm. R is computed as the average of that algorithm's ranks over all 13 functions. The mean ranks for the unimodal and multimodal function subsets (denoted R,Uni and R,Multi, respectively) are computed correspondingly.
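
A small Python sketch of how such mean ranks can be computed (the error matrix below contains made-up placeholder numbers, not results from this study):

import numpy as np

# Rows: benchmark functions; columns: algorithms; entries: mean errors (placeholders).
errors = np.array([[1e-8, 3e-5, 2e-7],
                   [4e-2, 1e-3, 9e-4],
                   [7e-1, 2e-6, 5e-6]])
ranks = errors.argsort(axis=1).argsort(axis=1) + 1   # rank per function, 1 = best
print(ranks.mean(axis=0))                            # R for each algorithm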


5.2 Experimental Results and Analysis


The results in Table 2 and Table 3 indicate that IHS and GHS fail to achieve the best results in nearly all cases. The overall R summaries (in the last lines) of Table 4 and Table 5 also show that both algorithms finish in the worst ranks of all algorithms tested, in both 30 and 100 dimensions. A careful investigation of the R lines in both tables suggests that the Rs of the original HS are superior to those of both IHS and GHS, which are quite comparable to each other.

1. 30D unimodal functions: ODE and HSAPA with η = 0.4 and 0.5 perform very well on the 30D unimodal functions and share the first rank in this category. SHS follows in 8th rank and beats only HSAPA with η = 0.2, which is the worst of all tested.

2. 30D multimodal functions: SHS's R,Multi surpasses ODE's by a small margin (6.5 vs. 7.0), but both algorithms are defeated by HSAPA with various η values (0.3 to 0.6). HSAPA with η = 0.4, 0.5, and 0.6 are ranked the top three, with R,Multi ranging from 3.17 to 4.33, far better than SHS and ODE. Table 2 also shows that only HSAPA with η = 0.4, 0.5, and 0.6 achieves the optimal value of 0 for f11, the Griewank function, in every run (mean = s.d. = 0).

3. 100D unimodal functions: The top three R,Uni belong to HSAPA with η = 0.4 and 0.5 and to ODE, respectively.

4. 100D multimodal functions: The top three R,Multi belong to HSAPA with η = 0.4, 0.5, and 0.3, respectively.

In summary, HSAPA with η = 0.4 and 0.5 is superior to all other tested algorithms in terms of R for both the unimodal and multimodal function sets, in both 30 and 100 dimensions. In addition, the performance of HSAPA is not very sensitive to the parameter η, especially for values ranging from 0.3 to 0.6.


6. CONCLUSION

The harmony search algorithm is a meta-heuristic algorithm that is a very important tool for optimization problems. It has been applied to diversified problems in past years, and the results are very effective compared to other meta-heuristic algorithms. HSA has also been modified and hybridized by different methods. Here one modified algorithm, IHSA, has been discussed along with its applications. HSA is very flexible, easy to implement, and easy to apply. In IHSA a few drawbacks of HSA are removed, and thereby the performance of HSA is improved. HSA has previously been applied to many different areas of discipline, but in mechanical engineering its applications are very few. A few of its applications are discussed in this paper, and the author hopes that this paper will inspire researchers to work on HSA by applying it in the mechanical engineering area. In the present study, the harmony search algorithm based optimization method has been researched for engineering optimization problems. The examples showed that the HS algorithm can easily obtain the minimum value of a function in fewer iterations on simple numerical problems. The HS parameters HMS, HMCR, and PAR affect the value of the objective function. For the harmony search algorithm, a high HMCR, especially from 0.7 to 0.95, contributes excellent outputs in the Fortran implementation used, while PAR and HMS demonstrate little correlation with improvement in performance.
The numerical results were compared with other studies, especially Genetic Algorithm based methods, to show the accuracy and performance of the HS method.


REFERENCES

1. Geem, Zong Woo, Kim, Joong Hoon, Loganathan, G. V., "A New Heuristic Optimization Algorithm: Harmony Search," SIMULATION 76:2, 2001, pp. 60-68.

2. Lee, Kang Seok, Geem, Zong Woo, "A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice," Comput. Methods Appl. Mech. Engrg. 194, Elsevier, 2005, pp. 3902-3933.

3. Geem, Zong Woo, Fesanghary, M., Choi, Jeong-Yoon, Saka, M. P., Williams, Justin C., Ayvaz, M. Tamer, Li, Liang, Ryu, Sam, Vasebi, A., "Recent Advances in Harmony Search," in: Advances in Evolutionary Algorithms, I-Tech Education and Publishing, Vienna, Austria, 2008.

4. Yang, X. S., "Harmony Search as a Metaheuristic Algorithm," in: Music-Inspired Harmony Search Algorithm: Theory and Applications (Editor Z. W. Geem), Studies in Computational Intelligence, vol. 191, Springer, Berlin, 2009, pp. 1-14.

5. Mahdavi, M., Fesanghary, M., Damangir, E., "An improved harmony search algorithm for solving optimization problems," Applied Mathematics and Computation 188, 2007, pp. 1567-1579.