Abstract

A novel robust hybrid metaheuristic optimization approach, which can be considered an improvement of the recently developed bat algorithm (BA), is proposed to solve global numerical optimization problems. The improvement adds the pitch adjustment operation of harmony search (HS) as a mutation operator during the bat position update, with the aim of speeding up convergence, thus making the approach more feasible for a wider range of real-world applications. The detailed implementation procedure for this improved metaheuristic method is also described. Fourteen standard benchmark functions are applied to verify the effects of these improvements, and it is demonstrated that, in most situations, the performance of this hybrid metaheuristic method (HS/BA) is superior to, or at least highly competitive with, the standard BA and other population-based optimization methods, such as ACO, BBO, DE, ES, GA, HS, PSO, and SGA. The effect of the HS/BA parameters is also analyzed.

1. Introduction

Optimization is the process of searching for the vector that yields an optimal value of a function: all feasible values are candidate solutions, and the extreme value is the optimal solution. In general, optimization algorithms are applied to solve such optimization problems. A simple way to classify optimization algorithms is by their nature, dividing them into two main categories: deterministic algorithms and stochastic algorithms. Deterministic algorithms, such as gradient-based hill climbing, follow a rigorous procedure and will generate the same set of solutions if the iterations commence from the same initial starting point. On the other hand, stochastic algorithms, which do not use gradients, often generate different solutions even with the same initial value. Generally speaking, however, the final values, though slightly different, will converge to the same optimal solution within a given accuracy [1]. Stochastic algorithms come in two types: heuristic and metaheuristic. Recently, nature-inspired metaheuristic algorithms have performed powerfully and efficiently in solving modern nonlinear global numerical optimization problems. To some extent, all metaheuristic algorithms strive to balance randomization (global search) and local search [2].

Inspired by nature, these strong metaheuristic algorithms are applied to solve NP-hard problems, such as UCAV path planning [11–14], test-sheet composition [15], WSN deployment [16], and water, geotechnical, and transport engineering [17]. Optimization algorithms cover all problems of searching for extreme values. These kinds of metaheuristic algorithms operate on a population of solutions and often find the best solutions. During the 1950s and 1960s, computer scientists studied the possibility of conceptualizing evolution as an optimization tool, and this generated a subset of gradient-free approaches named genetic algorithms (GA) [18]. Since then, many other nature-inspired metaheuristic algorithms have emerged, such as differential evolution (DE) [10, 19, 20], particle swarm optimization (PSO) [21–23], biogeography-based optimization (BBO) [24, 25], harmony search (HS) [26, 27], and, more recently, the bat algorithm (BA) [28, 29], which is inspired by the echolocation behavior of bats in nature.

First presented by Yang in 2010, the bat-inspired algorithm or bat algorithm (BA) [30, 31] is a metaheuristic search algorithm inspired by the echolocation behavior of bats with varying pulse rates of emission and loudness. The primary purpose of a bat's echolocation is to act as a signal system to sense distance and hunt for food/prey. A comprehensive review of swarm intelligence involving BA was performed by Parpinelli and Lopes [32]. Furthermore, Tsai et al. proposed an evolving bat algorithm (EBA) to improve the performance of BA with better efficiency [33].

First proposed by Geem et al. in 2001, harmony search (HS) [26] is a metaheuristic approach for minimizing possibly nondifferentiable and nonlinear functions in continuous space. HS is inspired by the behavior of a musician's improvisation process, where every musician attempts to improve his or her tune so as to create optimal harmony in a real-world musical performance. The HS algorithm originates in the similarity between engineering optimization and music improvisation: engineers search for a global optimal solution as determined by an objective function, just as musicians strive for aesthetic harmony as judged by aesthetic standards. In music improvisation, each musician chooses a pitch within the feasible range, together producing one harmony vector. If all the pitches produce a good solution, that experience is stored in each variable's memory, and the probability of producing a better solution next time is also increased. Furthermore, this approach requires few control variables, which makes HS easy to implement, robust, and very appropriate for parallel computation.

BA is a powerful algorithm in exploitation (i.e., local search), but at times it may become trapped in local optima, so that it cannot perform global search well. In BA, the search depends completely on random walks, so fast convergence cannot be guaranteed. Presented here for the first time, in order to increase the diversity of the population for BA and so avoid trapping in local optima, a main improvement is made to the BA: the pitch adjustment operation of HS is added as a mutation operator with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. That is to say, we combine the two approaches to propose a new hybrid metaheuristic algorithm according to the principles of HS and BA, and this improved BA method is then used to search for the optimal objective function value. The proposed approach is evaluated on fourteen standard benchmark functions that have been widely applied to verify optimization algorithms on continuous optimization problems. Experimental results show that HS/BA performs more efficiently and accurately than the basic BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA.

The structure of this paper is organized as follows: Section 2 briefly describes the global numerical optimization problem, the HS algorithm, and the basic BA. Our proposed approach HS/BA is presented in detail in Section 3. Subsequently, our method is evaluated through fourteen benchmark functions in Section 4, where HS/BA is also compared with BA, ACO, BBO, DE, ES, GA, HS, PSO, and SGA. Finally, Section 5 consists of the conclusion and proposals for future work.

2. Preliminary

To begin with, in this section we will provide a brief background on the optimization problem, harmony search (HS), and bat algorithm (BA).

2.1. Optimization Problem

In computer science, mathematics, management science, and control theory, optimization (also called mathematical optimization or mathematical programming) means the selection of an optimal solution from some set of feasible alternatives. In general, an optimization problem includes minimizing or maximizing a function by systematically selecting input values from a given feasible set and calculating the value of the function. More generally, optimization consists of finding the optimal values of some objective function within a given domain, including a number of different types of domains and different types of objective functions [34].

A global optimization problem can be described as follows. Given: a function $f \colon A \to \mathbb{R}$ from some set $A$ to the real numbers. Sought: an element $x_0 \in A$ such that $f(x_0) \le f(x)$ for all $x \in A$ ("minimization") or such that $f(x_0) \ge f(x)$ for all $x \in A$ ("maximization").

Such a formulation is named a numerical optimization problem. Many theoretical and practical problems may be modeled in this general framework. In general, $A$ is some subset of the Euclidean space $\mathbb{R}^n$, often specified by a group of constraints, equalities, or inequalities that the components of $x$ have to satisfy. The domain $A$ of $f$ is named the search space, while the elements of $A$ are named feasible solutions or candidate solutions. The function $f$ is called an objective function, cost function (minimization), or utility function (maximization). An optimal solution is a feasible solution that extremizes (minimizes or maximizes) the objective function.

Conventionally, the standard formulation of an optimization problem is stated in terms of minimization. In general, unless both the feasible region and the objective function are convex in a minimization problem, there may be more than one local minimum. A local minimum $x^*$ is defined as a point for which there exists some $\delta > 0$ such that $f(x^*) \le f(x)$ for all $x \in A$ with $\|x - x^*\| \le \delta$. More details about the optimization problem can be found in [35].

Global optimization is the branch of numerical analysis and applied mathematics that investigates deterministic algorithms able to guarantee convergence in finite time to the true optimal solution of a nonconvex problem; the corresponding problems are global numerical optimization problems. A variety of algorithms have been proposed to solve nonconvex problems. Among them, heuristic algorithms can provide approximate solutions to some optimization problems, as described in the introduction [36].

2.2. Harmony Search

First developed by Geem et al. in 2001, harmony search (HS) [26] is a relatively new metaheuristic optimization algorithm based on the natural musical performance process that occurs when a musician searches for an optimal state of harmony. The optimization operators of the HS algorithm are specified as follows: the harmony memory (HM), which keeps the solution vectors, all within the search space, as shown in (2); the harmony memory size HMS, which represents the number of solution vectors kept in the HM; the harmony memory consideration rate (HMCR); the pitch adjustment rate (PAR); and the pitch adjustment bandwidth (bw):

$$\mathrm{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_D^1 \\ x_1^2 & x_2^2 & \cdots & x_D^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_D^{\mathrm{HMS}} \end{bmatrix}. \quad (2)$$

In more detail, we can explain the HS algorithm by discussing the improvisation process of a music player. When a player is improvising, he or she has three possible options: (1) play any well-known piece of music (a series of pitches in harmony) exactly from his or her memory (corresponding to the harmony memory consideration rate, HMCR); (2) play something similar to a known piece in the player's memory (thus adjusting the pitch slightly); or (3) play a totally new or random pitch from the feasible range. If these three options are formalized for optimization, we have three corresponding components: employment of harmony memory, pitch adjusting, and randomization. Similarly, when each decision variable selects one value in the HS algorithm, it can make use of one of the above three rules in the whole HS procedure. If a new harmony vector is better than the worst harmony vector in the HM, the new harmony vector replaces the worst harmony vector in the HM. This procedure is repeated until a stopping criterion is satisfied.

The employment of harmony memory is significant, as it is analogous to the selection of the fittest individuals in the GA (genetic algorithm). This ensures that the best harmonies are carried over to the new harmony memory. For the purpose of using this memory effectively, we should properly set the value of the parameter HMCR. If this rate is very close to 1 (too high), almost all the harmonies in the harmony memory are reused, so other harmonies are not explored well, potentially leading to wrong solutions. If this rate is very close to 0 (extremely low), only a few best harmonies are chosen, and convergence may be too slow. Therefore, generally, HMCR = 0.7–0.95.

To adjust the pitch slightly in the second component, an appropriate approach must be applied to adjust the frequency efficiently. In theory, we can adjust the pitch linearly or nonlinearly, but in practice linear adjustment is used. If $x_{\text{old}}$ is the current solution (or pitch), then the new solution (pitch) $x_{\text{new}}$ is generated by

$$x_{\text{new}} = x_{\text{old}} + bw \cdot \varepsilon, \quad (3)$$

where $\varepsilon$ is a random real number drawn from the uniform distribution $[-1, 1]$ and bw is the bandwidth, controlling the local range of the pitch adjustment. We can see that the pitch adjustment (3) is in fact a random walk.
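As a minimal MATLAB sketch (our own illustration, not the original implementation; x_old, bw, and the bounds lb and ub are assumed to be defined), the pitch adjustment of (3) can be written as:

% Pitch adjustment of (3): a local random walk around the current pitch.
% x_old is the current solution (row vector); bw is the bandwidth;
% lb and ub are assumed lower/upper bounds used to keep the pitch feasible.
epsilon = 2 * rand(size(x_old)) - 1;   % uniform random number in [-1, 1]
x_new = x_old + bw .* epsilon;         % random walk of (3)
x_new = min(max(x_new, lb), ub);       % clip to the feasible range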

Pitch adjustment is similar to the mutation operator in GA. We must also set the parameter PAR appropriately to control the degree of the adjustment. If PAR is near 1 (too high), the solution is always changing and the algorithm may not converge at all. If it is very close to 0 (too low), there is very little change and the algorithm may converge prematurely. Therefore, we use PAR = 0.1–0.5 in most simulations, as usual.

For the purpose of increasing the diversity of the solutions, randomization is needed in the third component. Although adjusting the pitch has a similar role, it is confined to a certain local range and thus corresponds to a local search. The use of randomization can push the system further, to explore multifarious regions with high solution diversity, in order to find the global optimum. In real-world engineering applications, HS has been applied to solve many optimization problems, including function optimization, water distribution networks, groundwater modeling, energy-saving dispatch, structural design, and vehicle routing.

The mainframe of the basic HS algorithm can be described as shown in Algorithm 1, where $D$ is the number of decision variables and rand is a random real number in the interval (0, 1) drawn from a uniform distribution. From Algorithm 1, it is clear that there are only two control parameters in HS, namely HMCR and PAR.

Begin
Step  1: Set the parameters and initialize the HM.
Step  2: Evaluate the fitness for each individual in HM.
Step  3: while the halting criteria is not satisfied do
      for  d = 1 : D  do
       if rand < HMCR then    // memory consideration
          x′(d) = xᵃ(d), where a ∈ (1, 2, ..., HMS)
         if rand < PAR then       // pitch adjustment
           x′(d) = x′(d) + bw × (2 × rand − 1)
         endif
       else        // random selection
         x′(d) = x_min(d) + rand × (x_max(d) − x_min(d))
       endif
      endfor
      Update the HM as x^worst = x′, if f(x′) < f(x^worst) (minimization objective)
      Update the best harmony vector found so far
Step  4: end while
Step  5: Post-processing the results and visualization.
End.
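To make Algorithm 1 concrete, a single improvisation step might look as follows in MATLAB (a sketch under our own assumptions: HM is an HMS-by-D matrix, f stores the fitness of each stored harmony, fobj is the objective function, and lb/ub are the variable bounds):

% One HS improvisation step (sketch): build a new harmony element by element,
% then replace the worst member of the HM if the new harmony is better.
[HMS, D] = size(HM);
x_new = zeros(1, D);
for d = 1:D
    if rand < HMCR                       % memory consideration
        x_new(d) = HM(randi(HMS), d);
        if rand < PAR                    % pitch adjustment
            x_new(d) = x_new(d) + bw * (2 * rand - 1);
        end
    else                                 % random selection
        x_new(d) = lb(d) + rand * (ub(d) - lb(d));
    end
end
[f_worst, w] = max(f);                   % locate the worst harmony (minimization)
if fobj(x_new) < f_worst
    HM(w, :) = x_new;                    % the new harmony replaces the worst one
    f(w) = fobj(x_new);
end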

2.3. Bat Algorithm

The bat algorithm is a novel metaheuristic swarm intelligence optimization method developed for global numerical optimization, in which the search is inspired by the social behavior of bats and the phenomenon of echolocation used to sense distance.

In [28], for simplicity, the bat algorithm is based on idealizing some of the echolocation characteristics of bats, giving the following approximate or idealized rules:
(1) all bats apply echolocation to sense distance, and they always "know" the surroundings in some magical way;
(2) bats fly randomly with velocity $v_i$ at position $x_i$ with a fixed frequency $f_{\min}$, varying wavelength $\lambda$, and loudness $A_0$ to hunt for prey. They can spontaneously adjust the wavelength (or frequency) of their emitted pulses and the rate of pulse emission $r \in [0, 1]$, depending on the proximity of their target;
(3) although the loudness can change in different ways, it is supposed that the loudness varies from a minimum constant (positive) $A_{\min}$ to a large $A_0$.

Based on these approximations and idealizations, the basic steps of the bat algorithm (BA) can be described as shown in Algorithm 2.

Begin
Step  1: Initialization. Set the generation counter t = 1; initialize the population of NP bats
    randomly, each bat corresponding to a potential solution to
    the given problem; define loudness A_i and pulse frequency Q_i;
    set the initial velocities v_i and pulse rate r_i.
Step  2: While the termination criteria is not satisfied or t < MaxGeneration do
    Generate new solutions by adjusting frequency, and updating velocities
    and locations/solutions [(4)–(6)]
    if (rand > r_i) then
     Select a solution among the best solutions;
     Generate a local solution around the selected best solution
    end if
    Generate a new solution by flying randomly
    if (rand < A_i  &  f(x_i) < f(x*)) then
     Accept the new solutions
     Increase r_i and reduce A_i
    end if
    Rank the bats and find the current best x*
    t = t + 1;
Step  3: end while
Step  4: Post-processing the results and visualization.
End.

In BA, each bat is defined by its position $x_i$, velocity $v_i$, frequency $f_i$, loudness $A_i$, and emission pulse rate $r_i$ in a $D$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$f_i = f_{\min} + (f_{\max} - f_{\min})\,\beta, \quad (4)$$
$$v_i^t = v_i^{t-1} + (x_i^t - x_*)\,f_i, \quad (5)$$
$$x_i^t = x_i^{t-1} + v_i^t, \quad (6)$$

where $\beta \in [0, 1]$ is a random vector drawn from a uniform distribution. Here $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the bats. Generally speaking, depending on the domain size of the problem of interest, the frequency is assigned to $f_{\min} = 0$ and $f_{\max} = 100$ in practical implementations. Initially, each bat is randomly given a frequency drawn uniformly from $[f_{\min}, f_{\max}]$.
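In MATLAB-like form (a sketch; NP, D, Qmin, Qmax, and the population arrays x, v, and x_best are our assumed variables), one generation of the updates (4)–(6) reads:

% Updates (4)-(6) for the whole population (sketch).
% x and v are NP-by-D position/velocity matrices; x_best is 1-by-D.
beta = rand(NP, 1);                          % uniform random vector in [0, 1]
Q = Qmin + (Qmax - Qmin) .* beta;            % (4) frequency of each bat
v = v + (x - repmat(x_best, NP, 1)) .* repmat(Q, 1, D);   % (5) velocity update
x = x + v;                                   % (6) position update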

For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using a random walk

$$x_{\text{new}} = x_{\text{old}} + \varepsilon A^t, \quad (7)$$

where $\varepsilon \in [-1, 1]$ is a random scaling factor and $A^t$ is the average loudness of all the bats at time step $t$.
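A corresponding sketch of the local random walk (7), with A holding the loudness of every bat (again our assumed variables):

% Local search of (7) (sketch): random walk around a selected best solution.
epsilon = 2 * rand(1, D) - 1;          % scaling factor drawn from [-1, 1]
A_mean = mean(A);                      % average loudness of all bats at step t
x_local = x_best + epsilon * A_mean;   % new solution generated around the best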

The update of the velocities and positions of bats is similar to the procedure in the standard PSO [21], as the frequency $f_i$ essentially controls the pace and range of the movement of the swarming particles. To some degree, BA can be considered a balanced combination of the standard PSO and an intensive local search controlled by the loudness and pulse rate.

Furthermore, the loudness $A_i$ and the rate of pulse emission $r_i$ are updated accordingly as the iterations proceed, as shown in (8):

$$A_i^{t+1} = \alpha A_i^t, \qquad r_i^{t+1} = r_i^0 \left[ 1 - \exp(-\gamma t) \right], \quad (8)$$

where $\alpha$ and $\gamma$ are constants. In essence, $\alpha$ is similar to the cooling factor of a cooling schedule in simulated annealing (SA) [37]. For simplicity, we set $\alpha = \gamma = 0.9$ in this work.
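Equation (8) then amounts to the following per-bat update (sketch; r0 storing the initial pulse rates is our assumed variable):

% Update (8) (sketch): loudness decays and pulse emission rate grows
% whenever bat i accepts an improved solution.
alpha = 0.9;  gamma = 0.9;                % cooling-style constants, as in the text
A(i) = alpha * A(i);                      % A_i^{t+1} = alpha * A_i^t
r(i) = r0(i) * (1 - exp(-gamma * t));     % r_i^{t+1} = r_i^0 [1 - exp(-gamma t)]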

3. Our Approach: HS/BA

Based on the introduction of HS and BA in the previous section, in this section we explain how we combine the two approaches to form the proposed bat algorithm with harmony search (HS/BA), which modifies the solutions with poor fitness in order to increase the diversity of the population and improve the search efficiency.

In general, the standard BA algorithm is adept at exploiting the search space, but at times it may become trapped in local optima, so that it cannot perform global search well. For BA, the search depends completely on random walks, so fast convergence cannot be guaranteed. Presented here for the first time, in order to increase the diversity of the population for BA and so avoid trapping in local optima, a main improvement is made to the BA: the pitch adjustment operation of HS is added as a mutation operator with the aim of speeding up convergence, thus making the approach more feasible for a wider range of practical applications while preserving the attractive characteristics of the basic BA. In this paper, a hybrid metaheuristic algorithm obtained by introducing the pitch adjustment operation of HS as a mutation operator into the bat algorithm, the so-called harmony search/bat algorithm (HS/BA), is used to optimize the benchmark functions. The difference between HS/BA and BA is that the mutation operator is used to improve the original BA when generating a new solution for each bat. In this way, the method can explore new search space through the mutation of the HS algorithm and exploit the population information with BA, and it can therefore avoid trapping in local optima. In the following, we describe the HS/BA algorithm, which is an improvement of HS and BA.

The critical operator of HS/BA is the hybrid harmony search mutation operator, which combines the improvisation of harmony in HS with the BA. The core idea of the proposed hybrid mutation operator is based on two considerations. Firstly, poor solutions can absorb many useful features from good solutions. Secondly, the mutation operator can improve the exploration of new search space. In this way, the strong exploration abilities of the original HS and the exploitation abilities of the BA can be fully developed.

For the bat algorithm, as the search relies entirely on random walks, fast convergence cannot be guaranteed. Described here for the first time, a main improvement of adding a mutation operator is made to the BA, comprising three modifications, which are made with the aim of speeding up convergence, thus making the method more practical for a wider range of applications without losing the attractive features of the original method.

The first improvement is that we use a fixed frequency $Q$ and loudness $A$ instead of the varying frequency $Q_i$ and loudness $A_i$. As in BA, in HS/BA each bat is defined by its position $x_i$, velocity $v_i$, the emission pulse rate $r$, the fixed frequency $Q$, and loudness $A$ in a $D$-dimensional search space. The new solutions $x_i^t$ and velocities $v_i^t$ at time step $t$ are given by

$$v_i^t = v_i^{t-1} + (x_i^{t-1} - x_*)\,Q, \qquad x_i^t = x_i^{t-1} + v_i^t,$$

where $x_*$ is the current global best location (solution), which is located after comparing all the solutions among all the bats. In our experiments, $Q$ is kept fixed for all bats. Through a series of simulation experiments on the benchmarks in Section 4.2, it was found that setting the pulse rate $r$ to 0.6 and the loudness $A$ to 0.95 produced the best results.
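A sketch of this fixed-frequency update for bat i (Q is now a scalar shared by all bats; variable names are our own):

% HS/BA velocity/position update with a fixed scalar frequency Q (sketch).
v(i, :) = v(i, :) + (x(i, :) - x_best) * Q;   % fixed Q replaces the random f_i of (4)
x_u = x(i, :) + v(i, :);                      % candidate offspring from the BA update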

The second improvement is the addition of the mutation operator, in an attempt to increase the diversity of the population so as to improve the search efficiency and speed up convergence to the optima. For the local search part, once a solution is selected among the current best solutions, a new solution for each bat is generated locally using the random walk of (7) when rand is larger than the pulse rate $r$, that is, rand > $r$, where rand is a random real number drawn from a uniform distribution; when rand < $r$, we instead use the pitch adjustment operation of HS as a mutation operator, updating the new solution by (3) to increase the diversity of the population and improve the search efficiency. Through testing the benchmarks in Section 4.2, it was found that setting the harmony memory consideration rate HMCR to 0.95 and the pitch adjustment rate PAR to 0.1 produced the best results.
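Put together, the branching between the BA random walk (7) and the HS pitch adjustment mutation can be sketched as follows (the bat population itself plays the role of the harmony memory here, which is our reading of Algorithm 3; all variable names are our own):

% Hybrid mutation operator (sketch): rand > r triggers the BA local search;
% otherwise the HS improvisation path of (3) mutates the solution.
if rand > r
    x_new = x_best + (2 * rand(1, D) - 1) * mean(A);   % random walk of (7)
else
    x_new = x(i, :);
    for d = 1:D
        if rand < HMCR                                 % memory consideration
            x_new(d) = x(randi(NP), d);                % borrow from another bat
            if rand < PAR                              % pitch adjustment of (3)
                x_new(d) = x_new(d) + bw * (2 * rand - 1);
            end
        else                                           % random selection
            x_new(d) = lb(d) + rand * (ub(d) - lb(d));
        end
    end
end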

The last improvement is the addition of an elitism scheme to the HS/BA. As with other population-based optimization algorithms, we typically incorporate some sort of elitism in order to retain the best solutions in the population. This prevents the best solutions from being corrupted by the pitch adjustment operator. Note that we use an elitism approach to save the property of the bat that has the best solution in the HS/BA process, so even if the pitch adjustment operation ruins its corresponding bat, we have saved it and can revert to it if needed.
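The elitism scheme can be sketched as follows (f is the fitness vector of the population, smaller being better; variable names are our own):

% Elitism (sketch): save the KEEP best bats before the updates and
% restore them over the KEEP worst bats afterwards.
[~, idx] = sort(f);                      % ascending order: best first
keepbat = x(idx(1:KEEP), :);             % elite positions
keepfit = f(idx(1:KEEP));                % elite fitness values
% ... one generation of HS/BA updates modifies x and f here ...
[~, idx] = sort(f);
x(idx(end-KEEP+1:end), :) = keepbat;     % overwrite the worst with the elites
f(idx(end-KEEP+1:end)) = keepfit;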

Based on the above-mentioned analyses, the mainframe of the harmony search/bat algorithm (HS/BA) is presented in Algorithm 3.

Begin
Step  1: Initialization. Set the generation counter t = 1; initialize the population P of NP
    bats randomly, each bat corresponding to a potential
    solution to the given problem; define loudness A; set frequency Q,
    the initial velocities v, and pulse rate r; set the harmony memory consideration
    rate HMCR, the pitch adjustment rate PAR and bandwidth bw;
    set the maximum number of elite individuals retained, KEEP.
Step  2: Evaluate the quality f for each bat in P determined by the objective function f(x).
Step  3: While the termination criteria is not satisfied or t < MaxGeneration do
     Sort the population of bats P from best to worst by order of quality f for each bat.
     Store the KEEP best bats as KEEPBAT.
     for  i = 1 : NP (all bats) do
         v_i^t = v_i^{t−1} + (x_i^{t−1} − x*) × Q
         x_u^t = x_i^{t−1} + v_i^t
         if (rand > r) then
           x_w^t = x* + ε A^t    // local random walk around the best, cf. (7)
         end if
         for  j = 1 : D (all elements) do //Mutate
           if (rand < HMCR) then
            x_v^t(j) = x_a^t(j), where a ∈ (1, 2, ..., NP)
            if (rand < PAR) then
               x_v^t(j) = x_v^t(j) + bw × (2 × rand − 1)
            endif
           else
             x_v^t(j) = x_min(j) + rand × (x_max(j) − x_min(j))
           endif
         endfor
         Evaluate the fitness for the offsprings x_u^t, x_v^t, x_w^t
         Select the offspring x_k^t with the best fitness among the offsprings x_u^t, x_v^t, x_w^t.
         if (rand < A) then
           x_i^t = x_k^t;
         end if
         Replace the KEEP worst bats with the KEEP best bats KEEPBAT stored.
     end for
      t = t + 1;
Step  4: end while
Step  5: Post-processing the results and visualization;
End.

4. Simulation Experiments

In this section, we test the performance of the proposed metaheuristic HS/BA on global numerical optimization through a series of experiments conducted on benchmark functions.

To allow a fair comparison of running times, all the experiments were conducted on a PC with a Pentium IV processor running at 2.0 GHz, 512 MB of RAM, and a 160 GB hard drive. Our implementation was written in MATLAB R2012a (7.14) running under Windows XP SP3. No commercial BA or HS tools were used in the following experiments.

4.1. General Performance of HS/BA

In order to explore the benefits of HS/BA, in this subsection we compare its performance on global numerical optimization problems with that of nine other population-based optimization methods: ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. ACO (ant colony optimization) [38, 39] is a swarm intelligence algorithm for solving optimization problems that is based on the pheromone deposition of ants. BBO (biogeography-based optimization) [24] is a powerful and efficient evolutionary algorithm, developed for global optimization and inspired by the immigration and emigration of species between islands (or habitats) in search of more compatible islands. DE (differential evolution) [19] is a simple but excellent optimization method that uses the difference between two solutions to probabilistically adapt a third solution. An ES (evolution strategy) [40] is an algorithm that generally assigns equal importance to mutation and recombination and that allows two or more parents to reproduce an offspring. A GA (genetic algorithm) [18] is a search heuristic that mimics the process of natural evolution. PSO (particle swarm optimization) [21] is also a swarm intelligence algorithm, based on the swarm behavior of fish and bird schooling in nature. A stud genetic algorithm (SGA) [41] is a GA that uses the best individual at each generation for crossover.

In all experiments, we use the same parameters for HS, BA, and HS/BA: loudness A = 0.95, pulse rate r = 0.6, a scaling factor ε for the local random walk, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1 (the values found to perform best in Section 4.2). For ACO, BBO, DE, ES, GA, PSO, and SGA, we set the parameters as follows. For ACO, we set the initial pheromone value τ0, the pheromone update constant Q, the exploration constant q0, the global pheromone decay rate ρg, the local pheromone decay rate ρl, the pheromone sensitivity α, and the visibility sensitivity β. For BBO: habitat modification probability = 1, immigration probability bounds per gene = [0, 1], step size for numerical integration of probabilities = 1, maximum immigration and migration rates for each island = 1, and mutation probability = 0.005. For DE: a weighting factor F and a crossover constant CR. For the ES: the number of offspring λ produced in each generation and a standard deviation for changing solutions. For the GA, we used roulette wheel selection and single-point crossover with a crossover probability of 1 and a mutation probability of 0.01. For PSO, we set an inertial constant = 0.3, a cognitive constant = 1, and a social constant for swarm interaction = 1. For the SGA, we used single-point crossover with a crossover probability of 1 and a mutation probability of 0.01.
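For reference, the shared HS/BA settings (the values determined in Section 4.2) can be collected in a small MATLAB struct; this is our own convenience format, and the field names are not from the paper:

% Common parameter settings for HS, BA, and HS/BA used in the experiments.
par.A    = 0.95;   % loudness
par.r    = 0.6;    % pulse rate
par.HMCR = 0.95;   % harmony memory consideration rate
par.PAR  = 0.1;    % pitch adjustment rate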

Well-defined problem sets are favorable for evaluating the performance of the optimization methods proposed in this paper. Benchmark functions based on mathematical functions can be applied as objective functions to perform such tests, and their properties can be easily obtained from their definitions. Fourteen different benchmark functions are applied to verify our proposed metaheuristic algorithm HS/BA. Each of the functions in this study has 20 independent variables (i.e., D = 20).

The benchmark functions described in Table 1 are standard testing functions. The properties of the benchmark functions are given in Table 2. The modality property refers to the number of optima in the search space. Unimodal benchmark functions have only one optimum, which is the global optimum. Multimodal benchmark functions have at least two optima in their search space, that is, they have one or more local optima in addition to the global optimum. More details of all the benchmark functions can be found in [6].

We set the population size NP = 50, an elitism parameter KEEP = 2, and a maximum generation Maxgen = 50 for each algorithm. We ran 100 Monte Carlo simulations of each algorithm on each benchmark function to obtain representative performances. Tables 3 and 4 illustrate the results of the simulations. Table 3 shows the average minima found by each algorithm, averaged over 100 Monte Carlo runs. Table 4 shows the absolute best minima found by each algorithm over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 shows the best performance of each algorithm. The best value achieved for each test problem is shown in bold. Note that the normalizations in the tables are based on different scales, so values cannot be compared between the two tables.
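The experimental protocol amounts to the following loop (a sketch; hsba is a hypothetical driver returning the best minimum of one run, and fobj, lb, ub, NP, and MaxGen are assumed to be defined):

% 100 Monte Carlo runs per algorithm/benchmark pair (sketch).
runs = 100;
best_vals = zeros(runs, 1);
for k = 1:runs
    best_vals(k) = hsba(fobj, lb, ub, NP, MaxGen);   % hypothetical optimizer call
end
avg_min = mean(best_vals);    % "average minimum" reported in Table 3
abs_best = min(best_vals);    % "absolute best minimum" reported in Table 4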

From Table 3, we see that, on average, HS/BA is the most effective at finding the objective function minimum on ten of the fourteen benchmarks (F02, F03, F06–F11, and F13-F14). ACO is the second most effective, performing the best on the benchmarks F04 and F05. BBO and SGA are the third most effective, performing the best on the benchmarks F12 and F01, respectively, when multiple runs are made. Table 4 shows that HS/BA performs the best on thirteen of the fourteen benchmarks (F02–F14), while BBO performs the second best, finding the objective function minimum on the remaining benchmark (F01) when multiple runs are made. In addition, statistical analysis [42] of the values obtained by the ten methods on the 14 benchmark functions, based on Friedman's test [43], reveals that the differences in the obtained average and best function minima are statistically significant (both p-values are on the order of 10−17) at the 5% confidence level.

Furthermore, the computational requirements of the ten optimization methods were similar. We collected the average computational time of the optimization methods as applied to the 14 benchmarks discussed in this section. The results are shown in Table 3. BA was the quickest optimization method. HS/BA was the third fastest of the ten algorithms. However, it should be noted that in the vast majority of real-world applications, it is the fitness function evaluation that is by far the most expensive part of a population-based optimization algorithm.

Furthermore, convergence graphs of ACO, BA, BBO, DE, ES, GA, HS, HS/BA, PSO, and SGA, which represent the optimization process, are shown in Figures 1–14. The values shown in Figures 1–14 are the mean objective function optima achieved over 100 Monte Carlo simulations; they are the true objective function values, not normalized.

Figure 1 shows the results obtained for the ten methods when the F01 Ackley function is applied. From Figure 1, we can clearly draw the conclusion that BBO is significantly superior to all the other algorithms, including HS/BA, during the process of optimization, while HS/BA performs the second best on this multimodal benchmark function. Here, all the algorithms show almost the same starting point, and HS/BA has a faster convergence rate than the other algorithms, although it is outperformed by BBO after 13 generations. Although slower, SGA eventually finds a global minimum close to that of HS/BA. Obviously, BBO, HS/BA, and SGA outperform ACO, BA, DE, ES, GA, HS, and PSO during the whole process of searching for the global minimum.

Figure 2 illustrates the optimization results for the F02 Fletcher-Powell function. On this multimodal benchmark problem, it is obvious that HS/BA outperforms all other methods during the whole progress of optimization (except during generations 20–33, where BBO and HS/BA perform approximately equally).

Figure 3 shows the optimization results for the F03 Griewank function. From Table 3 and Figure 3, we can see that HS/BA performs the best on this multimodal benchmark problem. Looking at Figure 3 carefully, HS/BA shows a faster convergence rate initially than ACO; however, it is outperformed by ACO after 5 generations. Among the other algorithms, although slower, BBO, GA, and SGA eventually find a global minimum close to that of HS/BA, while BA, DE, ES, HS, and PSO fail to find the global minimum within the limited number of iterations.

Figure 4 shows the results for the F04 Penalty #1 function. From Figure 4, it is obvious that HS/BA outperforms all other methods during the whole progress of optimization on this multimodal benchmark function. Although slower, SGA and BBO perform the second and third best at finding the global minimum.

Figure 5 shows the performance achieved for the F05 Penalty #2 function. For this multimodal function, similar to the F04 Penalty #1 function shown in Figure 4, HS/BA is significantly superior to all the other algorithms during the process of optimization. Here, PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 3 generations.

Figure 6 shows the results achieved for the ten methods when using the F06 Quartic (with noise) function. From Table 3 and Figure 6, we can conclude that HS/BA performs the best on this multimodal function. Looking carefully at Figure 6, PSO and HS/BA show the same initial fast convergence towards the known minimum; as the procedure proceeds, HS/BA gets closer and closer to the minimum, while PSO converges prematurely and gets trapped in a local minimum. BA, ES, GA, HS, and PSO do not manage to succeed on this benchmark function within the maximum number of generations. In the end, BBO, DE, and SGA converge to values very close to that of HS/BA.

Figure 7 shows the optimization results for the F07 Rastrigin function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 7, we can roughly divide the algorithms into three groups: a first group, including BBO, HS/BA, and SGA, performing the best; a second group, including ACO, DE, GA, and PSO, performing the second best; and a third group, including BA, ES, and HS, performing the worst.

Figure 8 shows the results for the F08 Rosenbrock function. From Table 3 and Figure 8, we can see that HS/BA is superior to the other algorithms during the optimization process on this relatively simple unimodal benchmark function. Also, similar to the F06 Quartic (with noise) function shown in Figure 6, PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 4 generations and converges prematurely to a local minimum.

Figure 9 shows the equivalent results for the F09 Schwefel 2.26 function. From Figure 9, very clearly, though HS/BA is outperformed by BBO between generations 10–30, it has a stable convergence rate at finding the global minimum and significantly outperforms all other approaches on this multimodal benchmark function.

Figure 10 shows the results for the F10 Schwefel 1.2 function. From Figure 10, we can see that HS/BA performs far better than the other algorithms during the optimization process on this relatively simple unimodal benchmark function. PSO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 3 generations. Looking carefully at Figure 10, akin to the F07 Rastrigin function shown in Figure 7, all the algorithms except BA can be roughly divided into three groups: a first group, including BBO and HS/BA, performing the best; a second group, including ACO, GA, PSO, and SGA, performing the second best; and a third group, including DE, ES, and HS, performing the worst.

Figure 11 shows the results for the F11 Schwefel 2.22 function. Very clearly, BBO shows a faster convergence rate initially than HS/BA; however, it is outperformed by HS/BA after 40 generations. In the end, HS/BA reaches an optimal solution significantly superior to those of the other algorithms. BBO is inferior only to HS/BA and performs the second best on this unimodal function.

Figure 12 shows the results for the F12 Schwefel 2.21 function. Very clearly, HS/BA has the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches.

Figure 13 shows the results for the F13 Sphere function. From Table 3 and Figure 13, HS/BA has the fastest convergence rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13, regarding BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO, which approaches that of HS/BA.

Figure 14 shows the results for the F14 Step function. Clearly, HS/BA shows the fastest convergence rate at finding the global minimum and significantly outperforms all other approaches. Looking carefully at Figure 14, regarding BBO and SGA, we can see that BBO has a faster convergence rate than SGA, but SGA does finally converge to the value of BBO.

From the above analyses of Figures 1–14, we can conclude that our proposed hybrid metaheuristic algorithm HS/BA significantly outperforms the other nine algorithms. In general, ACO, BBO, and SGA are inferior only to HS/BA; they perform better than HS/BA on the benchmarks F04 and F05, the benchmark F12, and the benchmark F01, respectively. Further, benchmarks F04, F05, F06, F08, and F10 illustrate that PSO converges faster initially, while later it converges more and more slowly towards the true objective function value. Finally, we must point out that, in [24], Simon compared BBO with seven state-of-the-art EAs over 14 benchmark functions and a real-world sensor selection problem, and the results demonstrated the good performance of BBO. This also indirectly demonstrates that our proposed hybrid metaheuristic method HS/BA is a more powerful and efficient optimization algorithm than other population-based optimization algorithms.

4.2. Influence of Control Parameters

In [28], Yang concluded that, if the parameters in BA can be adjusted properly, it can outperform GA, HS (harmony search) [26], and PSO. The choice of the control parameters is of vital importance for different problems. To investigate the influence of the loudness A, pulse rate r, harmony memory consideration rate HMCR, and pitch adjustment rate PAR on the performance of HS/BA, we carried out experiments on the benchmark problems with different parameter values. All other parameter settings are kept unchanged (unless noted otherwise in the following paragraphs). The results are recorded in Tables 5–12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, and 11 show the best minima found by the HS/BA algorithm over 100 Monte Carlo runs, and Tables 6, 8, 10, and 12 show the average minima found by the HS/BA algorithm, averaged over 100 Monte Carlo runs. In other words, Tables 5, 7, 9, and 11 and Tables 6, 8, 10, and 12 show the best and average performance of the HS/BA algorithm, respectively. In each table, the last row is the total number of functions on which HS/BA performs the best with the given parameter values.

4.2.1. Loudness: A

Tables 5 and 6 record the results of runs on the benchmark problems with varying loudness A and fixed pulse rate r = 0.6, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 5 and 6, it can clearly be seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs only slightly differently, that is to say, these three benchmark functions are insensitive to the parameter A; (ii) for the benchmark functions F01, F06, F11, and F12, HS/BA performs better with smaller A (<0.5); (iii) however, for the functions F05, F07–F10, F13, and F14, HS/BA performs better with bigger A (>0.5). In sum, HS/BA performs the best when A is equal or very close to 1.0, so we set A = 0.95, which is very close to both 0.9 and 1.0, in the other experiments.

4.2.2. Pulse Rate: r

Tables 7 and 8 record the results of runs on the benchmark problems with varying pulse rate r and fixed loudness A = 0.95, harmony memory consideration rate HMCR = 0.95, and pitch adjustment rate PAR = 0.1. From Tables 7 and 8, we can evidently conclude that HS/BA performs significantly better with bigger r (>0.5) than with smaller r (<0.5) and that HS/BA performs the best when r is equal or very close to 0.6. So, we set r = 0.6 in the other experiments.

4.2.3. Harmony Memory Consideration Rate: HMCR

Tables 9 and 10 record the results of runs on the benchmark problems with varying harmony memory consideration rate HMCR and fixed loudness A = 0.95, pulse rate r = 0.6, and pitch adjustment rate PAR = 0.1. From Tables 9 and 10, we can see that the function values evaluated by HS/BA are better/smaller with bigger HMCR (>0.5) than with smaller HMCR (<0.5), and most benchmarks reach their minimum when HMCR is equal or very close to 1. So, we set HMCR = 0.95 in the other experiments.

4.2.4. Pitch Adjustment Rate: PAR

Tables 11 and 12 record the results of runs on the benchmark problems with varying pitch adjustment rate PAR and fixed loudness A = 0.95, pulse rate r = 0.6, and harmony memory consideration rate HMCR = 0.95. From Tables 11 and 12, we can see that the function values for HS/BA vary little with increasing PAR, and HS/BA reaches its optimum/minimum on most benchmarks when PAR is equal or very close to 0.1. So, we set PAR = 0.1 in the other experiments.

4.3. Discussion

In the HS/BA, the bats fly in the sky using echolocation to find food/prey (i.e., the best solutions). The four other parameters are the loudness (A), the rate of pulse emission (r), the harmony memory consideration rate (HMCR), and the pitch adjustment rate (PAR). The appropriate update of the pitch adjustment rate (PAR) and the rate of pulse emission (r) balances the exploration and exploitation behavior of each bat, respectively, as the loudness usually decreases once a bat has found its prey/solution, while the rate of pulse emission increases in order to raise the attack accuracy.

For all of the standard benchmark functions that have been considered, the HS/BA has been demonstrated to perform better than, or at least as well as, the standard BA and other acclaimed state-of-the-art population-based algorithms, with the HS/BA performing significantly better on some functions. The HS/BA performs excellently and efficiently because of its ability to carry out a local search while still searching globally at the same time. It succeeds in doing this through the local search of the harmony search algorithm and the global search of the bat algorithm running concurrently. A similar behavior may be achieved in PSO by using multiswarms from an initial particle population [44]. However, HS/BA's advantages include its simplicity, ease of implementation, and having only four parameters to regulate. The work carried out here demonstrates the HS/BA to be robust over all kinds of benchmark functions.

Benchmark evaluation is a good way for verifying the performance of the metaheuristic algorithms, but it also has limitations. First, we did not make any special effort to tune the optimization algorithms in this section. Different tuning parameter values in the optimization algorithms might result in significant differences in their performance. Second, real-world optimization problems may not have much of a relationship to benchmark functions. Third, benchmark tests might result in different conclusions if the grading criteria or problem setup change. In our work, we examined the mean and best results obtained with a certain population size and after a certain number of generations. However, we might arrive at different conclusions if (for example) we change the generation limit, or look at how many generations it takes to reach a certain function value, or if we change the population size. In spite of these caveats, the benchmark results shown here are promising for HS/BA and indicate that this novel method might be able to find a niche among the plethora of population-based optimization algorithms.

We note that CPU time is a bottleneck to the implementation of many population-based optimization algorithms. If an algorithm cannot converge fast, it will be impractical, since it would take too long to find an optimal or suboptimal solution. HS/BA does not seem to require an unreasonable amount of computational effort; of the ten optimization algorithms compared in this paper, HS/BA was the third fastest. Nevertheless, finding mechanisms to accelerate HS/BA could be an important area for further research.

5. Conclusion and Future Work

This paper proposed a hybrid metaheuristic HS/BA method for optimization problems. We improved the BA by combining it with the original harmony search (HS) algorithm and evaluated the HS/BA on multimodal numerical optimization problems. A novel type of BA model has been presented, and an improvement is applied to mutate between bats using the harmony search algorithm during the process of bat updating. Using the original configuration of the bat algorithm, we generate the new harmonies based on the newly generated bat in each iteration after the bat's position has been updated. The new harmony vector substitutes the newly generated bat only if it has better fitness. This selection scheme is rather greedy and often outperforms the original HS and BA. The HS/BA attempts to combine the merits of the BA and the HS in order to prevent all bats from getting trapped in inferior local optimal regions. The HS/BA enables the bats to have more diverse exemplars to learn from, as the bats are updated in each iteration and also form new harmonies to search in a larger search space. This new method can speed up the global convergence rate without losing the strong robustness of the basic BA. From the analysis of the experimental results, we observe that the proposed HS/BA uses the information in past solutions more effectively to generate better quality solutions frequently, compared to the other population-based optimization algorithms, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. Based on the results of the ten approaches on the test problems, we can conclude that the HS/BA significantly improves the performance of the HS and the BA on most multimodal and unimodal problems.

In this work, 14 benchmark functions are used to evaluate the performance of our approach; we will test our approach on more problems, such as the high-dimensional CEC 2010 test suite [45] and real-world problems. Moreover, we will compare HS/BA with other metaheuristic methods, such as DLHS [46, 47], SGHS [48], adaptive DE [49], CLPSO [50], and DMS-PSO [51]. In addition, we only consider unconstrained function optimization in this work. Our future work will consist of adding diversity rules into HS/BA for constrained optimization problems, such as the CEC 2010 test suite on constrained real-parameter optimization [52].

In the field of optimization, there are many issues worthy of further study, and efficient optimization methods should be developed depending on the analysis of the specific engineering problem at hand. Our future work will focus on two issues: on the one hand, we will apply our proposed approach HS/BA to solve practical engineering optimization problems, and, obviously, HS/BA can become a fascinating method for real-world engineering optimization problems; on the other hand, we will develop new meta-hybrid approaches to solve optimization problems.

Acknowledgments

This work was supported by State Key Laboratory of Laser Interaction with Material Research Fund under Grant no. SKLLIM0902-01 and Key Research Technology of Electric-discharge Non-chain Pulsed DF Laser under Grant no. LXJJ-11-Q80.