Abstract

A hybrid metaheuristic approach, HS/FA, obtained by hybridizing harmony search (HS) and the firefly algorithm (FA), is proposed for function optimization. In HS/FA, the exploration ability of HS and the exploitation ability of FA are fully exerted, so HS/FA converges faster than either HS or FA. In addition, a top-fireflies scheme is introduced to reduce running time, and HS is used to mutate fireflies during the update step. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods.

1. Introduction

In engineering problems, optimization means looking for a vector that maximizes or minimizes a function. Nowadays, stochastic methods are generally used to cope with optimization problems [1]. Although there are many ways to classify optimization algorithms, a simple one divides them into two groups according to their nature: deterministic and stochastic. Deterministic algorithms produce the same solutions whenever the initial conditions are unchanged, because they always follow the same rigorous moves. Stochastic algorithms, by contrast, rely on sampling from certain probability distributions and therefore generally produce different solutions on different runs, even with the same initial values. In practice, both kinds can find satisfactory solutions after some generations. Recently, nature-inspired algorithms have proven well capable of solving numerical optimization problems efficiently.

These metaheuristic approaches are developed to solve complicated problems, such as permutation flow shop scheduling [2], reliability [3, 4], high-dimensional function optimization [5], and other engineering problems [6, 7]. In the 1950s, natural evolution was idealized as an optimization technique, which led to a new type of approach, namely, genetic algorithms (GAs) [8]. After that, many other metaheuristic methods have appeared, such as evolutionary strategy (ES) [9, 10], ant colony optimization (ACO) [11], probability-based incremental learning (PBIL) [12], the big bang-big crunch algorithm [13–16], harmony search (HS) [17–19], charged system search (CSS) [20], artificial physics optimization [21], the bat algorithm (BA) [22, 23], animal migration optimization (AMO) [24], krill herd (KH) [25–27], differential evolution (DE) [28–31], particle swarm optimization (PSO) [32–35], the stud GA (SGA) [36], cuckoo search (CS) [37, 38], the artificial plant optimization algorithm (APOA) [39], biogeography-based optimization (BBO) [40], and the FA method [41, 42].

As a global optimization method, FA [42] was first proposed by Yang in 2008 and is inspired by the flashing behavior of firefly swarms. Recent research demonstrates that FA is quite powerful and relatively efficient [43]. Furthermore, the performance of FA can be improved with feasible, promising results [44]. In addition, nonconvex problems can be solved by FA [45]. A summary of swarm intelligence methods, including FA, is given by Parpinelli and Lopes [46].

On the other hand, HS [17, 47] is a heuristic technique for optimization problems. In engineering optimization, engineers try to find the optimum defined by an objective function, while in the music improvisation process, musicians search for the most satisfactory harmony as judged aesthetically. The HS method originates from this similarity [1].

In most cases, FA can find an optimal solution through its exploitation. However, the search used in FA is based on randomness, so it cannot always reach the global optimum. On the one hand, to improve the diversity of the fireflies, HS is added to FA, where it can be treated as a mutation operator; by combining the principles of HS and FA, an enhanced FA is proposed to search for the best objective function value. On the other hand, FA needs considerable time to search for the best solution, and its performance deteriorates significantly as the population size increases. In HS/FA, a top-fireflies scheme is therefore introduced to reduce running time. This scheme is carried out by shrinking the outer loop of FA, so the time complexity of HS/FA decreases from O(NP²) to O(KEEP · NP), where KEEP is the number of top fireflies. The proposed approach is evaluated on various benchmarks. The results demonstrate that HS/FA performs more effectively and accurately than FA and other intelligent algorithms.

The rest of this paper is structured as follows. To begin with, brief backgrounds on HS and FA are provided in Sections 2 and 3, respectively. The proposed HS/FA is presented in Section 4 and verified on various functions in Section 5, and Section 6 presents the general conclusions.

2. HS Method

As a population-based optimization technique, HS works with the following components and parameters [17, 48, 49]: HM, the harmony memory, as shown in (1); HMS, the harmony memory size; HMCR, the harmony memory consideration rate; PAR, the pitch adjustment rate; and bw, the pitch adjustment bandwidth [1].

Consider

    HM = [ x_1^1     x_2^1     ...   x_D^1
           x_1^2     x_2^2     ...   x_D^2
           ...       ...       ...   ...
           x_1^HMS   x_2^HMS   ...   x_D^HMS ],                (1)

where each row x^i = (x_1^i, x_2^i, ..., x_D^i) is one harmony (candidate solution) stored in the memory and D is the number of decision variables.

The HS method can be explained in terms of the player's improvisation process. A player has three feasible options during music improvisation: (1) play a pitch exactly from memory; (2) play a pitch adjacent to one in memory; or (3) improvise a completely new pitch [1]. These three options can be idealized into three components: use of the HM, pitch adjusting, and randomization [1].

The first component, which is similar to selecting the best individuals in a GA, is important because it guarantees that the best harmonies are not lost from the HM [1]. To make HS effective, the parameter HMCR should be set properly [1]. Experience from several experiments suggests HMCR = 0.7–0.95 in most cases.

The pitch in the second component needs to be adjusted slightly, and a proper rule is used to tune the frequency [1]. The new pitch is updated by

    x_new = x_old + bw × (2ε − 1),                              (2)

where ε is a random number in [0, 1], x_old is the current pitch, and bw is the bandwidth.

The parameter PAR should also be set appropriately. If PAR is very close to 1, the solution is perturbed constantly and HS is hard to converge; if it is close to 0, little change is made and HS may converge prematurely. Therefore, we set PAR = 0.1–0.5 [1].

To improve diversity, the randomization provided by the third component is necessary. Randomization allows the method to step further into promising regions of the search space so as to find the global optimum [1].

The HS method is presented in Algorithm 1, where D is the number of decision variables and rand is a random real number drawn from the uniform distribution on [0, 1].

Begin
  Step  1. Initialize the HM.
  Step  2. Evaluate the fitness.
  Step  3.  while the halting criteria is not satisfied do
    for  d = 1 : D  do
      if  rand < HMCR then  // memory consideration
        x_new(d) = x_r(d),  where r is a random integer in {1, 2, ..., HMS}
        if rand < PAR then   // pitch adjustment
          x_new(d) = x_new(d) + bw × (2 × rand − 1)
        endif
      else           // random selection
        x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
      endif
    endfor d
    Update the HM as  x_worst = x_new,  if  f(x_new) < f(x_worst)  (minimization objective)
    Update the best harmony vector
  Step  4. end  while
  Step  5.  Output results.
End.
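For concreteness, the improvisation step of Algorithm 1 can be sketched in Python as below. This is a minimal sketch rather than the authors' implementation; the names hs_step and sphere, the bandwidth bw = 0.01, and the sphere objective on [-5, 5]^10 are illustrative assumptions.

import numpy as np

def hs_step(HM, fit, f, lb, ub, HMCR=0.9, PAR=0.1, bw=0.01):
    """One improvisation of HS for minimization, following Algorithm 1."""
    HMS, D = HM.shape
    x_new = np.empty(D)
    for d in range(D):
        if np.random.rand() < HMCR:                      # memory consideration
            x_new[d] = HM[np.random.randint(HMS), d]
            if np.random.rand() < PAR:                   # pitch adjustment, cf. (2)
                x_new[d] += bw * (2.0 * np.random.rand() - 1.0)
        else:                                            # random selection
            x_new[d] = lb[d] + np.random.rand() * (ub[d] - lb[d])
    x_new = np.clip(x_new, lb, ub)
    worst = np.argmax(fit)                               # index of the worst harmony
    if f(x_new) < fit[worst]:                            # replace it if the new one is better
        HM[worst], fit[worst] = x_new, f(x_new)
    return HM, fit

# usage: minimize the sphere function on [-5, 5]^10
D, HMS = 10, 20
lb, ub = -5.0 * np.ones(D), 5.0 * np.ones(D)
sphere = lambda x: float(np.sum(x ** 2))
HM = lb + np.random.rand(HMS, D) * (ub - lb)
fit = np.array([sphere(x) for x in HM])
for _ in range(2000):
    HM, fit = hs_step(HM, fit, sphere, lb, ub)
print("best value found:", fit.min())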

3. FA Method

FA [42] is a metaheuristic approach for optimization problems whose search strategy comes from the swarm behavior of fireflies [50]. Two issues are significant in FA: the formulation of attractiveness and the variation of light intensity [42].

For simplicity, several characteristics of fireflies are idealized into three rules described in [51]. Based on these three rules, the FA can be described in Algorithm 2.

Begin
  Step  1. Initialization. Set G = 1; define the light absorption coefficient γ; set the step size α and the attractiveness β_0 at r = 0.
  Step  2. Evaluate the light intensity I determined by the objective function f(x).
  Step  3.  While  G < MaxGeneration do
     for  i = 1 : NP (all NP fireflies) do
        for  j = 1 : NP (all NP fireflies) do
        if (I_i < I_j) then
        move firefly i towards j;
        end if
        Update attractiveness;
        Update light intensity;
        end for  j
     end for  i
      G = G + 1;
  Step  4. end while
  Step  5. Output the results.
End.

For two fireflies x_i and x_j, firefly i is updated as follows:

    x_i = x_i + β_0 e^(−γ r_ij²) (x_j − x_i) + α ε_i,

where α is the step size, β_0 is the attractiveness at r = 0, r_ij is the distance between fireflies i and j, the second term is the attraction, and the third term is randomization with a vector of random numbers ε_i [50]. In our present work, we take β_0 = 1, α ∈ [0, 1], and γ = 1 [50].
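The movement rule above can be written as a short Python sketch. The helper name move_firefly and the concrete step size alpha = 0.2 are our own assumptions; only β_0 = 1 and γ = 1 follow the settings quoted from [50].

import numpy as np

def move_firefly(x_i, x_j, lb, ub, beta0=1.0, gamma=1.0, alpha=0.2):
    """Move firefly i towards a brighter firefly j using the FA update rule."""
    r2 = np.sum((x_i - x_j) ** 2)                   # squared distance r_ij^2
    beta = beta0 * np.exp(-gamma * r2)              # attractiveness beta_0 * exp(-gamma r^2)
    eps = np.random.rand(x_i.size) - 0.5            # random vector for the randomization term
    x_i = x_i + beta * (x_j - x_i) + alpha * eps    # attraction + random walk
    return np.clip(x_i, lb, ub)

# usage: move a firefly at the origin towards a brighter one at (1, 1)
lb, ub = -5.0 * np.ones(2), 5.0 * np.ones(2)
print(move_firefly(np.zeros(2), np.ones(2), lb, ub))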

4. HS/FA

Based on the introductions to HS and FA in the previous sections, the combination of the two approaches is now described and HS/FA is proposed; it updates poor solutions to accelerate convergence.

HS is adept at exploring the search space, while FA is adept at exploiting solutions. Therefore, in the present work, a hybrid method named HS/FA, obtained by embedding HS into FA, is used to deal with optimization problems; the HS component can be considered a mutation operator. With this strategy, the HS mutation explores new regions of the search space while FA exploits the population, which overcomes the lack of exploration in FA.

To counteract the blind random walks used in FA, two detailed improvements are introduced into FA in the present work.

The first is the introduction of a top-fireflies scheme into FA to reduce running time; it is analogous to the elitism scheme frequently used in other population-based optimization algorithms. In FA, because of the double loop over all fireflies, the time complexity is O(NP²), and performance deteriorates significantly as the population size increases. This improvement is carried out by shrinking the outer loop of FA. In HS/FA, the fireflies with optimal or near-optimal fitness (i.e., the brightest fireflies) are selected to form the set of top fireflies, and all fireflies move only towards the top fireflies. Through the top-fireflies scheme, the time complexity of HS/FA decreases from O(NP²) to O(KEEP · NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so HS/FA needs much less time than FA. Clearly, if KEEP = NP, HS/FA degenerates to the standard FA. If KEEP is too small, only a few of the best fireflies form the top set, the algorithm converges too fast, and it may become premature for lack of diversity. If KEEP is very large (near NP), almost all fireflies form the top set, so all fireflies are explored well, which can lead to optimal solutions, but the algorithm becomes computationally expensive and converges too slowly. Therefore, we use KEEP = 2 in our study.
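The effect of this scheme on the amount of work per generation can be illustrated by a simple count of the inner-loop attraction checks; the values NP = 50 and KEEP = 2 below are only an example.

NP, KEEP = 50, 2
full_fa = NP * NP      # attraction checks per generation in standard FA: O(NP^2)
hs_fa = KEEP * NP      # with the top-fireflies scheme: O(KEEP * NP)
print(full_fa, hs_fa, full_fa // hs_fa)   # 2500 100 25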

The second is the addition of HS, serving as a mutation operator that improves population diversity and helps avoid premature convergence. In the standard FA, if firefly j is brighter than firefly i, firefly i moves towards firefly j, and the newly generated firefly is then evaluated and its light intensity updated; otherwise, firefly i does nothing. In HS/FA, by contrast, if the top firefly i is not brighter than firefly j, firefly j is updated by the mutation operation to improve its light intensity. More concretely, for the global search part, every element x_j(k) (k = 1, 2, ..., D) of x_j (the position of firefly j) is tuned using HS. When rand is not less than HMCR, that is, rand ≥ HMCR, the element is regenerated randomly within its bounds; whereas when rand < HMCR, the element is taken from a randomly chosen firefly x_r1. In the latter case, the pitch adjustment operation of HS is further applied to the element if rand < PAR to increase population diversity, as shown in (2). In Algorithm 3, rand, rand1, and rand2 denote uniformly distributed random numbers in [0, 1], r1 is a random integer in [1, NP], and NP is the population size.
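The mutation operator described above can be sketched in Python as follows; the function name hs_mutate, the default bandwidth bw = 0.01, and the bound handling are assumptions made for illustration, with the whole population playing the role of the harmony memory.

import numpy as np

def hs_mutate(pop, j, lb, ub, HMCR=0.9, PAR=0.1, bw=0.01):
    """Regenerate firefly j element by element using the HS rules described above."""
    NP, D = pop.shape
    v = pop[j].copy()
    for k in range(D):
        if np.random.rand() < HMCR:                 # take the element from a random firefly
            r1 = np.random.randint(NP)
            v[k] = pop[r1, k]
            if np.random.rand() < PAR:              # pitch adjustment as in (2)
                v[k] += bw * (2.0 * np.random.rand() - 1.0)
        else:                                       # random re-initialization of the element
            v[k] = lb[k] + np.random.rand() * (ub[k] - lb[k])
    return np.clip(v, lb, ub)

# usage: mutate firefly 0 of a random 5 x 3 population in [-1, 1]^3
pop = np.random.uniform(-1.0, 1.0, size=(5, 3))
print(hs_mutate(pop, 0, -np.ones(3), np.ones(3)))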

In sum, HS/FA is presented in detail in Algorithm 3.

Begin
  Step  1. Initialization. Set t = 1; define the absorption coefficient γ; set the step size α and the attractiveness β_0 at r = 0; set HMCR, PAR, and bw; set the
    number of top fireflies KEEP.
  Step  2. Evaluate the light intensity I.
  Step  3. While  t < MaxGeneration do
      Sort the fireflies by light intensity I;
      for  i = 1 : KEEP (all Top fireflies) do
       for  j = 1 : NP (all fireflies) do
       if  I_j < I_i  then
          Move firefly j towards the top firefly i;
       else
          for  k = 1 : D (all elements) do  // Mutate
         if  (rand < HMCR) then       // memory consideration
           r1 = ⌈NP × rand1⌉
           x_j(k) = x_r1(k)
           if (rand < PAR) then       // pitch adjustment as in (2)
             x_j(k) = x_j(k) + bw × (2 × rand2 − 1)
           end if
         else                         // random selection
           x_j(k) = x_min,k + rand × (x_max,k − x_min,k)
         end if
          end for  k
          end for  k
       end if
       Update attractiveness;
       Update light intensity;
       end for  j
      end for  i
      Evaluate the light intensity I.
      Sort the population by light intensity I;
      t = t + 1;
  Step  4. end while
End.
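Putting the pieces together, a compact, self-contained Python sketch of Algorithm 3 is given below. It follows the loop structure and the parameter values quoted in the text (β_0 = 1, γ = 1, HMCR = 0.9, PAR = 0.1, KEEP = 2); the step size alpha, the bandwidth bw, the bound handling, the function name hsfa, and the sphere test objective are our own assumptions rather than the authors' exact implementation.

import numpy as np

def hsfa(f, lb, ub, NP=50, KEEP=2, max_gen=100,
         beta0=1.0, gamma=1.0, alpha=0.2, HMCR=0.9, PAR=0.1, bw=0.01):
    """Minimal sketch of the HS/FA loop in Algorithm 3 (minimization)."""
    D = lb.size
    pop = lb + np.random.rand(NP, D) * (ub - lb)
    fit = np.array([f(x) for x in pop])
    for _ in range(max_gen):
        order = np.argsort(fit)                      # sort fireflies by light intensity
        pop, fit = pop[order], fit[order]
        for i in range(KEEP):                        # outer loop: top fireflies only
            for j in range(NP):                      # inner loop: all fireflies
                if fit[i] < fit[j]:                  # top firefly i is brighter: attract j
                    r2 = np.sum((pop[j] - pop[i]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[j] = (pop[j] + beta * (pop[i] - pop[j])
                              + alpha * (np.random.rand(D) - 0.5))
                else:                                # otherwise mutate firefly j with HS
                    for k in range(D):
                        if np.random.rand() < HMCR:
                            pop[j, k] = pop[np.random.randint(NP), k]
                            if np.random.rand() < PAR:
                                pop[j, k] += bw * (2.0 * np.random.rand() - 1.0)
                        else:
                            pop[j, k] = lb[k] + np.random.rand() * (ub[k] - lb[k])
                pop[j] = np.clip(pop[j], lb, ub)
                fit[j] = f(pop[j])                   # update light intensity
    return pop[np.argmin(fit)], float(fit.min())

# usage: 30-dimensional sphere function on [-5.12, 5.12]^30
D = 30
best_x, best_f = hsfa(lambda x: float(np.sum(x ** 2)),
                      lb=-5.12 * np.ones(D), ub=5.12 * np.ones(D))
print("best value found:", best_f)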

5. The Results

The HS/FA method is tested through several simulations on a set of benchmark optimization problems. To make a fair comparison between the different methods, all experiments were conducted under the same conditions described in [1].

In this section, HS/FA is compared on these optimization problems with nine other methods: ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. For HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ = 1.0, HMCR = 0.9, and PAR = 0.1. The parameters used in the other methods follow [48, 53]. Thirty-six functions, listed in Table 1, are used to verify our HS/FA method. More details on all the benchmarks can be found in [54].

Because all of these intelligent algorithms involve some randomness, we performed 500 independent runs of each method on each problem to obtain representative statistics. Tables 2 and 3 report the average and best results found by each algorithm, respectively. Note that two different scales were used to normalize the values in the tables; the detailed procedure can be found in [54]. The dimension of each function is set to 30.

From Table 2, on average HS/FA finds the best function minimum on twenty-eight of the thirty-six functions, while FA performs second best, doing so on ten of the thirty-six. Table 3 shows that HS/FA and FA obtain the best result on twenty-two and seventeen of the thirty-six functions, respectively, while ACO, DE, and GA perform best on eight benchmarks. From these tables we can see that, for low-dimensional functions, both FA and HS/FA perform well, with little difference between them.

Furthermore, convergence graphs of the ten methods for the most representative functions are shown in Figures 1, 2, 3, and 4, which illustrate the optimization process. The values plotted are the actual mean function values from the above experiments.

F26 is a complicated multimodal function with a single global minimum of 0 and several local optima. Figure 1 shows that HS/FA converges to the global minimum fastest. FA converges slightly faster at the very beginning, but it appears to become trapped in a local minimum, as its function value then decreases only slightly.

F28 is also a multimodal problem with a single global minimum of 0. For this problem, HS/FA is superior to the other nine methods and reaches the optimal value earliest.

For the function in Figure 3, HS/FA significantly outperforms all the other methods throughout the optimization process and finally converges to the best solution. BBO is inferior only to HS/FA and performs second best in this case.

For the function in Figure 4, HS/FA again significantly outperforms all the other methods. Figure 4 also indicates that, at the early stage of the optimization process, FA converges faster than HS/FA, whereas HS/FA keeps improving its solution steadily in the long run. FA converges faster within the first 20 iterations, but it then appears to be trapped in a local minimum, as its function value decreases only slightly afterwards, and it is overtaken by HS/FA after about 30 iterations.

From Figures 1–4, the performance of HS/FA is far better than that of the other methods. In general, BBO and FA, especially FA, are inferior only to HS/FA. Note that, in [40], BBO was compared with seven EAs and tested on an engineering problem, and those experiments demonstrated the excellent performance of BBO. This indirectly indicates that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems. FA is enhanced by combining it with the basic HS method. In HS/FA, a top-fireflies scheme is introduced to reduce running time, and HS is used as a mutation operator when the fireflies are updated. The new harmony vector replaces a firefly only if it is better than before, which generally makes HS/FA outperform both HS and FA. HS/FA strives to exploit the merits of FA and HS so as to prevent the fireflies from all being trapped in local optima. Benchmark evaluations on the test problems were used to compare HS/FA with the nine other approaches. The results demonstrate that HS/FA uses the available information more efficiently and finds much better values than the other optimization algorithms.