The Scientific World Journal
Special Issue: Recent Advances on Bioinspired Computation
Research Article | Open Access
Volume 2013 | Article ID 125625 | 9 pages | https://doi.org/10.1155/2013/125625

An Effective Hybrid Firefly Algorithm with Harmony Search for Global Numerical Optimization

Academic Editor: X. Yang
Received: 10 Aug 2013
Accepted: 29 Sep 2013
Published: 20 Nov 2013

Abstract

A hybrid metaheuristic approach, HS/FA, obtained by hybridizing harmony search (HS) and the firefly algorithm (FA), is proposed for function optimization. In HS/FA, the exploration of HS and the exploitation of FA are fully exerted, so HS/FA converges faster than either HS or FA. In addition, a top fireflies scheme is introduced to reduce running time, and HS is utilized to mutate fireflies during the update. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods.

1. Introduction

In engineering problems, optimization means looking for a vector that maximizes or minimizes a function. Nowadays, stochastic methods are generally utilized to cope with optimization problems [1]. Though there are many ways to classify optimization algorithms, a simple one divides them into two groups according to their nature: deterministic and stochastic. Deterministic algorithms obtain the same solutions whenever the initial conditions are unchanged, because they always follow the same rigorous procedure. Stochastic ones, in contrast, are based on certain stochastic distributions, so they generally generate different solutions even from the same initial values. In fact, both types can find satisfactory solutions after some generations. Recently, nature-inspired algorithms have proved well capable of solving numerical optimization problems efficiently.

These metaheuristic approaches are developed to solve complicated problems, like permutation flow shop scheduling [2], reliability [3, 4], high-dimensional function optimization [5], and other engineering problems [6, 7]. In the 1950s, natural evolution was idealized as an optimization technology, which gave rise to a new type of approach, namely, genetic algorithms (GAs) [8]. After that, many other metaheuristic methods appeared, like evolutionary strategy (ES) [9, 10], ant colony optimization (ACO) [11], probability-based incremental learning (PBIL) [12], the big bang-big crunch algorithm [13–16], harmony search (HS) [17–19], charged system search (CSS) [20], artificial physics optimization [21], the bat algorithm (BA) [22, 23], animal migration optimization (AMO) [24], krill herd (KH) [25–27], differential evolution (DE) [28–31], particle swarm optimization (PSO) [32–35], the stud GA (SGA) [36], cuckoo search (CS) [37, 38], the artificial plant optimization algorithm (APOA) [39], biogeography-based optimization (BBO) [40], and the FA method [41, 42].

As a global optimization method, FA [42] was first proposed by Yang in 2008, and it originates from the swarm behavior of fireflies. Recent research demonstrates that FA is quite powerful and relatively efficient [43]. Furthermore, the performance of FA can be improved with feasible promising results [44]. In addition, nonconvex problems can be solved by FA [45]. A summary of swarm intelligence methods, including FA, is given by Parpinelli and Lopes [46].

On the other hand, HS [17, 47] is a novel heuristic technique for optimization problems. In engineering optimization, engineers make an effort to find the optimum that is decided by an objective function, while in the music improvisation process musicians search for the most satisfactory harmony as judged aesthetically. The HS method originates from the similarity between these two processes [1].

In most cases, FA can find the optimal solution through its exploitation. However, the search used in FA is based on randomness, so it cannot always reach the global best values. On the one hand, to improve the diversity of fireflies, HS is added to FA, where it can be treated as a mutation operator; by combining the principles of HS and FA, an enhanced FA is proposed to look for the best objective function value. On the other hand, FA needs much more time to search for the best solution, and its performance deteriorates significantly as the population size increases. In HS/FA, a top fireflies scheme is introduced to reduce running time; it is carried out by reducing the outer loop in FA. Through the top fireflies scheme, the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. The proposed approach is evaluated on various benchmarks, and the results demonstrate that HS/FA performs more effectively and accurately than FA and other intelligent algorithms.

The rest of this paper is structured as follows. To begin with, brief backgrounds on HS and FA are provided in Sections 2 and 3, respectively. Our proposed HS/FA is then presented in Section 4 and verified through various benchmark functions in Section 5. Section 6 presents the general conclusions.

2. HS Method

As a relatively new optimization technique, HS makes use of the following optimization operators [17, 48, 49]: HM, the harmony memory, as shown in (1); HMS, the harmony memory size; HMCR, the harmony memory consideration rate; PAR, the pitch adjustment rate; and bw, the pitch adjustment bandwidth [1].

Consider

$$\mathrm{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_D^1 \\ x_1^2 & x_2^2 & \cdots & x_D^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_D^{\mathrm{HMS}} \end{bmatrix}. \tag{1}$$

The HS method can be explained in terms of the player's improvisation process. There are three feasible options for a player in the music improvisation process: (1) play pitches exactly from the harmony memory; (2) play pitches similar to a known piece; or (3) improvise new random pitches [1]. These three options can be idealized into three components: use of HM, pitch adjusting, and randomization [1].

Similar to the selection of the fittest individuals in GA, the first component is important because it guarantees that the optimal harmonies will not be destroyed in the HM. To make HS more powerful, the parameter HMCR should be properly set [1]. Through several experiments, in most cases HMCR = 0.7–0.95.

The pitch in the second part needs to be adjusted slightly, and hence a proper method is used to adjust the frequency [1]. The new pitch is updated by

$$x_{\mathrm{new}} = x_{\mathrm{old}} + bw \times (2 \times \mathrm{rand} - 1), \tag{2}$$

where rand is a random number in [0, 1], $x_{\mathrm{old}}$ is the current pitch, and bw is the bandwidth.

The parameter PAR should also be set appropriately. If PAR is very close to 1, the solution is always being updated and HS is hard to converge. If it is close to 0, little change is made and HS may converge prematurely. So, here we set PAR = 0.1–0.5 [1].

To improve diversity, randomization is necessary, as shown in the third component. The use of randomization allows the method to go a step further into promising areas so as to find the optimal solution [1].

The HS can be presented in Algorithm 1, where D is the number of decision variables and rand is a random real number in the interval [0, 1] drawn from the uniform distribution.

Begin
  Step 1. Initialize the HM.
  Step 2. Evaluate the fitness.
  Step 3. while the halting criterion is not satisfied do
    for d = 1 : D do
      if rand < HMCR then  // memory consideration
        x_new(d) = x_a(d), where a ∈ (1, 2, ..., HMS)
        if rand < PAR then  // pitch adjustment
          x_new(d) = x_new(d) + bw × (2 × rand − 1)
        endif
      else  // random selection
        x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
      endif
    endfor d
    Update the HM as x_worst = x_new if f(x_new) < f(x_worst) (minimization objective)
    Update the best harmony vector
  Step 4. end while
  Step 5. Output results.
End.
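For illustration, the improvisation loop of Algorithm 1 can be written as the following minimal Python sketch for a minimization objective; the objective f and the bound vectors lb and ub are assumed inputs, and the default parameter values are illustrative rather than prescribed by the text.

  import numpy as np

  def harmony_search(f, lb, ub, HMS=30, HMCR=0.9, PAR=0.1, bw=0.01, max_iter=5000):
      """Minimize f over the box [lb, ub] with the basic HS of Algorithm 1."""
      rng = np.random.default_rng()
      lb, ub = np.asarray(lb, float), np.asarray(ub, float)
      D = len(lb)
      # Steps 1-2: fill the HM with random harmonies and evaluate their fitness.
      HM = rng.uniform(lb, ub, size=(HMS, D))
      fitness = np.array([f(x) for x in HM])
      for _ in range(max_iter):                    # Step 3: improvise new harmonies
          x_new = np.empty(D)
          for d in range(D):
              if rng.random() < HMCR:              # memory consideration
                  x_new[d] = HM[rng.integers(HMS), d]
                  if rng.random() < PAR:           # pitch adjustment, see (2)
                      x_new[d] += bw * (2 * rng.random() - 1)
              else:                                # random selection
                  x_new[d] = lb[d] + rng.random() * (ub[d] - lb[d])
          x_new = np.clip(x_new, lb, ub)
          worst = np.argmax(fitness)               # replace the worst harmony
          f_new = f(x_new)
          if f_new < fitness[worst]:               # only if the new one is better
              HM[worst], fitness[worst] = x_new, f_new
      best = np.argmin(fitness)                    # Step 5: output the best harmony
      return HM[best], fitness[best]

For example, harmony_search(lambda x: np.sum(x**2), [-100] * 30, [100] * 30) minimizes the 30-dimensional sphere function.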

3. FA Method

FA [42] is a metaheuristic approach for optimization problems. The search strategy in FA comes from the swarm behavior of fireflies [50]. There are two significant issues in FA: the formulation of the attractiveness and the variation of the light intensity [42].

For simplicity, several characteristics of fireflies are idealized into three rules described in [51]. Based on these three rules, the FA can be described in Algorithm 2.

Begin
  Step 1. Initialization. Set G = 1; define the light absorption coefficient γ; set the step size α and the attractiveness β0 at r = 0.
  Step 2. Evaluate the light intensity I of each firefly, determined by f(x).
  Step 3. While G < MaxGeneration do
    for i = 1 : NP (all NP fireflies) do
      for j = 1 : NP (NP fireflies) do
        if (firefly j is brighter than firefly i) then
          move firefly i towards j;
        end if
        Update attractiveness;
        Update light intensity;
      end for j
    end for i
    G = G + 1;
  Step 4. end while
  Step 5. Output the results.
End.

For two fireflies $x_i$ and $x_j$, firefly $i$ is moved towards the brighter firefly $j$ as follows:

$$x_i^{t+1} = x_i^t + \beta_0 e^{-\gamma r_{ij}^2}\,(x_j^t - x_i^t) + \alpha\,\varepsilon_i,$$

where $\alpha$ is the step size, $\beta_0$ is the attractiveness at $r = 0$, the second term is due to the attraction, and the third term is randomization [50]. In our present work, we take $\beta_0 = 1$, $\alpha \in [0, 1]$, and $\gamma = 1$ [50].
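As a small illustration, this update can be coded as follows; β0 = 1 and γ = 1 match the settings just quoted, while α = 0.2 is an assumed default within [0, 1].

  import numpy as np

  def move_firefly(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
      """Move firefly i towards the brighter firefly j, per the update above."""
      if rng is None:
          rng = np.random.default_rng()
      r2 = np.sum((x_i - x_j) ** 2)           # squared distance r_ij^2
      beta = beta0 * np.exp(-gamma * r2)      # attraction term beta0 * exp(-gamma r^2)
      eps = rng.random(x_i.shape) - 0.5       # random vector, the randomization term
      return x_i + beta * (x_j - x_i) + alpha * eps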

4. HS/FA

Based on the introductions of HS and FA in the previous sections, the combination of the two approaches is now described, and HS/FA is proposed; it updates the poorer solutions to accelerate convergence.

HS and FA are adept at exploring the search space and exploiting solutions, respectively. Therefore, in the present work, a hybrid method named HS/FA, formed by inducing HS into the FA method, is utilized to deal with optimization problems; the HS part can be considered a mutation operator. By this strategy, the HS mutation explores new regions of the search space while FA exploits the population, which overcomes FA's lack of exploration.

To combat the blind random walks used in FA, two detailed improvements are introduced into FA in the present work.

The first improvement is the introduction of a top fireflies scheme into FA to reduce running time; it is analogous to the elitism scheme frequently used in other population-based optimization algorithms. In FA, because of the double loop, the time complexity is O(NP²), so its performance deteriorates significantly as the population size increases. This improvement is carried out by reducing the outer loop in FA. In HS/FA, we select the fireflies with optimal or near-optimal fitness (i.e., the brightest fireflies) to form the top fireflies, and all the fireflies move only towards these top fireflies. Through the top fireflies scheme, the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so the time used by HS/FA is much less than that used by FA. Clearly, if KEEP = NP, HS/FA reduces to the standard FA. If KEEP is too small, only a few of the best fireflies are selected to form the top fireflies, and the algorithm converges too fast and may be premature for lack of diversity. If KEEP is very big (near NP), almost all the fireflies are used to form the top fireflies, so all fireflies are explored well, potentially leading to optimal solutions, but the algorithm converges too slowly. Therefore, we use KEEP = 2 in our study.

The second improvement is the addition of HS serving as a mutation operator, which strives to improve population diversity and avoid premature convergence. In the standard FA, if firefly j is brighter than firefly i, firefly i moves towards firefly j, the newly generated firefly is evaluated, and the light intensity is updated; otherwise firefly i does nothing. In HS/FA, by contrast, if the top firefly is not brighter than firefly i, firefly i is instead updated by a mutation operation to improve its light intensity. More concretely, for this global search part, we tune every element x_{ik} (k = 1, 2, ..., D) in x_i (the position of firefly i) using HS. When a uniformly distributed random number r1 is not less than HMCR, that is, r1 ≥ HMCR, the element is updated randomly; whereas when r1 < HMCR, the element is updated with the corresponding element of firefly x_a, where a is a randomly chosen integer index in (1, 2, ..., NP) and NP is the population size. In the latter circumstance, the pitch adjustment operation in HS is applied to the element if a second random number r2 < PAR, as shown in (2), which increases population diversity; here r1 and r2 are two uniformly distributed random numbers in [0, 1].
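This element-wise mutation can be sketched in Python as below, treating the firefly population itself as the harmony memory; the bound vectors lb and ub and the default parameter values are illustrative assumptions.

  import numpy as np

  def hs_mutate(x_i, pop, lb, ub, HMCR=0.9, PAR=0.1, bw=0.01, rng=None):
      """HS-style mutation of one firefly, element by element."""
      if rng is None:
          rng = np.random.default_rng()
      NP, D = pop.shape
      x_new = x_i.copy()
      for k in range(D):
          if rng.random() < HMCR:                  # memory consideration (r1 < HMCR)
              x_new[k] = pop[rng.integers(NP), k]  # copy from a random firefly
              if rng.random() < PAR:               # pitch adjustment (r2 < PAR), see (2)
                  x_new[k] += bw * (2 * rng.random() - 1)
          else:                                    # random selection (r1 >= HMCR)
              x_new[k] = lb[k] + rng.random() * (ub[k] - lb[k])
      return np.clip(x_new, lb, ub)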

In sum, the detailed presentation of HS/FA can be given in Algorithm 3.

Begin
  Step 1. Initialization. Set t = 1; define the light absorption coefficient γ; set the step size α and the attractiveness β0 at r = 0; set HMCR and PAR; set the number of top fireflies KEEP.
  Step 2. Evaluate the light intensity I.
  Step 3. While t < MaxGeneration do
    Sort the fireflies by light intensity I;
    for i = 1 : KEEP (all top fireflies) do
      for j = 1 : NP (all fireflies) do
        if firefly i is brighter than firefly j then
          Move firefly j towards i;
        else
          for k = 1 : D (all elements) do  // Mutate
            if (rand < HMCR) then
              a = ⌈NP × rand⌉
              x_v(k) = x_a(k)
              if (rand < PAR) then
                x_v(k) = x_v(k) + bw × (2 × rand − 1)
              end if
            else
              x_v(k) = x_min,k + rand × (x_max,k − x_min,k)
            end if
          end for k
          Replace firefly j with x_v if x_v is better;
        end if
        Update attractiveness;
        Update light intensity;
      end for j
    end for i
    Evaluate the light intensity I.
    Sort the population by light intensity I;
    t = t + 1;
  Step 4. end while
End.
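Putting the pieces together, a compact Python sketch of the HS/FA loop in Algorithm 3, reusing move_firefly and hs_mutate from above, might look as follows; the population size, random seed, and step-size defaults are illustrative choices rather than values fixed by the paper.

  import numpy as np

  def hsfa(f, lb, ub, NP=50, KEEP=2, max_gen=100,
           beta0=1.0, gamma=1.0, alpha=0.2,
           HMCR=0.9, PAR=0.1, bw=0.01, seed=0):
      """Minimize f over [lb, ub] with the HS/FA hybrid of Algorithm 3."""
      rng = np.random.default_rng(seed)
      lb, ub = np.asarray(lb, float), np.asarray(ub, float)
      pop = rng.uniform(lb, ub, size=(NP, len(lb)))
      light = np.array([f(x) for x in pop])       # light intensity ~ objective value
      for _ in range(max_gen):
          order = np.argsort(light)               # brightest (smallest f) first
          pop, light = pop[order], light[order]
          for i in range(KEEP):                   # outer loop only over top fireflies
              for j in range(NP):                 # inner loop over the whole swarm
                  if light[i] < light[j]:         # top firefly i is brighter than j
                      pop[j] = np.clip(move_firefly(pop[j], pop[i],
                                                    beta0, gamma, alpha, rng), lb, ub)
                      light[j] = f(pop[j])
                  else:                           # otherwise mutate firefly j with HS
                      cand = hs_mutate(pop[j], pop, lb, ub, HMCR, PAR, bw, rng)
                      f_cand = f(cand)
                      if f_cand < light[j]:       # keep the new vector only if better
                          pop[j], light[j] = cand, f_cand
      best = np.argmin(light)
      return pop[best], light[best]

For instance, hsfa(lambda x: np.sum(x**2), [-100] * 30, [100] * 30) runs the hybrid on the 30-dimensional sphere function (F32). Note the O(KEEP × NP) pair loop, in contrast to the O(NP²) loop of the standard FA.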

5. The Results

The HS/FA method is tested through several simulations on benchmark problems. To make a fair comparison between the different methods, all of the experiments were conducted under the same conditions as described in [1].

In this section, HS/FA is compared on the optimization problems with nine other methods: ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. Here, for HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ = 1.0, HMCR = 0.9, and PAR = 0.1. The parameters used in the other methods can be found in [48, 53]. Thirty-six functions are utilized to verify our HS/FA method, as shown in Table 1. More details of all the benchmarks can be found in [54].


Table 1: Benchmark functions.

No.   Name                   No.   Name
F01   Beale                  F19   Holzman 2 function
F02   Bohachevsky #1         F20   Levy
F03   Bohachevsky #2         F21   Pathological function
F04   Bohachevsky #3         F22   Penalty #1
F05   Booth                  F23   Penalty #2
F06   Branin                 F24   Powel
F07   Easom                  F25   Quartic with noise
F08   Foxholes               F26   Rastrigin
F09   Freudenstein-Roth      F27   Rosenbrock
F10   Goldstein-Price        F28   Schwefel 2.26
F11   Hump                   F29   Schwefel 1.2
F12   Matyas                 F30   Schwefel 2.22
F13   Ackley                 F31   Schwefel 2.21
F14   Alpine                 F32   Sphere
F15   Brown                  F33   Step
F16   Dixon and Price        F34   Sum function
F17   Fletcher-Powell        F35   Zakharov
F18   Griewank               F36   Wavy1

Because all of the intelligent algorithms have some randomness, in order to obtain representative statistical features we carried out 500 independent runs of each method on each problem. Tables 2 and 3 report the average and best results found by each algorithm, respectively. Note that two different scales were used to normalize the values in the tables; the detailed process can be found in [54]. The dimension of each function is set to 30.


Table 2: Mean normalized optimization results (–: not available).

No.   ACO      BBO      DE       ES       FA       GA       HS       HS/FA   PSO      SGA
F01   1.01     1.01     1.00     1.02     1.08     1.04     1.07     1.00    1.01     1.25
F02   1.43     2.71     1.00     2.55     1.00     1.39     16.66    1.00    3.13     1.22
F03   1.25     1.84     1.00     2.28     1.00     1.17     11.77    1.01    3.50     1.26
F04   –        –        –        –        49.66    –        –        1.00    –        –
F05   1.01     1.02     1.00     1.11     1.00     1.01     1.15     1.00    1.05     1.19
F06   1.03     1.02     1.00     1.09     1.00     1.02     1.03     1.00    1.03     3.01
F07   2.40     2.48     2.27     2.35     2.23     1.83     1.88     1.00    1.71     2.99
F08   1.72     1.72     1.72     1.72     1.72     1.72     1.72     1.72    1.72     1.00
F09   1.03     1.01     1.00     2.05     17.29    1.00     6.55     1.00    1.04     1.24
F10   2.40     2.40     2.40     3.06     2.40     2.40     3.09     2.40    2.70     1.00
F11   1.00     1.00     1.00     1.03     1.00     1.00     1.03     1.00    1.02     1.25
F12   1.00     1.00     1.00     1.01     1.00     1.00     1.02     1.00    1.00     1.14
F13   4.32     2.56     3.68     5.56     1.39     4.98     5.70     1.00    4.83     2.63
F14   36.17    7.98     43.16    73.74    11.68    33.13    70.32    1.00    53.62    8.48
F15   570.98   14.02    27.90    –        141.86   99.02    652.65   1.00    485.92   12.10
F16   –        75.20    317.08   –        7.35     942.08   –        1.00    –        26.84
F17   21.98    2.35     7.67     21.63    5.30     7.33     19.01    1.00    15.46    2.26
F18   8.49     5.40     14.18    66.70    2.31     28.49    139.02   1.00    52.77    5.69
F19   –        167.15   544.18   –        25.71    –        –        1.00    –        39.88
F20   93.79    13.59    68.16    276.60   20.05    92.42    282.65   1.00    173.17   9.33
F21   3.20     2.49     1.74     1.00     3.69     2.65     3.88     1.78    2.55     2.42
F22   –        –        –        –        6.64     –        –        –       –        9.81
F23   –        –        –        –        7.36     –        –        –       –        –
F24   112.59   8.00     48.08    188.98   1.04     25.76    133.99   1.00    52.91    2.92
F25   –        103.34   637.38   –        17.91    –        –        1.00    –        62.77
F26   24.37    4.58     21.06    32.51    7.75     20.84    29.89    1.00    23.03    7.29
F27   37.23    2.38     5.34     49.70    1.00     10.38    34.12    1.04    12.06    2.00
F28   37.76    18.45    73.51    92.92    93.82    31.93    109.20   1.00    112.28   21.21
F29   4.79     2.52     6.72     7.41     1.00     5.40     7.13     1.93    4.81     4.31
F30   42.08    6.33     16.76    63.65    9.36     30.16    53.57    1.00    35.27    8.32
F31   2.98     3.13     3.87     4.56     1.00     3.93     4.78     1.15    3.97     2.79
F32   205.80   13.56    37.41    382.80   1.87     131.27   361.49   1.00    151.62   14.65
F33   40.44    20.26    53.05    312.01   5.14     111.20   471.25   1.00    194.10   15.72
F34   274.21   27.02    46.57    546.26   6.17     138.85   550.25   1.00    188.96   25.75
F35   –        1.46     3.32     3.57     1.18     3.12     3.34     1.00    3.12     2.64
F36   9.82     5.26     14.12    29.95    10.67    16.90    35.57    1.00    23.95    5.37

The bold data are the best function value among different methods for the specified function.

Table 3: Best normalized optimization results (–: not available).

No.   ACO      BBO      DE       ES       FA       GA       HS       HS/FA   PSO      SGA
F01   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00    1.00     1.00
F02   1.00     1.71     1.00     1.53     1.00     1.00     1.49     1.00    1.72     1.00
F03   1.00     1.28     1.00     1.12     1.00     1.00     1.28     1.00    1.34     1.13
F04   –        –        –        –        –        –        –        1.00    –        –
F05   1.00     1.00     1.00     1.02     1.00     1.00     1.00     1.00    1.00     1.00
F06   1.01     1.01     1.00     1.01     1.00     1.01     1.00     1.00    1.00     2.51
F07   –        –        –        –        –        –        –        1.00    –        2.88
F08   1.99     1.99     1.99     1.99     1.99     1.99     1.99     1.99    1.99     1.00
F09   1.00     1.00     1.00     1.01     1.00     1.00     1.00     1.00    1.00     1.00
F10   2.65     2.65     2.65     2.74     2.65     2.65     2.65     2.65    2.65     1.00
F11   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00    1.00     1.03
F12   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00    1.00     1.00
F13   8.34     3.93     7.05     11.02    1.00     8.71     11.56    1.54    9.51     3.98
F14   63.46    11.42    88.82    159.53   12.22    40.65    144.04   1.00    102.48   10.66
F15   218.30   9.54     47.47    591.91   33.50    128.64   792.73   1.00    358.64   9.44
F16   –        109.24   –        –        1.00     315.53   –        2.31    –        43.61
F17   42.58    3.24     11.83    41.06    1.20     9.50     43.49    1.00    29.07    3.02
F18   7.99     3.62     13.85    63.26    1.00     14.42    160.01   1.08    37.87    2.32
F19   –        135.16   893.82   –        1.56     566.65   –        1.00    –        3.67
F20   251.29   32.94    142.30   720.64   7.96     131.48   863.43   1.00    483.70   23.66
F21   3.96     2.89     1.65     1.00     4.60     2.91     4.87     1.90    2.72     2.37
F22   31.83    55.55    –        –        8.26     89.30    –        1.00    –        15.15
F23   1.00     –        27.10    –        4.89     –        –        –       –        29.22
F24   –        88.70    –        –        1.60     215.01   –        1.00    940.82   34.50
F25   –        380.11   –        –        1.00     –        –        2.01    –        54.67
F26   39.66    5.65     30.04    58.39    6.03     27.36    42.32    1.00    38.57    8.91
F27   54.77    2.13     12.53    87.36    1.27     10.85    58.14    1.00    21.11    2.77
F28   164.45   67.82    335.75   447.01   430.29   85.82    596.00   1.00    551.44   68.85
F29   8.78     4.00     16.50    15.31    1.00     10.85    18.42    3.27    6.75     7.29
F30   63.53    10.38    27.71    105.79   7.15     47.04    88.91    1.00    58.02    12.83
F31   3.80     5.63     7.23     9.39     1.00     7.31     10.20    1.93    7.56     4.53
F32   740.24   30.59    184.72   –        1.00     322.80   –        2.87    725.62   31.00
F33   149.29   66.43    224.57   –        3.86     255.71   –        1.00    –        42.71
F34   491.51   35.62    100.26   –        1.64     113.66   –        1.00    400.61   27.55
F35   3.44     2.50     5.89     6.67     1.00     4.28     5.48     1.46    3.83     3.74
F36   11.05    6.01     18.56    40.10    9.07     18.47    43.18    1.00    31.18    4.64

The bold data are the best function value among different methods for the specified function.

From Table 2, on average, HS/FA is well capable of finding the function minimum on twenty-eight of the thirty-six functions, while FA performs the second best, on ten of the thirty-six functions. Table 3 shows that HS/FA and FA perform best on twenty-two and seventeen of the thirty-six functions, respectively, while ACO, DE, and GA perform best on eight benchmarks. From the above tables, we can see that, for low-dimensional functions, both FA and HS/FA perform well, with little difference between them.

Further, convergence graphs of the ten methods for the most representative functions are illustrated in Figures 1, 2, 3, and 4, which show the optimization process. The values plotted are the actual mean function values from the above experiments.

F26 is a complicated multimodal function with a single global minimum of 0 and several local optima. Figure 1 shows that HS/FA converges to the global value 0 with the fastest speed. Here FA converges slightly faster initially, but it is likely to be trapped in subminima, as its function value decreases only slightly thereafter.

F28 is also a multimodal problem, with only one global minimum of 0. For this problem, Figure 2 shows that HS/FA is superior to the other nine methods and finds the optimal value earliest.

For the function in Figure 3, HS/FA significantly outperforms all the others throughout the optimization process and finally converges to the best solution. BBO is inferior only to HS/FA and performs the second best in this case.

For the function in Figure 4, HS/FA again significantly outperforms all the others in the optimization process. At the early stage (within 20 iterations) FA converges faster than HS/FA, but it then seems to be trapped in subminima, as its function value decreases only slightly, and it is overtaken by HS/FA after about 30 iterations; HS/FA is well capable of improving its solution steadily in the long run.

From Figures 1–4, our HS/FA's performance is far better than that of the others. In general, BBO and FA, especially FA, are inferior only to HS/FA. Note that, in [40], BBO was compared with seven EAs and on an engineering problem, and those experiments proved the excellent performance of BBO. This indirectly demonstrates that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems. FA is enhanced by combining it with the basic HS method. In HS/FA, a top fireflies scheme is introduced to reduce running time, and HS is used to mutate fireflies during the update; the new harmony vector takes the place of a firefly only if it is better than before, so the hybrid generally outperforms both HS and FA. HS/FA strives to exploit the merits of FA and HS so as to avoid all the fireflies being trapped in local optima. Benchmark evaluation on the test problems was used to compare HS/FA with the other nine approaches. The results demonstrate that HS/FA uses the available information more efficiently to find much better values than the other optimization algorithms.

References

  1. G. Wang and L. Guo, “A novel hybrid bat algorithm with harmony search for global numerical optimization,” Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013. View at: Publisher Site | Google Scholar
  2. X. Li and M. Yin, “An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure,” Advances in Engineering Software, vol. 55, pp. 10–31, 2013. View at: Google Scholar
  3. D. Zou, L. Gao, S. Li, and J. Wu, “An effective global harmony search algorithm for reliability problems,” Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011. View at: Publisher Site | Google Scholar
  4. D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, “A novel global harmony search algorithm for reliability problems,” Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010. View at: Publisher Site | Google Scholar
  5. X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.
  6. A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
  7. X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
  8. D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.
  9. T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.
  10. H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
  11. M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.
  12. S. Baluja, “Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning,” Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994. View at: Google Scholar
  13. O. K. Erol and I. Eksin, “A new optimization method: big bang-big crunch,” Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006. View at: Publisher Site | Google Scholar
  14. A. Kaveh and S. Talatahari, “Size optimization of space trusses using big bang-big crunch algorithm,” Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009. View at: Publisher Site | Google Scholar
  15. A. Kaveh and S. Talatahari, “Optimal design of schwedler and ribbed domes via hybrid big bang-big crunch algorithm,” Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010. View at: Publisher Site | Google Scholar
  16. A. Kaveh and S. Talatahari, “A discrete big bang-big crunch algorithm for optimal design of skeletal structures,” Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010. View at: Google Scholar
  17. Z. W. Geem, J. H. Kim, and G. V. Loganathan, “A new heuristic optimization algorithm: harmony search,” Simulation, vol. 76, no. 2, pp. 60–68, 2001. View at: Google Scholar
  18. P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, “An intelligent tuned harmony search algorithm for optimisation,” Information Sciences, vol. 196, pp. 47–72, 2012. View at: Publisher Site | Google Scholar
  19. S. Gholizadeh and A. Barzegar, “Shape optimization of structures for frequency constraints by sequential harmony search algorithm,” Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013. View at: Google Scholar
  20. A. Kaveh and S. Talatahari, “A novel heuristic optimization method: charged system search,” Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010. View at: Publisher Site | Google Scholar
  21. L. Xie, J. Zeng, and R. A. Formato, “Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties,” International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012. View at: Google Scholar
  22. A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat algorithm for constrained optimization tasks,” Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013. View at: Google Scholar
  23. X. S. Yang and A. H. Gandomi, “Bat algorithm: a novel approach for global engineering optimization,” Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012. View at: Google Scholar
  24. X. Li, J. Zhang, and M. Yin, “Animal migration optimization: an optimization algorithm inspired by animal migration behavior,” Neural Computing and Applications, 2013. View at: Publisher Site | Google Scholar
  25. A. H. Gandomi and A. H. Alavi, “Krill herd: a new bio-inspired optimization algorithm,” Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012. View at: Google Scholar
  26. G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “Stud krill herd algorithm,” Neurocomputing, 2013. View at: Publisher Site | Google Scholar
  27. G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “An effective krill herd algorithm with migration operator in biogeography-based optimization,” Applied Mathematical Modelling, 2013. View at: Publisher Site | Google Scholar
  28. R. Storn and K. Price, “Differential evolution-a simple and efficient adaptive scheme for global optimization over continuous spaces,” Tech. Rep. 1075-4946, International Computer Science Institute, Berkley, Calif, USA, 1995. View at: Google Scholar
  29. R. Storn and K. Price, “Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997. View at: Google Scholar
  30. X. Li and M. Yin, “Application of differential evolution algorithm on self-potential data,” PLoS One, vol. 7, no. 12, Article ID e51199, 2012. View at: Publisher Site | Google Scholar
  31. G. G. Wang, A. H. Gandomi, A. H. Alavi, and G. S. Hao, “Hybrid krill herd algorithm with differential evolution for global numerical optimization,” Neural Computing & Applications, 2013. View at: Publisher Site | Google Scholar
  32. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995. View at: Google Scholar
  33. R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, “Integration of particle swarm optimization and genetic algorithm for dynamic clustering,” Information Sciences, vol. 195, pp. 124–140, 2012. View at: Publisher Site | Google Scholar
  34. S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, “A multi-stage particle swarm for optimum design of truss structures,” Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013. View at: Google Scholar
  35. K. Y. Huang, “A hybrid particle swarm optimization approach for clustering and classification of datasets,” Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011. View at: Publisher Site | Google Scholar
  36. W. Khatib and P. Fleming, “The stud GA: a mini revolution?” Parallel Problem Solving from Nature, pp. 683–691, 1998. View at: Google Scholar
  37. X. S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 210–214, Coimbatore, India, December 2009. View at: Publisher Site | Google Scholar
  38. A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, “Design optimization of truss structures using cuckoo search algorithm,” The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013. View at: Publisher Site | Google Scholar
  39. X. Cai, S. Fan, and Y. Tan, “Light responsive curve selection for photosynthesis operator of APOA,” International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012. View at: Google Scholar
  40. D. Simon, “Biogeography-based optimization,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008. View at: Publisher Site | Google Scholar
  41. A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “Mixed variable structural optimization using firefly algorithm,” Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011. View at: Publisher Site | Google Scholar
  42. X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver, Frome, UK, 2008.
  43. X. S. Yang, “Firefly algorithms for multimodal optimization,” in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009. View at: Google Scholar
  44. X. S. Yang, “Firefly algorithm, stochastic test functions and design optimisation,” International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010. View at: Google Scholar
  45. X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, “Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect,” Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012. View at: Publisher Site | Google Scholar
  46. R. Parpinelli and H. Lopes, “New inspirations in swarm intelligence: a survey,” International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011. View at: Google Scholar
  47. D. Zou, L. Gao, J. Wu, and S. Li, “Novel global harmony search algorithm for unconstrained problems,” Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010. View at: Publisher Site | Google Scholar
  48. G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, “Incorporating mutation scheme into krill herd algorithm for global numerical optimization,” Neural Computing and Applications, 2012. View at: Publisher Site | Google Scholar
  49. S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, “Dynamic multi-swarm particle swarm optimizer with harmony search,” Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011. View at: Publisher Site | Google Scholar
  50. G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “A modified firefly algorithm for UCAV path planning,” International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012. View at: Google Scholar
  51. A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, “Firefly algorithm with chaos,” Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013. View at: Google Scholar
  52. Y. Zhang, D. Huang, M. Ji, and F. Xie, “Image segmentation using PSO and PCM with Mahalanobis distance,” Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011. View at: Publisher Site | Google Scholar
  53. G. G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, “Simulated annealing-based krill herd algorithm for global optimization,” Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013. View at: Publisher Site | Google Scholar
  54. X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999. View at: Publisher Site | Google Scholar

Copyright © 2013 Lihong Guo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
