The Scientific World Journal
Volume 2013 (2013), Article ID 125625, 9 pages
http://dx.doi.org/10.1155/2013/125625
Research Article

An Effective Hybrid Firefly Algorithm with Harmony Search for Global Numerical Optimization

1Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2School of Computer Science and Technology, Jiangsu Normal University, Xuzhou, Jiangsu 221116, China

Received 10 August 2013; Accepted 29 September 2013

Academic Editors: Z. Cui and X. Yang

Copyright © 2013 Lihong Guo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A hybrid metaheuristic approach, HS/FA, which combines harmony search (HS) with the firefly algorithm (FA), is proposed for function optimization. HS/FA fully exerts the exploration ability of HS and the exploitation ability of FA, and therefore converges faster than either HS or FA alone. In addition, a top-fireflies scheme is introduced to reduce running time, and HS is utilized to mutate fireflies during the update step. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods.

1. Introduction

In engineering problems, optimization means searching for a vector that maximizes or minimizes a function. Nowadays, stochastic methods are generally used to cope with optimization problems [1]. Though there are many ways to classify optimization algorithms, a simple one divides them into two groups according to their nature: deterministic and stochastic. Deterministic algorithms always reach the same solutions when the initial conditions are unchanged, because they follow the same rigorous procedure. Stochastic algorithms, by contrast, are based on certain stochastic distributions and generally generate different solutions even from the same initial values. In practice, both kinds can find satisfactory solutions after some generations. Recently, nature-inspired algorithms have proven able to solve numerical optimization problems quite efficiently.

These metaheuristic approaches are developed to solve complicated problems, like permutation flow shop scheduling [2], reliability [3, 4], high-dimensional function optimization [5], and other engineering problems [6, 7]. In the 1950s, natural evolution was idealized as an optimization technique, which gave rise to a new type of approach, namely, genetic algorithms (GAs) [8]. Since then, many other metaheuristic methods have appeared, like evolution strategy (ES) [9, 10], ant colony optimization (ACO) [11], probability-based incremental learning (PBIL) [12], the big bang-big crunch algorithm [13–16], harmony search (HS) [17–19], charged system search (CSS) [20], artificial physics optimization [21], the bat algorithm (BA) [22, 23], animal migration optimization (AMO) [24], krill herd (KH) [25–27], differential evolution (DE) [28–31], particle swarm optimization (PSO) [32–35], the stud GA (SGA) [36], cuckoo search (CS) [37, 38], the artificial plant optimization algorithm (APOA) [39], biogeography-based optimization (BBO) [40], and the firefly algorithm (FA) [41, 42].

As a global optimization method, FA [42] was first proposed by Yang in 2008 and originates from the behavior of firefly swarms. Recent research demonstrates that FA is quite powerful and relatively efficient [43]. Furthermore, the performance of FA can be improved further, with promising results [44]. In addition, nonconvex problems can be solved by FA [45]. A survey of swarm intelligence covering FA is given by Parpinelli and Lopes [46].

On the other hand, HS [17, 47] is a novel heuristic technique for optimization problems. In engineering optimization, engineers strive to find an optimum determined by an objective function, while in the music improvisation process musicians search for the most satisfying harmony as judged by aesthetics. The HS method originates from the similarity between these two processes [1].

In most cases, FA can find the optimal solution through its exploitation. However, the search used in FA is based on randomness, so it cannot always reach the global optimum. On the one hand, to improve the diversity of fireflies, HS is added to FA, where it can be treated as a mutation operator; by combining the principles of HS and FA, an enhanced FA is proposed to look for the best objective function value. On the other hand, FA needs much more time to search for the best solution, and its performance deteriorates significantly as the population size increases; in HS/FA, a top-fireflies scheme is therefore introduced to reduce running time. This scheme is carried out by reducing the outer loop in FA, and through it the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. The proposed approach is evaluated on various benchmarks. The results demonstrate that HS/FA performs more effectively and accurately than FA and other intelligent algorithms.

The rest of this paper is structured as follows. To begin with, brief backgrounds on HS and FA are provided in Sections 2 and 3, respectively. Our proposed HS/FA is presented in Section 4. HS/FA is verified on various benchmark functions in Section 5, and Section 6 presents the general conclusions.

2. HS Method

As a relatively new optimization technique, HS [17, 48, 49] uses the following optimization operators [1]:
HM: the harmony memory, as shown in (1);
HMS: the harmony memory size;
HMCR: the harmony memory consideration rate;
PAR: the pitch adjustment rate;
bw: the pitch adjustment bandwidth.

Consider
$$\text{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_n^1 \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{\text{HMS}} & x_2^{\text{HMS}} & \cdots & x_n^{\text{HMS}} \end{bmatrix}. \tag{1}$$

The HS method can be explained through the player improvisation process. There are three feasible options for a player improvising music: (1) play a pitch stored in the harmony memory (governed by the HMCR); (2) play a pitch similar to a known one; or (3) improvise a new pitch [1]. These three options can be idealized into three components: use of HM, pitch adjusting, and randomization [1].

The first component, the use of HM, is similar to the selection of the fittest individuals in a GA [1]. It guarantees that the best harmonies are not lost from the HM. To make HS more powerful, the parameter HMCR should be set properly [1]. Experiments suggest HMCR = 0.7–0.95 in most cases.

The pitch in the second component needs to be adjusted slightly, and hence a proper method is needed to adjust the frequency [1]. The new pitch is updated by
$$x_{\text{new}} = x_{\text{old}} + bw \times (2 \times \text{rand} - 1), \tag{2}$$
where rand is a random number in [0, 1] and x_old is the current pitch. Here, bw is the bandwidth.

The parameter PAR should also be set appropriately. If PAR is very close to 1, the solution is always being updated and HS is hard to converge. If it is close to 0, little change is made and HS may converge prematurely. So here we set PAR = 0.1–0.5 [1].

To improve diversity, the randomization in the third component is necessary. Randomization allows the method to step beyond the current promising areas so as to find the global optimum [1].

The HS method is presented in Algorithm 1, where n is the number of decision variables and rand is a random real number drawn from a uniform distribution on the interval [0, 1].

Algorithm 1: HS method.
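To make the improvisation loop concrete, the following minimal Python sketch shows one possible implementation of the three components described above (the function and variable names are ours, not from the original pseudocode; minimization and simple box constraints are assumed):

```python
import random

def improvise(HM, lb, ub, HMCR=0.9, PAR=0.1, bw=0.01):
    """Create one new harmony from the harmony memory HM (a list of vectors)."""
    n = len(HM[0])
    new = [0.0] * n
    for j in range(n):
        if random.random() < HMCR:
            # memory consideration: take the j-th pitch of a random stored harmony
            new[j] = random.choice(HM)[j]
            if random.random() < PAR:
                # pitch adjustment, cf. (2)
                new[j] += bw * (2 * random.random() - 1)
        else:
            # randomization: a completely new pitch within the bounds
            new[j] = random.uniform(lb[j], ub[j])
        new[j] = min(max(new[j], lb[j]), ub[j])  # clamp to the search range
    return new

def hs(f, lb, ub, HMS=10, iters=1000):
    """Basic HS loop: the worst harmony is replaced whenever a better one appears."""
    HM = [[random.uniform(l, u) for l, u in zip(lb, ub)] for _ in range(HMS)]
    for _ in range(iters):
        cand = improvise(HM, lb, ub)
        worst = max(range(HMS), key=lambda i: f(HM[i]))
        if f(cand) < f(HM[worst]):
            HM[worst] = cand
    return min(HM, key=f)
```

For instance, hs(lambda x: sum(v * v for v in x), [-5.12] * 30, [5.12] * 30) minimizes the 30-dimensional sphere function.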

3. FA Method

FA [42] is a metaheuristic approach for optimization problems. Its search strategy comes from the swarm behavior of fireflies [50]. The two significant issues in FA are the formulation of attractiveness and the variation of light intensity [42].

For simplicity, several characteristics of fireflies are idealized into three rules described in [51]. Based on these three rules, the FA can be described in Algorithm 2.

Algorithm 2: FA method.

For two fireflies x_i and x_j, firefly i is updated as follows:
$$x_i = x_i + \beta_0 e^{-\gamma r_{ij}^2}\,(x_j - x_i) + \alpha \left( \text{rand} - \frac{1}{2} \right), \tag{3}$$
where α is the step size, β₀ is the attractiveness at r = 0, and γ is the light absorption coefficient; the second term is the attraction, while the third is randomization [50]. In the present work, we take β₀ = 1, α ∈ [0, 1], and γ = 1 [50].
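As an illustration, a minimal Python sketch of the move in (3) might look as follows (the name move and the default α = 0.5, an arbitrary value within the stated range [0, 1], are our assumptions):

```python
import math
import random

def move(xi, xj, alpha=0.5, beta0=1.0, gamma=1.0):
    """Move firefly xi towards the brighter firefly xj, cf. (3)."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))  # squared distance r_ij^2
    beta = beta0 * math.exp(-gamma * r2)            # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]
```

The attraction term pulls xi towards xj more strongly when the two fireflies are close, while the last term keeps a small random walk component.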

4. HS/FA

Based on the introductions of HS and FA in the previous sections, the combination of the two approaches is now described and HS/FA is proposed; it updates the poor solutions to accelerate convergence.

HS and FA are adept at exploring the search space and exploiting solutions, respectively. Therefore, in the present work, a hybrid method named HS/FA is formed by introducing HS into FA, where HS can be considered a mutation operator. With this strategy, the HS mutation explores new regions of the search space while FA exploits the population, thereby overcoming FA's lack of exploration.

To mitigate the purely random walks used in FA, a mutation operator is introduced into FA in the present work, comprising two detailed improvements.

The first is the introduction of a top-fireflies scheme into FA to reduce running time; it is analogous to the elitism scheme frequently used in other population-based optimization algorithms. In FA, because of the double loop over the population, the time complexity is O(NP²), so performance deteriorates significantly as the population size increases. This improvement is carried out by reducing the outer loop in FA. In HS/FA, we select the fireflies with optimal or near-optimal fitness (i.e., the brightest fireflies) to form the top fireflies, and all fireflies move only towards these top fireflies. Through the top-fireflies scheme, the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so HS/FA uses much less time than FA. Clearly, if KEEP = NP, HS/FA reduces to the standard FA. If KEEP is too small, only a few best fireflies are selected as top fireflies; the algorithm then converges too fast and may be premature for lack of diversity. If KEEP is very large (near NP), almost all the fireflies are used as top fireflies; the search is then thorough, which may lead to optimal solutions, but the algorithm converges too slowly. Therefore, we use KEEP = 2 in our study.

The second improvement is the addition of HS serving as a mutation operator, which strives to improve population diversity and avoid premature convergence. In the standard FA, if firefly j is brighter than firefly i, firefly i moves towards firefly j, and the newly generated firefly is then evaluated and the light intensity updated; otherwise, firefly i does nothing. In HS/FA, by contrast, if firefly j is not brighter than firefly i, firefly i is updated by a mutation operation to improve its light intensity. More concretely, for the global search part, we tune every element x_{i,d} of X_i (the position of firefly i) using HS. When r₁ ≥ HMCR, the element is updated randomly, whereas when r₁ < HMCR, the element is updated from the memory formed by the population, that is, copied from a randomly selected firefly k. In the latter case, the pitch adjustment operation of HS is applied to the element if r₂ < PAR to increase population diversity, as shown in (2). Here, r₁ and r₂ are two uniformly distributed random numbers in [0, 1], k is a random integer in [1, NP], and NP is the population size.

To sum up, the detailed presentation of HS/FA is given in Algorithm 3.

Algorithm 3: HS/FA method.
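Under the same assumptions as the sketches in Sections 2 and 3, and reusing the move() helper defined there, a condensed Python sketch of the HS/FA main loop could look like this; it illustrates the top-fireflies scheme and the HS mutation, and is not the authors' reference implementation:

```python
import random

def hsfa(f, lb, ub, NP=50, KEEP=2, HMCR=0.9, PAR=0.1, bw=0.01, iters=500):
    """HS/FA sketch: KEEP top fireflies attract the rest; HS mutates the others."""
    pop = [[random.uniform(l, u) for l, u in zip(lb, ub)] for _ in range(NP)]
    for _ in range(iters):
        pop.sort(key=f)                    # brightest (lowest f) fireflies first
        top = [p[:] for p in pop[:KEEP]]   # the KEEP top fireflies
        for i in range(NP):
            for t in top:
                if f(t) < f(pop[i]):
                    # attraction step towards a brighter firefly, cf. (3);
                    # move() is the FA sketch from Section 3
                    pop[i] = [min(max(v, l), u) for v, l, u
                              in zip(move(pop[i], t), lb, ub)]
                else:
                    # HS-style mutation; the mutant replaces the firefly only
                    # if it is better, as described above
                    cand = pop[i][:]
                    for j in range(len(cand)):
                        if random.random() < HMCR:
                            cand[j] = pop[random.randrange(NP)][j]
                            if random.random() < PAR:
                                cand[j] += bw * (2 * random.random() - 1)  # cf. (2)
                        else:
                            cand[j] = random.uniform(lb[j], ub[j])
                    cand = [min(max(v, l), u) for v, l, u in zip(cand, lb, ub)]
                    if f(cand) < f(pop[i]):
                        pop[i] = cand
    return min(pop, key=f)
```

Because the inner loop runs over only the KEEP top fireflies instead of the whole population, one generation costs O(KEEP × NP) attraction steps rather than O(NP²), which is exactly the reduction discussed above.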

5. The Results

The HS/FA method is tested through several simulations on benchmark problems. To make a fair comparison between different methods, all experiments were conducted under the same conditions as described in [1].

In this section, HS/FA is compared on optimization problems with nine other methods, namely, ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. Here, for HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ = 1.0, HMCR = 0.9, and PAR = 0.1. The parameters used in the other methods are set as in [48, 53]. Thirty-six functions are utilized to verify our HS/FA method, as shown in Table 1. More details on all the benchmarks can be found in [54].

Table 1: Benchmark functions.

Because all the intelligent algorithms involve some randomness, we performed 500 independent runs of each method on each problem in order to obtain representative statistics. Tables 2 and 3 report the average and best results found by each algorithm, respectively. Note that two different scales were used to normalize the values in the tables; the detailed procedure can be found in [54]. The dimension of each function is set to 30.

Table 2: Mean normalized optimization results.
Table 3: Best normalized optimization results.

From Table 2, on average, HS/FA finds the function minimum on twenty-eight of the thirty-six functions, and FA performs second best, on ten of the thirty-six. Table 3 shows that HS/FA and FA achieve the best results on twenty-two and seventeen of the thirty-six functions, respectively, while ACO, DE, and GA perform best on eight benchmarks. From the above tables, we can see that for low-dimensional functions both FA and HS/FA perform well, with little difference between them.

Further, convergence graphs of the ten methods for the most representative functions are illustrated in Figures 1, 2, 3, and 4, which depict the optimization process. The values plotted are the actual mean function values from the above experiments.

Figure 1: Performance comparison for the F26 Rastrigin function.
Figure 2: Performance comparison for the F28 Schwefel 2.26 function.
Figure 3: Performance comparison for the F30 Schwefel 2.22 function.
Figure 4: Performance comparison for the F33 step function.

F26 is a complicated multimodal function with a single global minimum of 0 and several local optima. Figure 1 shows that HS/FA converges to the global value 0 fastest. FA converges slightly faster initially, but it appears to become trapped in local minima, as its function value decreases only slightly thereafter.

F28 is also a multimodal problem, with a single global minimum of 0. For this problem, HS/FA is superior to the nine other methods and finds the optimal value earliest.

For F30 (Figure 3), HS/FA significantly outperforms all the others throughout the optimization process and finally converges to the best solution. BBO is inferior only to HS/FA and performs second best in this case.

For the F33 step function, HS/FA again significantly outperforms all the others. Figure 4 indicates that FA converges faster than HS/FA at the early stage of the optimization process (within the first 20 iterations), but it then appears to become trapped in local minima, with its function value decreasing only slightly, while HS/FA steadily improves its solution in the long run and overtakes FA after about 30 iterations.

From Figures 1–4, our HS/FA's performance is far better than that of the others. In general, BBO and FA, especially FA, are inferior only to HS/FA. Note that in [40] BBO was compared with seven EAs and tested on an engineering problem, and those experiments proved the excellent performance of BBO. This also indirectly indicates that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA method was proposed for optimization problems. FA is enhanced by combining it with the basic HS method. In HS/FA, a top-fireflies scheme is introduced to reduce running time, and HS is used to mutate fireflies during the update step. The new harmony vector replaces a firefly only if it is better than before, which generally outperforms HS and FA. HS/FA strives to exploit the merits of both FA and HS so as to prevent all fireflies from being trapped in local optima. Benchmark evaluation on the test problems was used to compare HS/FA with the nine other approaches. The results demonstrated that HS/FA uses the available information more efficiently to find much better values than the other optimization algorithms.

References

  1. G. Wang and L. Guo, “A novel hybrid bat algorithm with harmony search for global numerical optimization,” Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.
  2. X. Li and M. Yin, “An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure,” Advances in Engineering Software, vol. 55, pp. 10–31, 2013.
  3. D. Zou, L. Gao, S. Li, and J. Wu, “An effective global harmony search algorithm for reliability problems,” Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.
  4. D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, “A novel global harmony search algorithm for reliability problems,” Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.
  5. X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.
  6. A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
  7. X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
  8. D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.
  9. T. Bäck, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.
  10. H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
  11. M. Dorigo and T. Stützle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.
  12. S. Baluja, “Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning,” Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.
  13. O. K. Erol and I. Eksin, “A new optimization method: big bang-big crunch,” Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.
  14. A. Kaveh and S. Talatahari, “Size optimization of space trusses using big bang-big crunch algorithm,” Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.
  15. A. Kaveh and S. Talatahari, “Optimal design of Schwedler and ribbed domes via hybrid big bang-big crunch algorithm,” Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.
  16. A. Kaveh and S. Talatahari, “A discrete big bang-big crunch algorithm for optimal design of skeletal structures,” Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.
  17. Z. W. Geem, J. H. Kim, and G. V. Loganathan, “A new heuristic optimization algorithm: harmony search,” Simulation, vol. 76, no. 2, pp. 60–68, 2001.
  18. P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, “An intelligent tuned harmony search algorithm for optimisation,” Information Sciences, vol. 196, pp. 47–72, 2012.
  19. S. Gholizadeh and A. Barzegar, “Shape optimization of structures for frequency constraints by sequential harmony search algorithm,” Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.
  20. A. Kaveh and S. Talatahari, “A novel heuristic optimization method: charged system search,” Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.
  21. L. Xie, J. Zeng, and R. A. Formato, “Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties,” International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.
  22. A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat algorithm for constrained optimization tasks,” Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.
  23. X. S. Yang and A. H. Gandomi, “Bat algorithm: a novel approach for global engineering optimization,” Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
  24. X. Li, J. Zhang, and M. Yin, “Animal migration optimization: an optimization algorithm inspired by animal migration behavior,” Neural Computing and Applications, 2013.
  25. A. H. Gandomi and A. H. Alavi, “Krill herd: a new bio-inspired optimization algorithm,” Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.
  26. G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “Stud krill herd algorithm,” Neurocomputing, 2013.
  27. G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “An effective krill herd algorithm with migration operator in biogeography-based optimization,” Applied Mathematical Modelling, 2013.
  28. R. Storn and K. Price, “Differential evolution – a simple and efficient adaptive scheme for global optimization over continuous spaces,” Tech. Rep. TR-95-012, International Computer Science Institute, Berkeley, Calif, USA, 1995.
  29. R. Storn and K. Price, “Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  30. X. Li and M. Yin, “Application of differential evolution algorithm on self-potential data,” PLoS One, vol. 7, no. 12, Article ID e51199, 2012.
  31. G. G. Wang, A. H. Gandomi, A. H. Alavi, and G. S. Hao, “Hybrid krill herd algorithm with differential evolution for global numerical optimization,” Neural Computing & Applications, 2013.
  32. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
  33. R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, “Integration of particle swarm optimization and genetic algorithm for dynamic clustering,” Information Sciences, vol. 195, pp. 124–140, 2012.
  34. S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, “A multi-stage particle swarm for optimum design of truss structures,” Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013.
  35. K. Y. Huang, “A hybrid particle swarm optimization approach for clustering and classification of datasets,” Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011.
  36. W. Khatib and P. Fleming, “The stud GA: a mini revolution?” in Parallel Problem Solving from Nature, pp. 683–691, 1998.
  37. X. S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 210–214, Coimbatore, India, December 2009.
  38. A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, “Design optimization of truss structures using cuckoo search algorithm,” The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013.
  39. X. Cai, S. Fan, and Y. Tan, “Light responsive curve selection for photosynthesis operator of APOA,” International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012.
  40. D. Simon, “Biogeography-based optimization,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
  41. A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “Mixed variable structural optimization using firefly algorithm,” Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.
  42. X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.
  43. X. S. Yang, “Firefly algorithms for multimodal optimization,” in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.
  44. X. S. Yang, “Firefly algorithm, stochastic test functions and design optimisation,” International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.
  45. X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, “Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect,” Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012.
  46. R. Parpinelli and H. Lopes, “New inspirations in swarm intelligence: a survey,” International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.
  47. D. Zou, L. Gao, J. Wu, and S. Li, “Novel global harmony search algorithm for unconstrained problems,” Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.
  48. G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, “Incorporating mutation scheme into krill herd algorithm for global numerical optimization,” Neural Computing and Applications, 2012.
  49. S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, “Dynamic multi-swarm particle swarm optimizer with harmony search,” Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.
  50. G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “A modified firefly algorithm for UCAV path planning,” International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.
  51. A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, “Firefly algorithm with chaos,” Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.
  52. Y. Zhang, D. Huang, M. Ji, and F. Xie, “Image segmentation using PSO and PCM with Mahalanobis distance,” Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.
  53. G. G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, “Simulated annealing-based krill herd algorithm for global optimization,” Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.
  54. X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.