Computational Intelligence and Neuroscience
Volume 2016 (2016), Article ID 6097484, 17 pages
http://dx.doi.org/10.1155/2016/6097484
Research Article

A Novel Quantum-Behaved Bat Algorithm with Mean Best Position Directed for Numerical Optimization

1Communications Engineering, Chongqing University, Chongqing 400030, China
2Jiuquan Satellite Launch Center, Jiuquan 732750, China

Received 26 January 2016; Revised 11 April 2016; Accepted 26 April 2016

Academic Editor: Christian W. Dawson

Copyright © 2016 Binglian Zhu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This paper proposes a novel quantum-behaved bat algorithm directed by the mean best position (QMBA). In QMBA, the position of each bat is mainly updated by the current optimal solution in the early stage of the search, while in the late stage it also depends on the mean best position, which enhances the convergence speed of the algorithm. During the search, quantum behavior is introduced so that bats can jump out of local optima and do not easily become trapped in them, giving the algorithm a better ability to adapt to complex environments. Meanwhile, QMBA makes good use of the statistical information of the best positions the bats have experienced to generate better-quality solutions. This approach not only inherits the quick convergence, simplicity, and easy implementation of the original bat algorithm, but also increases the diversity of the population and improves the accuracy of solutions. Twenty-four benchmark test functions are tested and compared with other variant bat algorithms for numerical optimization; the simulation results show that this approach is simple and efficient and can achieve more accurate solutions.

1. Introduction

In recent years, driven by the optimization problems arising in practice, all kinds of bioinspired or swarm intelligence optimization algorithms have been proposed, such as the genetic algorithm (GA) [1], differential evolution (DE) [2], ant colony optimization (ACO) [3], the firefly algorithm (FA) [4, 5], the cuckoo search (CS) algorithm [6, 7], particle swarm optimization (PSO) [8], artificial bee colony (ABC) optimization [9], and the bat algorithm (BA) [10, 11]. These bionic intelligent algorithms are random search methods that mimic natural biological systems [12]: optimization emerges entirely from the instincts of the organisms themselves, whose unconscious behavior optimizes their survival and adapts them to the environment. Compared with traditional optimization algorithms, they do not depend on strict mathematical characteristics of the optimization problem itself and exhibit strong parallelism. Each individual is self-organizing and evolves; as a result, on some high-dimensional optimization problems these methods are superior to the traditional ones.

The bat algorithm is a new metaheuristic method proposed by Yang in 2010 [10, 11]. The echolocation capability of microbats is fascinating, as these bats can find their prey and discriminate between different types of insects even in complete darkness [13]. Inspired by the echolocation behavior of bats, the algorithm carries out the search process using artificial bats as search agents, mimicking the natural pulse loudness and emission rate of real bats. When these bats chase prey, they tend to decrease the loudness and increase the rate of pulse emission. The bat algorithm has good optimization performance in the low-dimensional case [14–16] and is widely used in engineering optimization [17, 18] and multiobjective optimization [19]; however, due to low population diversity, it severely suffers from premature convergence in the high-dimensional case [20]. Therefore, many variant bat algorithms have been proposed to enhance the population diversity and to avoid being trapped in local optima. In [21], Lin et al. put forward a chaotic Lévy flight bat algorithm (CLBA) for parameter estimation in nonlinear dynamic biological systems; in [13], Xie et al. introduced a difference operator and Lévy flight trajectory into the bat algorithm (DLBA) to solve function optimization problems; in [22], Yılmaz and Kucuksille, inspired by the standard particle swarm optimization (PSO) [8] and artificial bee colony (ABC) [9] algorithms, proposed the improved bat algorithm (IBA). In IBA, the velocity of each bat is updated with a linearly decreasing inertia weight factor, and the frequency is self-adaptive to improve exploration and exploitation. In [23], Wang and Guo put the harmony search method into the bat algorithm and developed a hybrid metaheuristic HSBA method for optimization problems, speeding up the convergence of the bat algorithm. In [24], Yılmaz et al. studied the mechanism for updating the loudness and pulse emission rate of BA and found that these two parameters provide a balance between exploitation and exploration; they therefore modified the update equations to improve the exploration capability of BA. Afrabandpey et al. [25] introduced chaotic sequences into the bat algorithm in different ways to avoid premature convergence. In [26], four parameters of BA are replaced by chaotic maps separately; in [20], the loudness and pulse emission rate are tuned by multiplying a linearly decreasing or increasing function by a chaotic map function; in [25], a chaotic map takes the place of the random number used for parameter initialization. These approaches can, to some extent, avoid getting trapped in local minima. In QMBA, by contrast, bats adopt different search strategies at different times and have a mechanism for jumping out of local optima; these strategies enhance the convergence speed of the algorithm and improve the accuracy of solutions.

The rest of paper is organized as follows. Section 2 describes the standard BA and Section 3 presents the quantum-behaved bat algorithm with the direction of mean best position. The simulation and comparison of this proposed algorithm are presented in Section 4. Finally, general conclusions are drawn in Section 5.

2. The Bat Algorithm

Echolocation is a very important characteristic of bats; Yang proposed the bat algorithm by mimicking bats' foraging behavior. Bats fly randomly in the air, using echolocation while searching for prey to catch food and to avoid obstacles. In order to turn these behaviors into an algorithm, some approximations and idealized rules are used [10]:
(i) All bats use echolocation to sense distance, and they also "know" the difference between food/prey and background barriers in some magical way.
(ii) Bats fly randomly with velocity v_i at position x_i with a fixed frequency f_min, varying wavelength λ, and loudness A_0 to search for prey. They can automatically adjust the wavelength (or frequency) of their emitted pulses and adjust the rate of pulse emission r ∈ [0, 1], depending on the proximity of their target.
(iii) Although the loudness can vary in many ways, we assume that it varies from a large (positive) A_0 to a minimum constant value A_min.

In the BA, the ith bat of the swarm has a position x_i (solution), a velocity v_i, and a frequency f_i. Each bat moves toward the current best position (solution), and its frequency, velocity, and position are updated during the course of iteration as follows:

f_i = f_min + (f_max − f_min)β,
v_i^t = v_i^{t−1} + (x_i^{t−1} − x*) f_i,
x_i^t = x_i^{t−1} + v_i^t,

where β is a random number drawn from a uniform distribution in [0, 1] and x* represents the current global best solution (position) after comparing all the solutions (positions) among all the bats. These equations guarantee the exploration ability of BA.
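A minimal Python sketch of this global update (the function name, the f_min/f_max defaults, and the array shapes are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def ba_global_update(x, v, x_best, f_min=0.0, f_max=2.0):
    """One global-search step of the standard BA: draw a per-bat frequency,
    pull the velocity toward the current best x*, then move."""
    beta = rng.random((x.shape[0], 1))   # uniform in [0, 1]
    f = f_min + (f_max - f_min) * beta   # f_i = f_min + (f_max - f_min) * beta
    v = v + (x - x_best) * f             # v_i^t = v_i^{t-1} + (x_i^{t-1} - x*) f_i
    x = x + v                            # x_i^t = x_i^{t-1} + v_i^t
    return x, v
```

Here x and v are (population, dimension) arrays and x_best is broadcast against them.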

For the local search, when a solution is selected among the current best solutions, a new candidate solution is generated as

x_new = x_old + ε A^t,

where ε is a random number in [−1, 1] that directs the new solution away from or toward the current best solution, and A^t is the mean loudness of all bats at time step t.

When finding prey, a bat gradually decreases its loudness and increases its rate of pulse emission in order to track and capture the prey. The loudness and pulse emission rate are updated as the iterations proceed, as shown in

A_i^{t+1} = α A_i^t,
r_i^{t+1} = r_i^0 [1 − exp(−γ t)],

where 0 < α < 1 and γ > 0 are constants. In fact, the parameter α controls the convergence of the bat algorithm and therefore plays a role similar to the cooling factor in the simulated annealing algorithm [27]. For simplicity, we set α = γ = 0.9 in our simulations.
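The local search step and the loudness/pulse-rate annealing can be sketched as follows (the defaults α = γ = 0.9 reflect a common choice in the BA literature and are an assumption here):

```python
import numpy as np

rng = np.random.default_rng(1)

def local_search(x_best, mean_loudness):
    # x_new = x_old + eps * A^t, with eps drawn uniformly from [-1, 1]
    eps = rng.uniform(-1.0, 1.0, size=x_best.shape)
    return x_best + eps * mean_loudness

def anneal(A, r0, t, alpha=0.9, gamma=0.9):
    # A_i^{t+1} = alpha * A_i^t ;  r_i^{t+1} = r_i^0 * (1 - exp(-gamma * t))
    return alpha * A, r0 * (1.0 - np.exp(-gamma * t))
```

As t grows, the loudness decays geometrically toward zero while the pulse rate rises toward its initial value r_i^0, shifting the algorithm from exploration to exploitation.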

The basic steps of BA can be summarized as the pseudocode shown in Algorithm 1.

Algorithm 1: Bat algorithm (BA) pseudocode.
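The basic loop can also be sketched in Python, following the rules above (the population size, bounds, and parameter defaults are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def bat_algorithm(obj, lb, ub, dim, n=20, iters=200,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, seed=0):
    """Compact sketch of the standard BA loop."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n, dim))
    v = np.zeros((n, dim))
    A = np.full(n, 1.0)                      # loudness A_i
    r0 = np.full(n, 0.5)                     # initial pulse rate r_i^0
    r = np.zeros(n)                          # current pulse rate r_i
    fit = np.array([obj(b) for b in x])
    best = x[fit.argmin()].copy()
    for t in range(1, iters + 1):
        f = f_min + (f_max - f_min) * rng.random((n, 1))
        v = v + (x - best) * f
        cand = np.clip(x + v, lb, ub)
        for i in range(n):
            if rng.random() > r[i]:          # local search around the best
                step = rng.uniform(-1.0, 1.0, dim) * A.mean()
                cand[i] = np.clip(best + step, lb, ub)
            fc = obj(cand[i])
            if fc <= fit[i] and rng.random() < A[i]:
                x[i], fit[i] = cand[i], fc   # accept the improvement,
                A[i] *= alpha                # grow quieter ...
                r[i] = r0[i] * (1.0 - np.exp(-gamma * t))  # ... and chirpier
        best = x[fit.argmin()].copy()
    return best, float(fit.min())
```

Because candidates are only accepted when they improve a bat's fitness, the best fitness is non-increasing over the run.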

3. Quantum-Behaved Bat Algorithm with the Direction of Mean Best Position

The standard bat algorithm converges quickly and is easy to implement; therefore, it has been widely applied in practical engineering. However, BA falls into local optima more easily when optimizing multimodal functions. There are several reasons for the premature convergence of BA. Firstly, through analysis of the trajectories of the bats, we found that many bats are trapped at local optima because the diversity of the population decreases. Secondly, the other bats are guided only by the current optimal solution, so if the best bat falls into a local optimum, it will misguide the others.

Thirdly, BA has no mechanism for jumping out of local optima. To address these problems, quantum behavior is introduced into the algorithm to increase the diversity of the population, which also helps to avoid premature convergence. The current global optimal solution is used to guide the other bats in the early stage of the search, while the mean best position is used in the later stage. This enhances the efficiency and convergence speed of BA.

Our proposed algorithm (QMBA) is based on the basic framework of the original bat algorithm. The parameters A_i and r_i control exploitation and exploration, respectively; updating these two factors guides BA into local or global search. However, the new candidate solutions are generated by the following rule, which differs from the original BA:

x_{i,j}^{t+1} = x_{i,j}^t + rand · (x*_j − x_{i,j}^t),  if d_{i,j} > TH,
x_{i,j}^{t+1} = x_{i,j}^t + (rand − 0.5) · d_{i,j},  otherwise,

where d_{i,j} = |x*_j − x_{i,j}^t| indicates the distance between the jth dimension of the ith bat's position and the jth dimension of the current optimal position among all bats, and rand is a random number in (0, 1). If d_{i,j} is greater than the threshold TH, the ith bat is far from the current optimal position; hence, it moves toward the best position found so far by a random step. However, if d_{i,j} is less than the threshold TH, the bat is near the current optimal position; therefore, it flies randomly. Self-adapting the step to the distance in this way improves the diversity of the bat population and its exploration ability.
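One way to realize this distance-adaptive rule in Python (the exact form of the near-best random step and the default value of TH are assumptions for illustration, inferred from the description above):

```python
import numpy as np

rng = np.random.default_rng(2)

def qmba_candidate(x_i, x_best, TH=0.5):
    """Per-dimension move: step toward x* when the bat is far from it,
    and take a small random local step when it is already close."""
    d = np.abs(x_best - x_i)            # distance to the best, per dimension
    r = rng.random(x_i.shape)           # rand in (0, 1)
    toward = x_i + r * (x_best - x_i)   # far (d > TH): pulled toward x*
    wander = x_i + (r - 0.5) * d        # near (d <= TH): random local step
    return np.where(d > TH, toward, wander)
```

Scaling the local step by d keeps the random walk proportional to how close the bat already is, so bats sitting near the best solution only jitter slightly.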

During the search, according to a certain mutation probability p_m, some bats are mutated with quantum behavior [28, 29]; these bats are updated by the following formulas:

x_{i,j}^{t+1} = x*_j + β · |mbest_j − x_{i,j}^t| · ln(1/u),  if k ≥ 0.5,
x_{i,j}^{t+1} = x*_j − β · |mbest_j − x_{i,j}^t| · ln(1/u),  if k < 0.5,

where u and k are random numbers uniformly distributed in (0, 1) and β is the contraction-expansion coefficient defined as

β = β_1 + (β_0 − β_1)(T_max − t)/T_max,

where β_0 and β_1 are the initial and the final values of β, respectively, and T_max is the maximum number of cycles. We usually set β_0 = 1 and β_1 = 0.5 to obtain good performance in general [30].
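A sketch of this quantum-behaved move in the QPSO style of Sun et al. [28–30] (using x* as the attractor is an assumption here; mbest is the average of the bats' personal best positions, e.g. pbest.mean(axis=0)):

```python
import numpy as np

rng = np.random.default_rng(3)

def contraction_expansion(t, t_max, beta0=1.0, beta1=0.5):
    # beta decreases linearly from beta0 at t = 0 to beta1 at t = t_max
    return beta1 + (beta0 - beta1) * (t_max - t) / t_max

def quantum_move(x_i, x_best, mbest, t, t_max):
    """QPSO-style mutation: x = x* +/- beta * |mbest - x| * ln(1/u)."""
    beta = contraction_expansion(t, t_max)
    u = rng.random(x_i.shape)
    sign = np.where(rng.random(x_i.shape) < 0.5, 1.0, -1.0)
    return x_best + sign * beta * np.abs(mbest - x_i) * np.log(1.0 / u)
```

The heavy-tailed ln(1/u) factor occasionally produces long jumps, which is what lets mutated bats escape local optima.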

mbest is the mean best position, defined as the average of the best positions of all bats. That is,

mbest_j = (1/N) Σ_{i=1}^{N} p_{i,j},  j = 1, …, D,

where p_i indicates the best position the ith bat has experienced, N is the population size, and D is the dimension of the problem. Bats with quantum behavior increase the diversity of the population and contribute to jumping out of local optima.

In the late stage of the search, the positions of the bats are updated as follows:

v_i^t = v_i^{t−1} + (x_i^{t−1} − mbest) f_i,
x_i^t = x_i^{t−1} + v_i^t.

The mean best position is used to guide the other bats in the late stage of the search; because it uses the statistical information of the bats' better positions, it improves the accuracy of solutions and speeds up the convergence of the algorithm. The pseudocode of the quantum-behaved bat algorithm with mean best position is shown in Algorithm 2.
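Under the reading that the late stage keeps the same form as the early global update but directs bats by mbest instead of the single best bat (an assumption for illustration), the step could look like:

```python
import numpy as np

rng = np.random.default_rng(4)

def late_stage_update(x, v, mbest, f_min=0.0, f_max=2.0):
    """Late-stage move: the BA global update with the mean best position
    replacing the current single best bat as the attractor."""
    f = f_min + (f_max - f_min) * rng.random((x.shape[0], 1))
    v = v + (x - mbest) * f
    return x + v, v
```

Because mbest aggregates all personal bests, no single misled bat can dominate the direction of the swarm in this phase.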

Algorithm 2: Bat algorithm with mean best position (QMBA) pseudocode.

4. The Simulations

In order to verify the efficiency of the proposed algorithm, we select twenty-four standard benchmark functions [24, 31, 32] to test the search ability of QMBA. The results are compared with BA and other variants of BA to show the performance on global numerical optimization. For this purpose, we used BA, IBA, and HSBA to carry out numerical experiments on the 24 standard benchmarks, and the results are discussed in this section.

4.1. Benchmark Functions

There are many different functions used to test the performance of algorithms. These benchmark functions can be divided into four categories: category I is unimodal, category II is multimodal, category III is shifted or rotated, and category IV is composite functions. Table 1 lists these functions, where D indicates the dimension of the function, Range is the boundary of the function's search space, and f_min is the minimum value of the function. A unimodal function (F1–F6) has only one extreme point, which the algorithms can easily find; a multimodal function (F7–F12), however, has many local minima, and it is relatively difficult to find the global minimum. As the global minimum of most benchmark functions is zero in all dimensions, we select some shifted and rotated functions to test the algorithms' robustness. The shifted and rotated functions (F13–F18) have different parameter values for different dimensions, as can be seen in Table 1. F13, F14, and F15 are shifted functions whose global optimum is shifted to a random position, z = x − o_new, where o_new defines the new shifted optimum position of the primitive test function. The other way is to rotate the function using z = M x, where M is an orthogonal rotation matrix, as in F16, F17, and F18. Finally, the composite test functions (F19–F24) combine different test functions by stretching and manipulating their coverage ranges. It should be noted that M_i defines the linear transformation matrix for each component function f_i, σ_i is used to control each f_i's coverage range, and λ_i indicates the stretch or compression level of the primitive test functions (see [32] for a detailed description of the composite functions). These composite functions make it challenging to find the global optimum, but they can effectively verify the searching capability of the algorithms.

Table 1: Benchmark function.
4.2. Parameter Setting

In order to determine whether QMBA algorithm can be as effective as BA and other variants of BA, we compared its performance on numerical optimization with BA and other variants of BA, which include BA with inertia weight (IBA) [22], modified bat algorithm (MBA) [24], BA with harmony search (HSBA) [23], and bat algorithm based on chaotic map (CBA) [25].

All these algorithms are tested over 30 independent runs; the number of bats in the population is fixed to 50, the dimension of the problem is 30, and the maximum number of iterations is set to 900, except for HSBA, for which it is 300, so all the algorithms use 45000 function evaluations (FEs). The other parameter settings of these algorithms are given in Table 2.

Table 2: The parameter settings of these algorithms.
4.3. Comparison of Experiment Results

The comparisons of the test results of BA and the other variants of BA are shown in Tables 3, 5, 7, and 9. These tables report the mean fitness values, maximum values, minimum values, and standard deviations obtained over the thirty trials.

Table 3: Experimental results of unimodal benchmark functions by algorithms (best results in bold).

According to Derrac et al. [33], to improve the evaluation of evolutionary algorithms’ performance, statistical tests should be conducted. In other words, it is not enough to compare algorithms based on the mean and standard deviation values. A statistical test is necessary to prove that a proposed new algorithm presents a significant improvement over other existing methods for a particular problem.

In order to judge whether the results of QMBA differ from the best results of the other algorithms in a statistically significant way, a nonparametric statistical test, the Wilcoxon rank-sum test [32, 33], was carried out at a 5% significance level. The p values comparing QMBA and the other algorithms over all the benchmark functions are given in Tables 4, 6, 8, and 10. In these tables, according to Derrac et al. [33], p values less than 0.05 can be considered strong evidence against the null hypothesis.
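As a concrete illustration, a rank-sum comparison at the 5% level can be computed from scratch with the normal approximation (the samples below are synthetic, not the paper's results; in practice scipy.stats.ranksums does the same):

```python
import numpy as np
from math import erfc, sqrt

def ranksum_p(a, b):
    """Two-sided Wilcoxon rank-sum p value via the normal approximation
    (assumes continuous data, i.e. no ties between samples)."""
    n1, n2 = len(a), len(b)
    # double argsort turns values into 0-based ranks; +1 makes them 1-based
    ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1.0
    w = ranks[:n1].sum()                         # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2.0
    sd = sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return erfc(abs(w - mean) / sd / sqrt(2.0))  # two-sided tail probability
```

A p value below 0.05 from two 30-run fitness samples would then count as strong evidence that the two algorithms differ.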

Table 4: p values calculated by the rank-sum test on unimodal benchmark functions.
Table 5: Experimental results of multimodal benchmark functions by algorithms (best results in bold).
Table 6: p values calculated by the rank-sum test on multimodal benchmark functions.
Table 7: Experimental results of shifted and rotated benchmark functions by algorithms (best results in bold).
Table 8: p values calculated by the rank-sum test on shifted and rotated benchmark functions.
Table 9: Experimental results of hybrid composite benchmark functions by algorithms (best results in bold).
Table 10: p values calculated by the rank-sum test on hybrid benchmark functions.

In the following subsections, the details of the results and discussion at each group of benchmark functions are provided.

Unimodal functions can evaluate the exploitation capability of an optimization algorithm because they have only one global optimum. For function F1, the Sphere Function, only QMBA obtains a solution near the global optimum; the other algorithms are trapped in local minima. F2 is Schwefel's Problem 2.22. For this function, HSBA and QMBA outperform all the other algorithms; however, QMBA offers the highest accuracy. Function F3, Schwefel's Problem 1.2, is difficult to optimize because it has too many terms in high dimensions, but the mean fitness value of QMBA is the best among all the algorithms. In the results for F4, QMBA performs significantly better than any other algorithm. The fifth function, F5, is the well-known Rosenbrock function, which has a very narrow valley from the local optimum to the global optimum, so algorithms are easily trapped in the local optimum; nevertheless, QMBA obtains the best solution among all the algorithms, and its advantage over the competitors is statistically significant. In the results for F6, QMBA offers the highest accuracy. As can be seen in Table 6, the p values indicate that QMBA achieves a significant improvement on all the unimodal benchmark functions compared to the other algorithms.

The results on the multimodal functions (F7–F12) are provided in Table 5. It should be noted that these benchmark functions have many local optima, with their number increasing exponentially with the dimension, so they are useful for evaluating the exploration ability of an optimization algorithm. As observed in Tables 5 and 6, QMBA performs better than the other algorithms on all the multimodal benchmark functions. This means QMBA also has very good exploration ability.

The results on the shifted and rotated functions are provided in Tables 7 and 8. These functions can be used to test the algorithms' robustness. As the mean, standard deviation, minimum, and maximum show, QMBA performs better than the other algorithms on two of the shifted and rotated functions (F14, F16). HSBA has the best results on two, and IBA and MBA on one each in category III. It should be noted that HSBA and MBA are not significantly better than QMBA on functions F13 and F15, respectively. So it can be claimed that the results of QMBA on these functions are slightly better than those of the other algorithms.

The results on the hybrid composite benchmark functions are provided in Tables 9 and 10. For function F19, QMBA has the best results, but the p values in Table 10 show that they are not significantly better than those of HSBA. On the remaining functions (F22, F23, and F24), the results for QMBA are significantly better than those of the other algorithms; the exceptions are functions F20 and F21. Therefore, the results strongly show that QMBA also has high performance in dealing with complex problems.

Furthermore, the convergence graphs of BA and the other variants of BA, which present the process of optimization, are shown in Figures 1–24. The fitness values shown in Figures 1–24 are the mean objective function optima obtained from 30 independent runs of each algorithm. Figure 1 shows the results achieved by the six methods on the Sphere Function. From Figures 1, 2, 3, 4, 5, and 6, we can clearly conclude that QMBA is significantly superior to all the other algorithms during the process of optimization, and it also has the fastest convergence rate.

Figure 1: The curve of fitness value for F1.
Figure 2: The curve of fitness value for F2.
Figure 3: The curve of fitness value for F3.
Figure 4: The curve of fitness value for F4.
Figure 5: The curve of fitness value for F5.
Figure 6: The curve of fitness value for F6.
Figure 7: The curve of fitness value for F7.
Figure 8: The curve of fitness value for F8.
Figure 9: The curve of fitness value for F9.
Figure 10: The curve of fitness value for F10.
Figure 11: The curve of fitness value for F11.
Figure 12: The curve of fitness value for F12.
Figure 13: The curve of fitness value for F13.
Figure 14: The curve of fitness value for F14.
Figure 15: The curve of fitness value for F15.
Figure 16: The curve of fitness value for F16.
Figure 17: The curve of fitness value for F17.
Figure 18: The curve of fitness value for F18.
Figure 19: The curve of fitness value for F19.
Figure 20: The curve of fitness value for F20.
Figure 21: The curve of fitness value for F21.
Figure 22: The curve of fitness value for F22.
Figure 23: The curve of fitness value for F23.
Figure 24: The curve of fitness value for F24.

Figures 7, 8, 9, 10, 11, and 12 show the fitness curves for the multimodal functions. As can be seen from these figures, BA is the most easily trapped in local minima, especially on functions F8 and F9, and it has the slowest convergence rate; QMBA again has the best results and the fastest convergence rate. HSBA is shown to have the second-best overall performance.

Figures 13, 14, 15, 16, 17, and 18 show the fitness curves for the shifted and rotated functions. As can be seen, CBA has a faster convergence rate than the other algorithms; however, it cannot obtain better results except on F15 and F18. This is probably because CBA suffers from premature convergence.

Figures 19, 20, 21, 22, 23, and 24 show the fitness curves for the hybrid composite functions. From these figures, we observe that QMBA has the fastest convergence rate compared to the other algorithms on four of the six functions, while IBA and CBA severely suffer from premature convergence in these complex cases, especially on functions F22, F23, and F24.

5. Conclusions

In this paper, a novel variant of the bat algorithm, namely QMBA, is proposed by introducing quantum-behaved bats directed by the mean best position during the search. In QMBA, the position of each bat depends not only on the current optimal solution but also, as the iterations proceed, on the mean best position. During the search, quantum behavior is introduced to increase the diversity of the population and to keep the bats from all getting trapped in a local minimum. In other words, the current optimal solution leads the global search to guarantee that all the bats converge, while the quantum-behaved bats and the mean best position lead the local search to jump out of local positions.

This new method can also speed up the global convergence rate without losing the strong robustness of the basic BA. Twenty-four benchmark test functions were tested and compared with other variant bat algorithms for numerical optimization. From the analysis of the simulation results, we observed that the proposed QMBA uses the information in past solutions effectively to generate better-quality solutions frequently; the comparison with the other variants of BA showed that this approach is simple and efficient and can achieve more accurate solutions.

Our future work will focus on applying the QMBA algorithm to engineering optimization problems and on developing new hybrid metaheuristic approaches to solve more complex optimization problems.

Competing Interests

The authors declare that there are no competing interests regarding the publication of this paper.

Acknowledgments

This work is supported by National Science Foundation of China under Grant no. 61201177.

References

  1. M. Srinivas and L. M. Patnaik, “Adaptive probabilities of crossover and mutation in genetic algorithms,” IEEE Transactions on Systems, Man and Cybernetics, vol. 24, no. 4, pp. 656–667, 1994.
  2. R. Storn and K. Price, “Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  3. M. Dorigo, V. Maniezzo, and A. Colorni, “Ant system: optimization by a colony of cooperating agents,” IEEE Transactions on Systems, Man, and Cybernetics Part B: Cybernetics, vol. 26, no. 1, pp. 29–41, 1996.
  4. I. Fister Jr., X.-S. Yang, and J. Brest, “A comprehensive review of firefly algorithms,” Swarm and Evolutionary Computation, vol. 13, no. 1, pp. 34–46, 2013.
  5. X.-S. Yang, “Firefly algorithms for multimodal optimization,” in Stochastic Algorithms: Foundations and Applications, vol. 5792, pp. 169–178, Springer, Berlin, Germany, 2010.
  6. X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), IEEE, Coimbatore, India, 2009.
  7. X.-S. Yang and S. Deb, “Cuckoo search: recent advances and applications,” Neural Computing and Applications, vol. 24, no. 1, pp. 169–174, 2014.
  8. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, Perth, Australia, November–December 1995.
  9. D. Karaboga, “An idea based on honey bee swarm for numerical optimization,” Tech. Rep. TR06, Erciyes University, Engineering Faculty, Computer Engineering Department, 2005.
  10. X.-S. Yang, “A new metaheuristic bat-inspired algorithm,” in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), vol. 284, pp. 65–74, Springer, 2010.
  11. X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2010.
  12. E. Talbi, Metaheuristics: From Design to Implementation, John Wiley & Sons, 2009.
  13. J. Xie, Y. Zhou, and H. Chen, “A novel bat algorithm based on differential operator and Lévy flights trajectory,” Computational Intelligence and Neuroscience, vol. 2013, Article ID 453812, 13 pages, 2013.
  14. Y. Zhou, J. Xie, and H. Zheng, “A hybrid bat algorithm with path relinking for capacitated vehicle routing problem,” Mathematical Problems in Engineering, vol. 2013, Article ID 392789, 10 pages, 2013.
  15. J. Xie, Y. Zhou, and H. Zheng, “A hybrid metaheuristic for multiple runways aircraft landing problem based on bat algorithm,” Journal of Applied Mathematics, vol. 2013, Article ID 742653, 8 pages, 2013.
  16. A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat algorithm for constrained optimization tasks,” Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013.
  17. X.-S. Yang and A. Hossein Gandomi, “Bat algorithm: a novel approach for global engineering optimization,” Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
  18. A. Kaveh and P. Zakian, “Enhanced bat algorithm for optimal design of skeletal structures,” Asian Journal of Civil Engineering, vol. 15, no. 2, pp. 179–212, 2014.
  19. X.-S. Yang, “Bat algorithm for multi-objective optimisation,” International Journal of Bio-Inspired Computation, vol. 3, no. 5, pp. 267–274, 2011.
  20. A. Rezaee Jordehi, “Chaotic bat swarm optimisation (CBSO),” Applied Soft Computing, vol. 26, pp. 523–530, 2015.
  21. J. Lin, C. Chou, C. Yang, and H. Tsai, “A chaotic Levy flight bat algorithm for parameter estimation in nonlinear dynamic biological systems,” Computer and Information Technology, vol. 2, no. 2, pp. 56–63, 2012.
  22. S. Yılmaz and E. U. Kucuksille, “Improved Bat Algorithm (IBA) on continuous optimization problems,” Lecture Notes on Software Engineering, vol. 1, no. 3, pp. 279–283, 2013.
  23. G. Wang and L. Guo, “A novel hybrid bat algorithm with harmony search for global numerical optimization,” Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.
  24. S. Yılmaz, E. U. Kucuksille, and Y. Cengiz, “Modified bat algorithm,” Elektronika ir Elektrotechnika, vol. 20, no. 2, pp. 71–78, 2014.
  25. H. Afrabandpey, M. Ghaffari, A. Mirzaei, and M. Safayani, “A novel Bat Algorithm based on chaos for optimization tasks,” in Proceedings of the Iranian Conference on Intelligent Systems (ICIS '14), pp. 1–6, IEEE, Bam, Iran, February 2014.
  26. A. H. Gandomi and X.-S. Yang, “Chaotic bat algorithm,” Journal of Computational Science, vol. 5, no. 2, pp. 224–232, 2014.
  27. S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, “Optimization by simulated annealing,” Science, vol. 220, no. 4598, pp. 671–680, 1983.
  28. J. Sun, B. Feng, and W. Xu, “Particle swarm optimization with particles having quantum behavior,” in Proceedings of the Congress on Evolutionary Computation (CEC '04), vol. 1, pp. 325–331, June 2004.
  29. J. Sun, W. Fang, V. Palade, X. Wu, and W. Xu, “Quantum-behaved particle swarm optimization with Gaussian distributed local attractor point,” Applied Mathematics and Computation, vol. 218, no. 7, pp. 3763–3775, 2011.
  30. J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, “Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection,” Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.
  31. S. Mirjalili, S. M. Mirjalili, and X.-S. Yang, “Binary bat algorithm,” Neural Computing and Applications, vol. 25, no. 3-4, pp. 663–681, 2014.
  32. P. N. Suganthan, N. Hansen, J. J. Liang et al., “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” KanGAL Report, 2005.
  33. J. Derrac, S. García, D. Molina, and F. Herrera, “A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms,” Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.