Scientific Programming
Volume 2016, Article ID 8031560, 13 pages
Research Article

Modified Bat Algorithm Based on Lévy Flight and Opposition Based Learning

1School of Science, China University of Petroleum, Qingdao 266580, China
2College of Mechanical and Electronic Engineering, China University of Petroleum, Qingdao 266580, China
3School of Economics and Management, China University of Petroleum, Qingdao 266580, China

Received 14 July 2016; Accepted 25 October 2016

Academic Editor: Xiang Li

Copyright © 2016 Xian Shan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Bat Algorithm (BA) is a swarm intelligence algorithm which has been intensively applied to solve academic and real life optimization problems. However, due to the lack of a good balance between exploration and exploitation, BA sometimes fails to find the global optimum and is easily trapped in local optima. In order to overcome the premature convergence problem and improve the local search ability of Bat Algorithm, we propose an improved BA called OBMLBA. In the proposed algorithm, a modified search equation drawing on more useful information from the search experience is introduced to generate candidate solutions, and Lévy Flight random walk is incorporated into BA in order to avoid being trapped in local optima. Furthermore, the concept of opposition based learning (OBL) is embedded in BA to enhance diversity and convergence capability. To evaluate the performance of the proposed approach, 16 benchmark functions have been employed. The experimental results demonstrate the effectiveness and efficiency of OBMLBA for global optimization problems. Comparisons with other BA variants and other state-of-the-art algorithms show that the proposed approach significantly improves the performance of BA. The performance of the proposed algorithm on large scale and real world optimization problems is not discussed in this paper and will be studied in future work.

1. Introduction

With the development of natural science and technology, a large number of complicated optimization problems arise in a variety of fields, such as engineering, manufacturing, information technology, and economic management. These real world problems usually have an objective function with many decision variables and constraints and often exhibit discontinuous, nonconvex features [1]. Optimization methods are needed to obtain the optimal solutions of these problems.

Optimization algorithms proposed by researchers can be categorized into two groups: deterministic algorithms and stochastic algorithms. Deterministic algorithms usually need gradient information. They are effective for problems with one global optimum, but they may fail on problems with several local optima or problems whose gradient information is unavailable. Compared with deterministic algorithms, stochastic algorithms require only the values of the objective function [2]. They have been successfully applied to many optimization problems characterized as nondifferentiable and nonconvex.

Swarm intelligence algorithms, regarded as a subset of stochastic algorithms, have been widely used to solve optimization problems during the past decades. They are inspired by the collective behavior of swarms in nature. Researchers have observed such behaviors of animals, plants, or humans, analyzed the driving force behind the phenomena, and then proposed various types of algorithms. For example, Genetic Algorithm (GA) [3] and Differential Evolution (DE) [4] were developed by drawing inspiration from biological evolution and natural selection. Particle Swarm Optimization (PSO) [5] was proposed by simulating the social and cognitive behavior of fish or bird swarms. Ant Colony Optimization (ACO) [6] and Artificial Bee Colony (ABC) [7] were introduced based on the foraging behavior of ants and bees. Cat Swarm Optimization (CSO) [8], Cuckoo Search (CS) [9], and Bat Algorithm (BA) [10] were likewise inspired by intelligent features of swarm behavior.

Bat Algorithm, proposed by Yang in 2010, is a swarm intelligence algorithm inspired by the echolocation behavior of bats. Echolocation is a type of sonar used by bats to detect prey, hunt, and avoid obstacles. With the help of echolocation, bats can not only detect the distance of the prey but also identify its shape, position, and angle [10]. Compared with other existing swarm intelligence algorithms, BA has advantages such as fewer control parameters, good global optimization ability, and simplicity of implementation, and it has shown excellent performance on some classes of optimization problems. Due to its simplicity, convergence speed, and population-based nature, it has been intensively used to solve academic and real life problems, such as multiobjective optimization [11], engineering optimization [12], cluster analysis [13], scheduling problems [14], structure optimization [15], image processing [16], manufacturing design [17], and various other problems.

It has been shown that BA is an excellent and powerful algorithm for global, discrete, and constrained optimization problems. However, due to the lack of a good balance between exploration and exploitation in basic BA, the algorithm sometimes fails to find the global optimum and is easily trapped in local optima. Much effort has been made to improve the performance of BA. These improvements or modifications can be divided into two categories.

Introduction of New Search Mechanisms for Generating Candidate Solutions. Inspired by DE, Xie et al. [18] presented a modified approach DLBA with a new solution update mechanism combining the concept of differential operator and Lévy Flight trajectory. Li and Zhou [19] embedded the complex value encoding mechanism into Bat Algorithm to improve the diversity and convergence performance. Gandomi and Yang [20] introduced chaotic maps into the population initialization and position update equation and proposed a modified algorithm. Bahmani-Firouzi and Azizipanah-Abarghooee [21] proposed four improved velocity update equations in order to achieve a better balance between the exploration and exploitation. Jaddi et al. [22] divided the swarm population into two subpopulation groups with each group designed to generate solutions according to different equations, respectively. Yilmaz and Küçüksille [23] defined two modification structures of the velocity equation inspired by PSO and then hybridized the modified BA with Invasive Weed Optimization algorithm. The velocity equations used information of the best-so-far solution and the neighborhood solutions, which can effectively enhance the search ability of BA.

Hybridization of BA with Other Operators. For example, Khan and Sahai [24] proposed a hybrid approach involving PSO, HS, and SA for generating new solutions. Wang and Guo [25] introduced a mutation operator into BA using SA. He et al. [26] developed a hybrid BA which combines SA and Gauss distribution. Sadeghi et al. [27] developed a hybrid BA with local search based on PSO. Lin et al. [28] developed a chaotic BA with Lévy Flights and parameters estimated by chaotic maps. Meng et al. [29] incorporated the bats’ habitat selection and their self-adaptive compensation for Doppler effect in echoes into the basic BA and proposed a new self-adaptive modified algorithm NBA.

The study of BA is not limited to the above two aspects, and more work can be found in [30–34].

As technology and science develop rapidly, more and more complicated optimization problems need to be solved by advanced numerical methods. Since BA is an important heuristic optimization algorithm, research on how to improve its convergence accuracy and efficiency is of great significance both for heuristic optimization theory and for solving real world optimization problems.

Exploration and exploitation are two critical characteristics in the updating process of an algorithm. Exploration is the ability of the algorithm to find global optimum solutions in different areas of the search space, while exploitation refers to the capability of the algorithm to find better solutions using previous experience. Research in the literature shows that, for a swarm intelligence algorithm, the exploration capability should be employed first so as to search the whole space, while the exploitation capability should be applied later, improving the quality of the solutions in the local search process [23].

To achieve a better balance between the exploration and exploitation behavior of BA, it is necessary to modify the global search approach for exploring the search region and to develop a local search method for exploiting nonvisited regions. Accordingly, modifications of the search equation and local search strategies for BA are studied in this research, and a modified algorithm called OBMLBA is proposed. The main difference between OBMLBA and other BAs is the interaction behavior between bats. In the proposed approach, a bat explores the search space with the help of the best-so-far solution and neighborhood solutions by using echolocation, which balances exploration and exploitation well. The new population is generated by a new modified position update equation with frequencies defined by a sinusoidal function. Lévy Flight random walk is used to exploit the local search space, whereas opposition based learning is used to construct the opposite population OP. Individuals of the current population and OP are merged together, and the best individuals are kept as the new population for the next generation. The proposed method has been tested on 16 benchmark functions. Comparison with variants of BA and other state-of-the-art algorithms demonstrates the effectiveness and superiority of OBMLBA in solving optimization problems.

The paper is organized as follows. After the introduction, the basic concept of BA is introduced in Section 2. The proposed OBMLBA is presented in Section 3. Section 4 evaluates the performance of the proposed algorithm by comparing results with other algorithms. Finally, conclusions and future work are discussed in Section 5.

2. Basic Bat Algorithm

The optimization problems considered in this paper are single-objective, unconstrained continuous problems to be minimized.

Bat Algorithm is a heuristic algorithm proposed by Yang in 2010. It is based on the echolocation capability of microbats, which guides their foraging behavior. In BA, the position of a bat represents a possible solution of the given optimization problem. The position of the food source found by the i-th bat can be expressed as x_i = (x_i1, x_i2, …, x_iD). The fitness of x_i corresponds to the quality of the position the bat occupies.

2.1. Initiation

At the beginning, bats do not know the location of food sources. They generate a randomly distributed population of N solutions, where N denotes the number of bats in the population. Each solution component can be produced within the search space as follows:

x_ij = x_j^min + rand(0, 1) · (x_j^max − x_j^min),   (1)

where i = 1, 2, …, N and j = 1, 2, …, D. x_j^max and x_j^min denote the upper and lower bounds of the solution in dimension j, respectively, and rand(0, 1) is a uniformly distributed value generated in the range [0, 1].
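The initialization step can be sketched in Python as follows (a minimal illustration; the function name and the use of NumPy are our own, not from the paper):

```python
import numpy as np

def init_population(n_bats, dim, lower, upper, rng=None):
    """Uniform random initialization: x_ij = lb_j + rand(0,1) * (ub_j - lb_j)."""
    rng = np.random.default_rng() if rng is None else rng
    return lower + rng.random((n_bats, dim)) * (upper - lower)
```

Each row of the returned array is one bat's position, drawn independently and uniformly within the per-dimension bounds.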

2.2. Generation of New Solutions

Bats modify the solution based on the information of the current location and the best source food location. Bats navigate by adjusting their flying directions using their own and other swarm members’ best experience to find the optimum of the problem.

At this stage, for each food source position x_i^(t−1), a new position x_i^t is formulated as follows:

f_i = f_min + (f_max − f_min) · β,   (2)
v_i^t = v_i^(t−1) + (x_i^(t−1) − x*) · f_i,
x_i^t = x_i^(t−1) + v_i^t,

where i = 1, 2, …, N indexes the bats in the population and t represents the t-th iteration. x_i^t and v_i^t are the position and velocity of the i-th bat at the t-th iteration. f_i denotes the pulse frequency that affects the velocity of the i-th bat, and f_max and f_min represent the maximum and minimum of f_i. β is a random number in [0, 1]. x* is the best position found by the whole population.
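Under Yang's standard formulation [10], the global update step can be sketched as follows (function name and vectorization are illustrative):

```python
import numpy as np

def ba_update(x, v, x_best, f_min=0.0, f_max=2.0, rng=None):
    """One global-search step of basic BA:
    f_i = f_min + (f_max - f_min) * beta,  beta ~ U(0, 1)
    v_i <- v_i + (x_i - x*) * f_i
    x_i <- x_i + v_i
    """
    rng = np.random.default_rng() if rng is None else rng
    beta = rng.random((x.shape[0], 1))          # one frequency draw per bat
    f = f_min + (f_max - f_min) * beta
    v_new = v + (x - x_best) * f
    return x + v_new, v_new
```

Each bat's frequency is resampled every iteration, so the pull toward the global best position varies randomly between steps.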

2.3. Local Search

Once the new solutions are generated, a local search is invoked by the bats' random walk. If the pulse emission rate r_i of the i-th bat is smaller than a random number, a solution is selected from the population and a new position is generated as follows:

x_new = x_old + ε · A^t,

where x_old represents a solution chosen from the current population by some mechanism, ε is a random vector drawn from a uniform distribution on [−1, 1], and A^t is the average loudness of all bats at time step t.
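A minimal sketch of this random walk, assuming ε is drawn uniformly from [−1, 1] per dimension (helper name is ours):

```python
import numpy as np

def local_walk(x_selected, mean_loudness, rng=None):
    """Random walk around a selected solution: x_new = x_old + eps * A_mean,
    with eps drawn uniformly from [-1, 1] in each dimension."""
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.uniform(-1.0, 1.0, size=x_selected.shape)
    return x_selected + eps * mean_loudness
```

As the average loudness decreases over the run, the walk's step size shrinks, so the local search becomes progressively finer.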

2.4. Solutions, Loudness, and Pulse Emission Rate Updating

If a random number is smaller than the loudness A_i and the new solution improves the objective value, the new solution x_i^t is accepted. At the same time, the loudness is decreased while the pulse emission rate is increased as follows:

A_i^(t+1) = α · A_i^t,
r_i^(t+1) = r_i^0 · [1 − exp(−γ · t)],

where α and γ are constants. The initial loudness A_i^0 and initial pulse emission rate r_i^0 are randomly generated numbers in the ranges [1, 2] and [0, 1], respectively.
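The acceptance test and the two schedules can be sketched as follows (the values α = γ = 0.9 are common defaults in the BA literature, not necessarily this paper's settings):

```python
import math
import random

def accept_and_update(f_new, f_old, A_i, r0_i, t, alpha=0.9, gamma=0.9):
    """Greedy acceptance gated by loudness, then the standard schedules
    A_i <- alpha * A_i (only on acceptance) and r_i <- r_i^0 * (1 - exp(-gamma * t))."""
    accepted = (random.random() < A_i) and (f_new < f_old)
    if accepted:
        A_i = alpha * A_i
    r_i = r0_i * (1.0 - math.exp(-gamma * t))
    return accepted, A_i, r_i
```

Loudness decays toward zero while the pulse rate rises toward r_i^0, mimicking a bat that turns down its calls as it closes in on prey.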

The pseudocode of Bat Algorithm is presented in Algorithm 1.

Algorithm 1: Basic Bat Algorithm.

3. Enhanced Bat Algorithm

3.1. Location Update Equation Based on DE

Yilmaz and Küçüksille [23] pointed out that the velocity update equation in basic BA provides guidance only from the best-so-far solution, which makes BA easily entrapped in a local minimum. In order to improve the search ability, they proposed a modified equation:

v_i^t = w^t · v_i^(t−1) + λ1 · (x* − x_i^(t−1)) · f_i + λ2 · (x_k − x_i^(t−1)) · f_i,
w^t = w0 · ((T_max − t) / T_max)^n,

where x_k is a solution randomly chosen from the population and λ1 and λ2 are learning factors ranging from 0 to 1. w0 is the initial value of the inertia weight w. T_max is the maximal number of iterations, t is the current iteration, and n is a nonlinear index.

The second term in the modified search equation affects the velocity of the solution under the guidance of the best solution x*, which emphasizes exploitation. The third term affects the velocity with the help of the randomly selected solution x_k, which emphasizes exploration. The effects of the best solution and the k-th solution are adjusted by changing the values of λ1 and λ2. Experimental results indicate that the modified algorithm outperforms BA on most test functions.

Xie et al. [18] proposed an improved BA (DLBA) based on the Differential Evolution operator. The position updating equations were defined as follows:

v_i^t = w · v_i^(t−1) + (p* − x_i^(t−1)) · f1 + (x_k − x_i^(t−1)) · f2,
x_i^t = x_i^(t−1) + v_i^t,

where p* is the global best location in the current population and x_k (k ≠ i) is a randomly chosen solution in the population. f1 and f2 are defined as follows:

f1 = f1,min + (f1,max − f1,min) · β,   (11)
f2 = f2,max − (f2,max − f2,min) · β,   (12)

where f1,min and f1,max are the minimum and maximum frequency values of f1, while f2,min and f2,max are the minimum and maximum frequency values of f2. w is a fixed parameter, t is the number of iterations, and β is a random vector drawn from a uniform distribution. The modified algorithm has shown better performance than classical BA.

Inspired by Yilmaz's and Zhou's methods, we propose a modified location updating equation:

x_i^t = x_i^(t−1) + (x* − x_i^(t−1)) · f1 + (x_r1^(t−1) − x_r2^(t−1)) · f2,

where x_r1 and x_r2 are individuals randomly selected from the population and f1 and f2 are frequencies.

As in the mutation operation of DE, the frequency f is an important parameter that affects the convergence performance of the algorithm. The value of f changes according to a predefined formula: in basic BA and DLBA, f is defined in (2), (11), and (12), respectively. Here we use a sinusoidal formulation [35], which permits certain flexibility in changing the direction of f. The frequencies f1 and f2 are defined as follows:

f1^t = (1/2) · [sin(2π · freq · t) · (t / T_max) + 1],
f2^t = (1/2) · [sin(2π · freq · t + π) · (t / T_max) + 1],

where freq represents the frequency of the sinusoidal function.
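A hypothetical sketch of such sinusoidal schedules and the modified position update; the oscillating-term shapes below follow the spirit of Draa et al.'s sinusoidal parameter schedules [35] and may differ in detail from the paper's exact formulas, and all names are our own:

```python
import numpy as np

def sinusoidal_freqs(t, t_max, freq=0.25):
    """Assumed sinusoidal schedules: an oscillating term scaled by the
    iteration ratio t/t_max, mapped into [0, 1]."""
    scale = t / t_max
    f1 = 0.5 * (np.sin(2 * np.pi * freq * t) * scale + 1.0)
    f2 = 0.5 * (np.sin(2 * np.pi * freq * t + np.pi) * scale + 1.0)
    return f1, f2

def modified_update(x, x_best, x_r1, x_r2, f1, f2):
    """Assumed shape of the modified update: guidance from the best solution
    plus a DE-style differential term from two randomly chosen individuals."""
    return x + (x_best - x) * f1 + (x_r1 - x_r2) * f2
```

Because the sine term changes sign over the run, f1 and f2 periodically shift the relative weight between exploitation (pull toward the best) and exploration (the differential term).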

3.2. Local Search Based on Lévy Flight

Lévy Flight, which is based on the flight behavior of animals and insects, has been widely studied in the literature [36]. It is a kind of non-Gaussian stochastic process, a random walk whose step sizes are drawn from a Lévy stable distribution.

There are several ways to generate Lévy-distributed values, and Mantegna's algorithm [37] has proved to be among the most efficient. It has been adopted in many evolutionary algorithms in order to escape from local minima.

When generating a new solution x_i^new for the i-th solution x_i by performing a Lévy Flight, the new candidate is defined as follows:

x_i^new = x_i + α ⊕ Lévy(λ),

where α is a random step size parameter, λ is the Lévy Flight distribution parameter, and ⊕ denotes entry-wise multiplication.

In Mantegna's algorithm, the step size s can be defined as follows:

s = u / |v|^(1/β),

where u and v are obtained from normal distributions; that is,

u ~ N(0, σ_u²),   v ~ N(0, σ_v²),

with

σ_u = {Γ(1 + β) · sin(πβ/2) / [Γ((1 + β)/2) · β · 2^((β−1)/2)]}^(1/β),   σ_v = 1.
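Mantegna's step-size computation can be sketched directly from these formulas (helper name is ours; β = 1.5 is a common choice):

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    """Mantegna's algorithm: s = u / |v|^(1/beta), with u ~ N(0, sigma_u^2)
    and v ~ N(0, 1), sigma_u given by the gamma-function formula."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)
```

The resulting steps are mostly small but occasionally very large, which is exactly the heavy-tailed behavior that lets a search jump out of a local basin.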

The pseudocode of Lévy Flight Search is shown in Algorithm 2.

Algorithm 2: Pseudocode of Lévy Flight Search.
3.3. Opposition Based Learning

Opposition based learning was proposed by Tizhoosh [39] in 2005 and has since been successfully used in machine learning and evolutionary computing, notably in opposition-based differential evolution by Rahnamayan et al. [38]. For an optimization problem, if the candidate solutions are near the global optimum, it is easy to find the ideal solutions with fast convergence speed. But if the candidate solutions are far from the global optimum, it takes more computational effort to obtain the required solutions. The idea of OBL is to consider the candidate solutions and their opposite counterparts simultaneously in an optimization algorithm. A greedy selection is then applied between them according to their objective function values. With the help of OBL, the probability of finding the global optimum is increased.

In basic OBL, let x = (x_1, x_2, …, x_D) be a D-dimensional solution in the search space, with each component x_j ∈ [a_j, b_j]. Then the opposite point x̄ = (x̄_1, x̄_2, …, x̄_D) is defined by its coordinates

x̄_j = a_j + b_j − x_j,   j = 1, 2, …, D.

OBL can be used not only to initialize the population but also to improve population diversity during the evolutionary process. Once the opposite population is generated, it is combined with the original population for selection. Individuals whose fitness ranks in the top half of all candidate solutions are selected as the current population. The pseudocode of OBL is given in Algorithm 3.

Algorithm 3: Opposition based optimization.
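The opposition step described above can be sketched as follows (helper names are ours; `fitness_fn` stands for whatever objective is being minimized):

```python
import numpy as np

def obl_selection(pop, fitness_fn, lower, upper):
    """Opposition based step: build OP with x_bar_j = a_j + b_j - x_j,
    merge P and OP, and keep the best half by fitness (minimization)."""
    opposite = lower + upper - pop
    merged = np.vstack([pop, opposite])
    fit = np.apply_along_axis(fitness_fn, 1, merged)
    best = np.argsort(fit)[: pop.shape[0]]
    return merged[best]
```

Because selection is over the merged 2N candidates, the step is elitist: the surviving population is never worse than either P or OP alone.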
3.4. Proposed Modified Bat Algorithm

Based on the basic BA and the above considerations, we propose a modified method called OBMLBA. Local search with Lévy Flight improves the exploitation ability, while incorporating the OBL strategy enhances population diversity. Inspired by DLBA, the new location update equation takes advantage of the information of both the global best solution and randomly chosen solutions near the candidate solution, so the modified BA achieves a good balance between exploration and exploitation. The main steps of the algorithm are summarized in Algorithm 4.

Algorithm 4: Pseudocode of OBMLBA.
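Putting the pieces together, the following is a compact structural sketch of an OBMLBA-style loop on a sphere test function. The update shapes, parameter values, and bounds here are illustrative assumptions for exposition, not the authors' exact formulation or settings:

```python
import math
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def obmlba(fn, dim=10, n=20, t_max=200, freq=0.25, alpha=0.9, gamma=0.9,
           beta=1.5, seed=0):
    """Sketch: modified update with sinusoidal frequencies, Levy-flight
    local search around the best solution, and an OBL step each generation."""
    rng = np.random.default_rng(seed)
    lb, ub = -10.0, 10.0
    x = lb + rng.random((n, dim)) * (ub - lb)
    fit = np.array([fn(p) for p in x])
    A = rng.uniform(1.0, 2.0, n)                 # loudness
    r0 = rng.uniform(0.0, 1.0, n)                # initial pulse rates
    r = r0.copy()
    best = x[np.argmin(fit)].copy()
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    for t in range(1, t_max + 1):
        s = t / t_max
        f1 = 0.5 * (math.sin(2 * math.pi * freq * t) * s + 1.0)
        f2 = 0.5 * (math.sin(2 * math.pi * freq * t + math.pi) * s + 1.0)
        for i in range(n):
            k, m = rng.choice(n, 2, replace=False)
            cand = x[i] + (best - x[i]) * f1 + (x[k] - x[m]) * f2
            if rng.random() > r[i]:              # Levy-flight local search
                u = rng.normal(0, sigma_u, dim)
                v = rng.normal(0, 1, dim)
                cand = best + 0.01 * (u / np.abs(v) ** (1 / beta))
            cand = np.clip(cand, lb, ub)
            fc = fn(cand)
            if rng.random() < A[i] and fc < fit[i]:
                x[i], fit[i] = cand, fc
                A[i] *= alpha
                r[i] = r0[i] * (1 - math.exp(-gamma * t))
        # opposition based learning on the whole population
        opp = lb + ub - x
        fo = np.array([fn(p) for p in opp])
        merged = np.vstack([x, opp])
        mf = np.concatenate([fit, fo])
        idx = np.argsort(mf)[:n]
        x, fit = merged[idx], mf[idx]
        best = x[0].copy()
    return best, fit[0]
```

The OBL selection each generation keeps the best half of the merged population, so the best fitness is monotonically non-increasing over the run.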

4. Experimental Results and Discussion

In order to evaluate the performance of the proposed algorithm, we select 16 standard benchmark functions to investigate the efficiency of the proposed approach. These functions are presented in Table 1. The function set comprises unimodal and multimodal functions, which can be used to test convergence speed and resistance to premature convergence, respectively. The set includes continuous unimodal functions, a discontinuous step function, a noisy quartic function, and high-dimensional multimodal functions. These functions are categorized into low-, middle-, and high-dimension groups according to D = 15, 30, and 60, respectively.

Table 1: Benchmark problems.

Related parameters of the basic BA, DLBA, and OBMLBA are presented in Table 2. The maximum number of iterations is set to 500, 1000, and 2000 for D = 15, 30, and 60, respectively.

Table 2: Parameters setting.

Experiments have been carried out on the test functions with different dimensions. The computational results are summarized in Tables 3–5 in terms of the mean and standard deviation of the function error values. In addition, some convergence characteristic graphs are shown in Figure 1.

Table 3: Results of comparisons of BAs on 15-dimension benchmark functions.
Table 4: Results of comparisons of BAs on 30-dimension benchmark functions.
Table 5: Results of comparisons of BAs on 60-dimension benchmark functions.
Figure 1: Convergence performance of BAs on the test functions.

As seen from Tables 3–5, OBMLBA obtains the global optimum on most of the benchmark functions with D = 15, 30, and 60, and on the remaining functions the error of the solutions obtained by OBMLBA is small. The results indicate that OBMLBA can obtain solutions extremely close to the global optimal values on most of the functions.

Solutions obtained by the existing BAs are compared with the results obtained by MLBA and OBMLBA. As shown in Tables 3–5, OBMLBA performs significantly better than basic BA, LBA, and DLBA on most of the functions. All algorithms except basic BA and LBA perform well on several of the functions, and OBMLBA is comparable to DLBA and MLBA on the rest.

Compared with MLBA, OBMLBA provides better performance on several functions, while on others the results obtained by MLBA and OBMLBA are quite close to each other. Both MLBA and OBMLBA reach the global optimum on a number of the functions.

From the convergence characteristic graphs, it can be clearly observed that OBMLBA is the fastest among all the considered BAs. MLBA is slower than OBMLBA but faster than DLBA, LBA, and basic BA. OBMLBA exhibits the best accuracy and achieves the fastest convergence speed. The results indeed indicate the advantage of the modified position update equation and the OBL strategy.

On the whole, the algorithms proposed in this study, MLBA and OBMLBA, work better than the other BAs considered, and OBMLBA is more effective than MLBA.

In Table 6, OBMLBA is compared with some state-of-the-art algorithms, namely, ABC, PSO, OCABC, CLPSO, and OLPSO-G. Results of these algorithms are all derived directly from the corresponding literature [40]. The results show that OBMLBA outperforms the other methods on 6 of 7 functions, performing worse than ABC and OCABC only on the Rosenbrock function.

Table 6: Comparisons of the state-of-the-art algorithms on 30-dimension benchmark functions.

According to the analyses above, it can be clearly observed that OBMLBA provides outstanding performance with fast convergence speed and high convergence accuracy on most of the test functions.

5. Conclusions

Bat Algorithm, a recently proposed metaheuristic algorithm, has been successfully used to solve different types of optimization problems. However, it still suffers from an imbalance between its exploration and exploitation capabilities.

In this study, a new variant of Bat Algorithm called OBMLBA is presented. The exploration and exploitation abilities of BA are balanced and enhanced in the search process by combining the information of the best-so-far solution and the neighborhood solutions into the search equations. At the same time, the proposed equation uses a sinusoidal function to define the bat pulse frequency, allowing flexibility in changing its direction. Furthermore, Lévy Flight random walk is employed to escape from local optima, and the concept of opposition based learning is introduced to improve diversity and convergence capability. The proposed approach has been tested on 16 benchmark functions. Results and comparisons with the basic BA, DLBA, and other state-of-the-art algorithms show that OBMLBA outperforms the considered approaches on most of the functions. However, these experiments involve only small scale optimization problems; the performance of the proposed algorithm on large scale and real world optimization problems remains to be investigated. Future study may focus on the mathematical foundations and applications of the proposed algorithm.

Competing Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by the National Nature Science Foundation of China under Grant 61402534, by the Shandong Provincial Natural Science Foundation, China, under Grant ZR2014FQ002, and by the Fundamental Research Funds for the Central Universities under Grant 16CX02010A.


References

  1. S. Rao, Engineering Optimization: Theory and Practice, New Age International, 1996.
  2. X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, 2nd edition, 2010.
  3. K. S. Tang, K. F. Man, S. Kwong, and Q. He, “Genetic algorithms and their applications,” IEEE Signal Processing Magazine, vol. 13, no. 6, pp. 22–37, 1996.
  4. R. Storn and K. Price, “Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  5. J. Kennedy and R. C. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, Perth, Australia, December 1995.
  6. M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.
  7. D. Karaboga, “An idea based on honey bee swarm for numerical optimization,” Tech. Rep. TR06, Erciyes University, Engineering Faculty, Computer Engineering Department, 2005.
  8. S.-C. Chu, P.-W. Tsai, and J.-S. Pan, “Cat swarm optimization,” in PRICAI 2006: Trends in Artificial Intelligence, Q. Yang and G. Webb, Eds., vol. 4099 of Lecture Notes in Computer Science, pp. 854–858, Springer, Berlin, Germany, 2006.
  9. X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.
  10. X.-S. Yang, “A new metaheuristic bat-inspired algorithm,” in Nature Inspired Cooperative Strategies for Optimization, pp. 65–74, Springer, 2010.
  11. T. C. Bora, L. D. S. Coelho, and L. Lebensztajn, “Bat-inspired optimization approach for the brushless DC wheel motor problem,” IEEE Transactions on Magnetics, vol. 48, no. 2, pp. 947–950, 2012.
  12. M. R. Sathya and M. M. T. Ansari, “Load frequency control using Bat inspired algorithm based dual mode gain scheduling of PI controllers for interconnected power system,” International Journal of Electrical Power & Energy Systems, vol. 64, pp. 365–374, 2015.
  13. S. Mishra, K. Shaw, and D. Mishra, “A new meta-heuristic bat inspired classification approach for microarray data,” Procedia Technology, vol. 4, pp. 802–806, 2012.
  14. P. Musikapun and P. Pongcharoen, “Solving multi-stage multi-machine multi-product scheduling problem using bat algorithm,” in Proceedings of the 2nd International Conference on Management and Artificial Intelligence (IPEDR '12), vol. 35, pp. 98–102, 2012.
  15. O. Hasançebi, T. Teke, and O. Pekcan, “A bat-inspired algorithm for structural optimization,” Computers and Structures, vol. 128, pp. 77–90, 2013.
  16. J. Zhang and G. Wang, “Image matching using a bat algorithm with mutation,” Applied Mechanics and Materials, vol. 203, pp. 88–93, 2012.
  17. E. S. Ali, “Optimization of power system stabilizers using BAT search algorithm,” International Journal of Electrical Power and Energy Systems, vol. 61, pp. 683–690, 2014.
  18. J. Xie, Y. Zhou, and H. Chen, “A novel bat algorithm based on differential operator and Lévy flights trajectory,” Computational Intelligence and Neuroscience, vol. 2013, Article ID 453812, 13 pages, 2013.
  19. L. L. Li and Y. Q. Zhou, “A novel complex-valued bat algorithm,” Neural Computing and Applications, vol. 25, no. 6, pp. 1369–1381, 2014.
  20. A. H. Gandomi and X.-S. Yang, “Chaotic bat algorithm,” Journal of Computational Science, vol. 5, no. 2, pp. 224–232, 2014.
  21. B. Bahmani-Firouzi and R. Azizipanah-Abarghooee, “Optimal sizing of battery energy storage for micro-grid operation management using a new improved bat algorithm,” International Journal of Electrical Power and Energy Systems, vol. 56, pp. 42–54, 2014.
  22. N. S. Jaddi, S. Abdullah, and A. R. Hamdan, “Multi-population cooperative bat algorithm-based optimization of artificial neural network model,” Information Sciences, vol. 294, pp. 628–644, 2015.
  23. S. Yilmaz and E. U. Küçüksille, “A new modification approach on bat algorithm for solving optimization problems,” Applied Soft Computing, vol. 28, pp. 259–275, 2015.
  24. K. Khan and A. Sahai, “A comparison of BA, GA, PSO, BP and LM for training feed forward neural networks in e-learning context,” International Journal of Intelligent Systems and Applications, vol. 4, no. 7, pp. 23–29, 2012.
  25. G. Wang and L. Guo, “A novel hybrid bat algorithm with harmony search for global numerical optimization,” Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.
  26. X.-S. He, W.-J. Ding, and X.-S. Yang, “Bat algorithm based on simulated annealing and Gaussian perturbations,” Neural Computing and Applications, vol. 25, no. 2, pp. 459–468, 2014.
  27. J. Sadeghi, S. M. Mousavi, S. T. A. Niaki, and S. Sadeghi, “Optimizing a bi-objective inventory model of a three-echelon supply chain using a tuned hybrid bat algorithm,” Transportation Research Part E: Logistics and Transportation Review, vol. 70, no. 1, pp. 274–292, 2014.
  28. J.-H. Lin, C.-W. Chou, C.-H. Yang, and H.-L. Tsai, “A chaotic Lévy flight bat algorithm for parameter estimation in nonlinear dynamic biological systems,” Journal of Computing and Information Technology, vol. 2, no. 2, pp. 56–63, 2012.
  29. X.-B. Meng, X. Z. Gao, Y. Liu, and H. Zhang, “A novel bat algorithm with habitat selection and Doppler effect in echoes for optimization,” Expert Systems with Applications, vol. 42, no. 17-18, pp. 6350–6364, 2015.
  30. X. Cai, W. Li, L. Wang, Q. Kang, Q. Wu, and X. Huang, “Bat algorithm with Gaussian walk for directing orbits of chaotic systems,” International Journal of Computing Science and Mathematics, vol. 5, no. 2, pp. 198–208, 2014.
  31. S. Yilmaz and E. U. Kucuksille, “Improved Bat Algorithm (IBA) on continuous optimization problems,” Lecture Notes on Software Engineering, vol. 1, no. 3, pp. 279–283, 2013.
  32. A. M. Taha and A. Y. C. Tang, “Bat algorithm for rough set attribute reduction,” Journal of Theoretical and Applied Information Technology, vol. 51, no. 1, pp. 1–8, 2013.
  33. P.-W. Tsai, J.-S. Pan, B.-Y. Liao, M.-J. Tsai, and V. Istanda, “Bat algorithm inspired algorithm for solving numerical optimization problems,” Applied Mechanics and Materials, vol. 148-149, pp. 134–137, 2012.
  34. I. Pavlyukevich, “Lévy flights, non-local search and simulated annealing,” Journal of Computational Physics, vol. 226, no. 2, pp. 1830–1844, 2007.
  35. A. Draa, S. Bouzoubia, and I. Boukhalfa, “A sinusoidal differential evolution algorithm for numerical optimisation,” Applied Soft Computing, vol. 27, pp. 99–126, 2015.
  36. X.-S. Yang and S. Deb, “Eagle strategy using Lévy walk and firefly algorithms for stochastic optimization,” Studies in Computational Intelligence, vol. 284, pp. 101–111, 2010.
  37. R. N. Mantegna, “Fast, accurate algorithm for numerical simulation of Lévy stable stochastic processes,” Physical Review E, vol. 49, no. 5, pp. 4677–4683, 1994.
  38. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, “Opposition-based differential evolution for optimization of noisy problems,” in Proceedings of the 2006 IEEE Congress on Evolutionary Computation (CEC '06), pp. 1865–1872, Vancouver, Canada, July 2006.
  39. H. R. Tizhoosh, “Opposition-based learning: a new scheme for machine intelligence,” in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation, pp. 695–701, 2005.
  40. W.-F. Gao, S.-Y. Liu, and L.-L. Huang, “A novel artificial bee colony algorithm based on modified search equation and orthogonal learning,” IEEE Transactions on Cybernetics, vol. 43, no. 3, pp. 1011–1024, 2013.