Corrigendum

A corrigendum for this article has been published.

Mathematical Problems in Engineering
Volume 2015, Article ID 715635, 14 pages
http://dx.doi.org/10.1155/2015/715635
Research Article

Cuckoo Search Algorithm with Chaotic Maps

Lijin Wang and Yiwen Zhong

College of Computer and Information Science, Fujian Agriculture and Forestry University, Fuzhou 350002, China

Received 5 March 2015; Revised 25 June 2015; Accepted 28 June 2015

Academic Editor: Evangelos J. Sapountzakis

Copyright © 2015 Lijin Wang and Yiwen Zhong. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The cuckoo search algorithm is a novel nature-inspired optimization technique based on the obligate brood parasitic behavior of some cuckoo species. It iteratively employs a Lévy flights random walk with a scaling factor and a biased/selective random walk with a fraction probability. Unfortunately, these two parameters are used as constant values, making the solution quality and convergence speed sensitive to the problem. In this paper, we propose a variable value schema cuckoo search algorithm with chaotic maps, called CCS. In CCS, chaotic maps are utilized to define the scaling factor and the fraction probability, respectively, to enhance the solution quality and convergence speed. Extensive experiments with different chaotic maps demonstrate the improvement in efficiency and effectiveness.

1. Introduction

Cuckoo search algorithm (CS) is a novel nature-inspired approach based on the obligate brood parasitic behavior of some cuckoo species in combination with the Lévy flights behavior of some birds and fruit flies [1, 2]. Subsequent investigations [2, 3] have demonstrated that CS is a simple yet very promising population-based stochastic search technique that uses a Lévy flights random walk (LFRW) and a biased/selective random walk (BSRW). LFRW, with a scaling factor parameter, uses a mutation operator to generate new solutions based on the best solution obtained so far, while BSRW, with a fraction probability parameter, employs a complex crossover operator to search for new solutions. After each random walk, a greedy strategy is utilized to select the better of the current and newly generated solutions according to their fitness.

Due to its promising performance, CS has received much attention. Some studies have focused on improving LFRW [4–10] and BSRW [11–15]. Other attempts have been made to combine CS with other optimization techniques such as particle swarm optimization [16, 17], tabu search [18], differential evolution [19], ant colony optimization [20], and a cooperative coevolutionary framework [21, 22]. These studies have advanced the research on CS. Except for [9, 10], however, they define the scaling factor and the fraction probability as constant values, making CS sensitive to the optimization problems. This motivates us to study the scaling factor and the fraction probability using a variable value schema.

One mathematical approach to the variable value schema is chaos. Chaos theory is concerned with chaotic dynamical systems that are highly sensitive to initial conditions [23]. Recently, chaos theory has been integrated into the genetic algorithm [24], differential evolution [25], the firefly algorithm [26], krill herd [27, 28], and biogeography-based optimization [23, 29], and these works have shown the effectiveness and efficiency of chaos theory. In light of the above investigations, we propose the chaotic cuckoo search algorithm, called CCS, which utilizes chaotic maps to define the scaling factor and the fraction probability. Comprehensive experiments are carried out on 20 benchmark functions, and the results show that chaotic maps can improve the solution quality and convergence speed of CS effectively and efficiently.

The main contribution of this paper is to define variable values for the scaling factor and the fraction probability using chaotic maps. This leads to the following major advantages of our approach: (i) compared with the constant value way of using the scaling factor and the fraction probability, a variable value schema for the two parameters is generally more suitable for the optimization problems, resulting in better performance; (ii) due to the simplicity of chaotic maps, our approach does not increase the overall complexity of CS; (iii) our approach does not alter the structure of CS, so it remains very simple.

The remainder of this paper is organized as follows. Section 2 describes the standard cuckoo search algorithm. Section 3 presents the cuckoo search algorithm with chaos. Section 4 reports the experimental results. Section 5 concludes the paper.

2. Cuckoo Search Algorithm

CS, developed recently by Yang and Deb [1, 2], is a simple yet very promising population-based stochastic search technique. In general, when CS is used to solve an objective function $f(\mathbf{x})$ with the solution space $[x_{\min,j}, x_{\max,j}]$, $j = 1, \dots, D$, a nest represents a candidate solution $\mathbf{x}_i = (x_{i,1}, \dots, x_{i,D})$.

In the initialization phase, CS initializes $NP$ solutions that are randomly sampled from the solution space by
$$x_{i,j} = x_{\min,j} + \mathrm{rand}(0,1)\,(x_{\max,j} - x_{\min,j}), \quad i = 1, \dots, NP,\; j = 1, \dots, D, \tag{1}$$
where $\mathrm{rand}(0,1)$ represents a uniformly distributed random variable on the range $(0, 1)$ and $NP$ is the population size.
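The initialization phase can be sketched as follows (a minimal illustrative sketch; the function name, array layout, and bound values are mine, not from the paper):

```python
import numpy as np

def init_population(n_pop, dim, x_min, x_max, rng=None):
    """Sample n_pop candidate solutions (nests) uniformly at random
    from the box [x_min, x_max]^dim, as in the CS initialization phase."""
    rng = np.random.default_rng() if rng is None else rng
    return x_min + rng.random((n_pop, dim)) * (x_max - x_min)

# e.g., 25 nests in a 10-dimensional search space [-100, 100]^10
nests = init_population(n_pop=25, dim=10, x_min=-100.0, x_max=100.0)
```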

After initialization, CS enters an iterative phase in which two random walks, the Lévy flights random walk and the biased/selective random walk, are employed to search for new solutions. After each random walk, CS selects the better of the newly generated and current solutions according to their fitness using the greedy strategy. At the end of each iteration, the best solution is updated.

2.1. Lévy Flights Random Walk

Broadly speaking, LFRW is a random walk whose step-size is drawn from a Lévy distribution. At generation $t$ ($t \ge 1$), LFRW can be formulated as follows:
$$\mathbf{x}_i^{t+1} = \mathbf{x}_i^t + \alpha \oplus \text{Lévy}(\beta), \tag{2}$$
where $\alpha$ is a step-size related to the scales of the problem. In CS, LFRW is employed to search for new solutions around the best solution obtained so far. Therefore, the step-size can be obtained by the following equation [2]:
$$\alpha = \alpha_0\,(\mathbf{x}_i^t - \mathbf{x}_{\text{best}}), \tag{3}$$
where $\alpha_0$ is a scaling factor (generally, $\alpha_0 = 0.01$) and $\mathbf{x}_{\text{best}}$ represents the best solution obtained so far.

The product $\oplus$ means entry-wise multiplication. Lévy$(\beta)$ is a random number drawn from a Lévy distribution for large steps:
$$\text{Lévy}(\beta) \sim u = t^{-1-\beta}, \quad 0 < \beta \le 2. \tag{4}$$

In implementation, Lévy$(\beta)$ can be calculated as follows [2]:
$$\text{Lévy}(\beta) \sim \phi \times \frac{\mu}{|\nu|^{1/\beta}}, \qquad \phi = \left(\frac{\Gamma(1+\beta)\sin(\pi\beta/2)}{\Gamma((1+\beta)/2)\,\beta\,2^{(\beta-1)/2}}\right)^{1/\beta}, \tag{5}$$
where $\beta$ is a constant set to 1.5 in the standard software implementation of CS [2], $\mu$ and $\nu$ are random numbers drawn from a normal distribution with mean 0 and standard deviation 1, and $\Gamma$ is the gamma function.

Obviously, (2) can be reformulated as
$$\mathbf{x}_i^{t+1} = \mathbf{x}_i^t + \alpha_0\,\phi \times \frac{\mu}{|\nu|^{1/\beta}}\,(\mathbf{x}_i^t - \mathbf{x}_{\text{best}}). \tag{6}$$
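The LFRW step can be sketched in code as follows (an illustrative sketch; the function names are mine, with β = 1.5 and a scaling factor of 0.01 as in the standard CS implementation [2]):

```python
import numpy as np
from math import gamma, sin, pi

def levy(beta, shape, rng):
    """Draw Levy(beta)-distributed numbers via Mantegna's algorithm."""
    phi = (gamma(1 + beta) * sin(pi * beta / 2) /
           (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = rng.normal(0.0, phi, shape)   # numerator ~ N(0, phi^2)
    nu = rng.normal(0.0, 1.0, shape)   # denominator ~ N(0, 1)
    return mu / np.abs(nu) ** (1 / beta)

def lfrw(nests, best, alpha0=0.01, beta=1.5, rng=None):
    """Levy flights random walk around the best solution obtained so far."""
    rng = np.random.default_rng() if rng is None else rng
    return nests + alpha0 * levy(beta, nests.shape, rng) * (nests - best)

rng = np.random.default_rng(0)
nests = rng.random((25, 10))
new = lfrw(nests, nests[0], rng=rng)  # here nest 0 plays the role of the best
```

Note that the step vanishes for the best nest itself, since its difference to the best solution is zero.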

2.2. Biased/Selective Random Walk

BSRW is used to discover new solutions far enough away from the current best solution by far field randomization [1]. First, a trial solution is built by mutating the current solution, used as the base vector, with two randomly selected solutions as perturbed vectors. Second, a new solution is generated by a crossover operator from the current and the trial solutions. BSRW can be formulated as follows:
$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^t + r\,(x_{m,j}^t - x_{n,j}^t) & \text{if } rand > p_a, \\ x_{i,j}^t & \text{otherwise}, \end{cases} \tag{7}$$
where the random indexes $m$ and $n$ denote the $m$th and $n$th solutions in the population, respectively, $j$ denotes the $j$th dimension of the solution, $r$ and $rand$ are random numbers on the range $(0, 1)$, and $p_a$ is a fraction probability.
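The BSRW step can be sketched as a vectorized operation (an illustrative sketch; the names are mine):

```python
import numpy as np

def bsrw(nests, pa=0.25, rng=None):
    """Biased/selective random walk: each dimension of each nest is
    perturbed with the difference of two randomly selected nests when a
    uniform draw exceeds the fraction probability pa, else kept unchanged."""
    rng = np.random.default_rng() if rng is None else rng
    n_pop, dim = nests.shape
    xm = nests[rng.permutation(n_pop)]      # randomly selected solutions x_m
    xn = nests[rng.permutation(n_pop)]      # randomly selected solutions x_n
    change = rng.random((n_pop, dim)) > pa  # per-dimension crossover mask
    r = rng.random((n_pop, dim))            # step scale on (0, 1)
    return nests + change * r * (xm - xn)

rng = np.random.default_rng(0)
nests = rng.random((25, 10))
new = bsrw(nests, pa=0.25, rng=rng)
```

With pa = 1 no dimension is ever changed, which makes the role of the fraction probability easy to check.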

3. Chaotic Cuckoo Search Algorithm

In this section, we first present different chaotic maps. Then, we apply them to define the scaling factor and the fraction probability. Finally, we propose the framework of the cuckoo search algorithm with chaotic maps, called CCS.

3.1. Chaotic Maps

Chaos theory is a field of study in mathematics, with applications in several disciplines including physics, engineering, economics, biology, and philosophy. It studies the behavior of dynamical systems that are highly sensitive to initial conditions, an effect popularly referred to as the butterfly effect. One way to make quantitative statements about the behavior of chaotic systems is through chaotic maps such as the Circle map [30], Gauss map [30], Logistic map [31], Piecewise map [32], Sine map [33], Singer map [34], Sinusoidal map [31], and Tent map [35], shown in Table 1. Additionally, these chaotic maps, with the initial point at 0.7, are visualized in Figure 1. Other chaotic maps can be found in [26, 28].

Table 1: Chaotic maps.
Figure 1: Visualization of different chaotic maps.
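For illustration, a few one-dimensional chaotic maps of the kind listed in Table 1 can be written as simple iterations (a sketch; the parameter values follow common choices in the chaos-based metaheuristics literature and are not necessarily the exact forms used in Table 1):

```python
import math

def logistic(x, a=4.0):
    """Logistic map: x' = a*x*(1 - x); fully chaotic on (0, 1) for a = 4."""
    return a * x * (1.0 - x)

def sinusoidal(x, a=2.3):
    """Sinusoidal map: x' = a*x^2*sin(pi*x)."""
    return a * x * x * math.sin(math.pi * x)

def tent(x):
    """A tent-type map: x' = x/0.7 if x < 0.7, else (10/3)*x*(1 - x)."""
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * x * (1.0 - x)

# a short chaotic sequence from the initial point 0.7 used in the paper
seq, x = [], 0.7
for _ in range(10):
    x = logistic(x)
    seq.append(x)
```

Iterating any of these maps from a fixed initial point yields a deterministic yet highly irregular sequence on (0, 1), which is exactly what CCS feeds into the two parameters.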
3.2. Chaotic Maps for the Scaling Factor

As seen from (6), a large scaling factor does not fit problems with a narrow search space because it may make the Lévy flights random walk too aggressive and jump outside of the search domain, wasting function evaluations. Conversely, for a wide search space, a small scaling factor contributes little to the efficiency of the search. Obviously, a constant-value scaling factor is not optimal across problems. Therefore, we employ chaotic maps to provide chaotic behavior for cuckoo search, use them to define the scaling factor, and rewrite (6) as follows:
$$\mathbf{x}_i^{t+1} = \mathbf{x}_i^t + c_t\,\phi \times \frac{\mu}{|\nu|^{1/\beta}}\,(\mathbf{x}_i^t - \mathbf{x}_{\text{best}}), \tag{8}$$
where $c_t$ is a chaotic sequence.

3.3. Chaotic Maps for the Fraction Probability

In (7), the fraction probability $p_a$ is used to control how many dimensions, in expectation, are changed in a solution. For low values of $p_a$, a large number of dimensions of a solution are changed in each generation, which favors the exploration of CS. On the other hand, high values of $p_a$ cause most dimensions of the new solution to be inherited from the current solution, which benefits the exploitation of CS. Apparently, a variable value can dynamically balance exploration and exploitation. Thus, we utilize chaotic maps to define the fraction probability and rewrite (7) as follows:
$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^t + r\,(x_{m,j}^t - x_{n,j}^t) & \text{if } rand > c_t, \\ x_{i,j}^t & \text{otherwise}, \end{cases} \tag{9}$$
where $c_t$ is a chaotic sequence.

3.4. Framework of CCS

According to the above descriptions, we give the framework of CCS in Algorithm 1.

Algorithm 1: CCS.
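The overall CCS framework can be sketched roughly as follows (an illustrative sketch on a sphere test function; the helper names are mine, and for brevity a single logistic map drives both parameters, whereas the paper's default pairs the Circle map with the Gauss map):

```python
import numpy as np
from math import gamma, sin, pi

def sphere(x):
    """Illustrative unimodal test function with global optimum 0 at the origin."""
    return float(np.sum(x * x))

def logistic(x, a=4.0):
    """Logistic map used here to generate both chaotic sequences (illustrative)."""
    return a * x * (1.0 - x)

def levy(beta, shape, rng):
    """Mantegna's algorithm for Levy(beta)-distributed numbers."""
    phi = (gamma(1 + beta) * sin(pi * beta / 2) /
           (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0.0, phi, shape) / np.abs(rng.normal(0.0, 1.0, shape)) ** (1 / beta)

def ccs(f, dim=10, n_pop=20, lo=-100.0, hi=100.0, iters=200, beta=1.5, seed=1):
    rng = np.random.default_rng(seed)
    nests = lo + rng.random((n_pop, dim)) * (hi - lo)   # uniform initialization
    fit = np.array([f(x) for x in nests])
    c_alpha = c_pa = 0.7                                # chaotic states, initial point 0.7
    for _ in range(iters):
        # LFRW with chaotic scaling factor
        c_alpha = logistic(c_alpha)
        best = nests[fit.argmin()]
        trial = np.clip(nests + c_alpha * levy(beta, nests.shape, rng) * (nests - best), lo, hi)
        t_fit = np.array([f(x) for x in trial])
        win = t_fit < fit                               # greedy selection
        nests[win], fit[win] = trial[win], t_fit[win]
        # BSRW with chaotic fraction probability
        c_pa = logistic(c_pa)
        mask = rng.random(nests.shape) > c_pa
        diff = nests[rng.permutation(n_pop)] - nests[rng.permutation(n_pop)]
        trial = np.clip(nests + mask * rng.random(nests.shape) * diff, lo, hi)
        t_fit = np.array([f(x) for x in trial])
        win = t_fit < fit
        nests[win], fit[win] = trial[win], t_fit[win]
    return float(fit.min())

best_error = ccs(sphere)
```

Because the greedy selection never accepts a worse nest, the best fitness is monotonically nonincreasing over the iterations.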

4. Simulation and Results

In this section, a suite of 20 benchmark functions used in [36] is utilized to verify the performance of the proposed approach. These 20 benchmark functions can be divided into three groups: (i) unimodal functions; (ii) multimodal functions; and (iii) rotated and/or shifted functions. A more detailed description of them can be found in [36, 37]. Additionally, we use Error, Evaluation, and convergence graphs as performance evaluation criteria.

Error is the function error, defined as $f(\mathbf{x}_{\text{best}}) - f(\mathbf{x}^*)$, where $\mathbf{x}^*$ is the global optimum of the function and $\mathbf{x}_{\text{best}}$ is the best solution obtained by the algorithm in a given run. Error is recorded over the different runs, and the average and the standard deviation of Error are calculated and reported in the tables. Moreover, the Wilcoxon signed-rank test at the 0.05 significance level is used to show significance between two algorithms. The "−" symbol shows that the null hypothesis is rejected and the first algorithm outperforms the second one. The "+" symbol means the null hypothesis is rejected and the first algorithm is inferior to the second one. The "≈" symbol reveals that the null hypothesis is accepted and the first algorithm ties the second one. Additionally, the total count of each symbol is summarized at the bottom of the tables.
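The signed-rank comparison above can be illustrated with a minimal pure-Python computation of the rank sums for positive and negative differences (a simplified textbook sketch that ignores the significance lookup; the function name is mine):

```python
def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank statistics (R+, R-) for paired samples,
    discarding zero differences and averaging ranks over ties."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied absolute differences
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1      # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    r_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    r_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return r_plus, r_minus
```

For example, for the paired values [1, 2, 3] and [3, 1, 2] the two tied unit differences share rank 1.5 and the difference of 2 gets rank 3, so both rank sums equal 3.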

Evaluation is the number of function evaluations needed to reach the accuracy level $\varepsilon$ or $10^{-2}$ suggested in [36] within the maximum number of fitness evaluations, which is set in proportion to the dimension $D$ of the function. Furthermore, we also recorded Evaluation over the different runs and calculated its average and standard deviation, along with the number of successful runs in which an algorithm could reach the accuracy level $\varepsilon$ within the maximum number of fitness evaluations.

Convergence graphs are the convergence curves of each algorithm on the problems within the maximum number of fitness evaluations. These graphs show the average Error over all runs in the respective experiments.

4.1. Sensitivities to Chaotic Maps

It can be observed from Figure 1 that different chaotic maps show different chaotic behaviors. In this section, therefore, we analyze how different chaotic maps affect the performance of CCS. To verify the sensitivity of the performance to different chaotic maps, we use a simple combination in which different chaotic maps are employed to define the scaling factor, while the fraction probability is defined using the Gauss map, in keeping with the low constant value used in CS. In this case, we have cCCS with the Circle map, gCCS with the Gauss map, lgCCS with the Logistic map, pCCS with the Piecewise map, seCCS with the Sine map, srCCS with the Singer map, slCCS with the Sinusoidal map, and tCCS with the Tent map. Table 2 lists the average Error of CCS with the different chaotic maps, and Table 3 gives the results of the Friedman test, conducted as in [38].

Table 2: Average Error values obtained by CCS with different chaotic maps for 20 benchmark functions.
Table 3: Average ranking of the eight algorithms by the Friedman test for 20 functions.

As observed from Table 2, for most functions, CCS with different chaotic maps shows similar average Error. However, Table 3 shows that cCCS ranks best, followed by srCCS, pCCS, tCCS, seCCS, slCCS, lgCCS, and gCCS. This suggests that the performance of CCS on part of the functions is slightly sensitive to the chaotic maps, and that the combination of the Circle map and the Gauss map is the better selection for the cuckoo search algorithm. It is worth noting that many more combinations of chaotic maps could be used; thus, in future work, we will comprehensively test different combinations in CCS.

4.2. Comparison with CS via Random Value

Note that a random value can also be regarded as a variable value schema. To show the advantage of CS with chaotic maps, CS with random values, called rCS, is tested on the 20 benchmark functions. In rCS, a random strategy is used to define the scaling factor and the fraction probability, whose values are sampled from a uniform distribution on the range (0, 1). Table 4 lists the statistical Error, and Table 5 reports the multiple-problem statistical analysis between CCS and rCS for all functions based on the Wilcoxon test, conducted as in [38, 39].

Table 4: Error obtained by rCS and CCS.
Table 5: Results of the multiple-problem Wilcoxon test for CCS and rCS for 20 functions.

We can find from Table 4 that rCS and CCS each show advantages on different functions. The two algorithms have the same performance on a handful of functions. Moreover, rCS performs better on some functions, while CCS gains better performance on others, especially on the rotated and/or shifted functions. According to the Wilcoxon test results, CCS is superior to rCS on 8 out of 20 functions, equal to rCS on 10 out of 20, and inferior to rCS on 2 out of 20.

Additionally, it can be seen from Table 5 that CCS obtains a higher $R^+$ value than $R^-$ value. The above suggests that chaotic sequences make a greater and more stable contribution to the performance of CS than randomly sampled sequences. This is because chaotic sequences are generated deterministically from the dynamical system, whereas randomly sampled sequences are nondeterministic and differ even when the initial state is the same.

4.3. Effect of Chaotic Maps on CS

To show how chaotic maps can improve the performance of CS, we carry out experiments on the 20 benchmark functions at three different dimensions, each with a correspondingly set population size, since some of the benchmark functions are defined only up to a limited dimension [37]. CS and CCS are run 25 times for each function. The fraction probability of CS is 0.25, while the Circle map and the Gauss map, whose initial values are set to 0.7 as in [23, 26], are used to define the scaling factor and the fraction probability, respectively. Table 6 shows the Error of the two algorithms at the different dimensions.

Table 6: Error obtained by CS and CCS for 20 functions at the three tested dimensions.

Table 6 clearly shows that chaotic maps can, overall, significantly improve the performance of CS according to the average Error at the three tested dimensions.

In the first case, as observed from Table 6, CCS gains solutions with higher accuracy on all functions except one. In terms of the Wilcoxon signed-rank test, CCS performs better on 19 out of 20 functions and shows performance equivalent to CS on the remaining one.

In the second case, for unimodal functions, CCS outperforms CS significantly. For multimodal functions, CCS apparently achieves more accurate solutions than CS does, and it obtains the global optimal solution on one of them. For rotated and/or shifted functions, CCS is not significantly inferior to CS on one function and is equivalent to CS on another, but it performs better than CS on the other 8 out of 10 functions, even achieving the global optimum on one. In all, in terms of the Wilcoxon test, compared with CS, CCS shows better and equivalent performance on 17 and 3 out of 20 benchmark functions, respectively.

In the third case, the solution accuracy of both algorithms is reduced on most functions. However, compared with CS, CCS still achieves more accurate solutions on all functions except one, and it reaches the global optimal solution on one function. According to the statistical results, CCS outperforms CS on 18 out of 20 benchmark functions.

Furthermore, to show the convergence speed of CCS in reaching the accuracy level $\varepsilon$, Table 7 lists the Evaluation performance of the two algorithms. Table 7 clearly shows that CCS exhibits overall more stable convergence to the accuracy level. For example, on several functions CS and CCS both reach the accuracy level steadily, but CCS converges faster than CS does; on others, CCS has the more stable convergence; and on one function, although CS converges steadily to the accuracy level, CCS has the faster convergence speed.

Table 7: Average Evaluation obtained by CS and CCS.

Additionally, convergence graphs of CS and CCS for some functions are plotted in Figure 2. It can be observed that CCS apparently converges faster than CS in terms of the convergence curves.

Figure 2: Convergence graphs of CS and CCS.

According to Error, Evaluation, and the convergence graphs, CCS overall significantly improves the solution quality and convergence speed of CS. This is because chaotic maps provide varied search step information and more probabilistic learning from other solutions, both of which benefit the search ability of CS. Additionally, the scalability analysis suggests that the advantage of CCS over CS remains stable overall as the dimensionality of the problems increases.

4.4. Sensitivities to Initial Value of Chaotic Maps

It is worth pointing out that chaotic sequences are highly sensitive to the initial condition. To show how the initial value affects the performance of CCS, we perform experiments on the chaotic maps with different initial values. The results are listed in Table 8, where the initial values are 0.25 and 0.5, giving CCS25 and CCS5, respectively. The other parameters are kept unchanged.

Table 8: Error obtained by CCS25, CCS5, and CCS for 20 functions.

As seen from Table 8, the performance of CCS is only weakly influenced by the initial value of the chaotic maps in terms of Error. CCS25 obtains the highest accuracy on three functions, while CCS5 obtains the most accurate solutions on five others. However, CCS achieves the most accurate solutions on most functions. According to the statistical results, CCS shows better performance than CCS25 and CCS5 on 6 out of 20 functions and ties CCS25 and CCS5 on 11 and 12 out of 20 functions, respectively. This suggests that the default initial value of 0.7 is the better selection.

4.5. Comparison with Other Improved CS Algorithms

To show the competitiveness of CCS against other improved CS algorithms, we compare it with three improved versions, called ICS [9], CSPSO [16], and OLCS [40]. Note that ICS defines the scaling factor and the fraction probability in a variable value schema based on maximum and minimum parameters. The results are reported in Tables 9, 10, and 11, respectively.

Table 9: Error obtained by ICS, CSPSO, OLCS, and CCS for 20 functions.
Table 10: Results of the multiple-problem Wilcoxon test for ICS, CSPSO, OLCS, and CCS for 20 functions.
Table 11: Average ranking of the four algorithms by the Friedman test for 20 functions.

As observed from Table 9, each algorithm shows its advantage on part of the functions. For example, ICS performs better on several functions, CSPSO gains the most accurate solution on one function, and OLCS obtains solutions with higher accuracy on several functions and reaches the global optimum on two of them. CCS achieves the global optimum on two functions and shows its advantage on the rotated or shifted functions. Nevertheless, according to the Wilcoxon test, CCS outperforms ICS, CSPSO, and OLCS on 9, 16, and 13 out of 20 functions, respectively. Moreover, Table 10 shows that CCS yields higher $R^+$ values than $R^-$ values in all cases. In addition, it can be clearly seen from Table 11 that CCS gains the first average ranking, followed by ICS, OLCS, and CSPSO.

4.6. Discussion

CCS shows its promising performance by using two chaotic maps simultaneously to define the scaling factor and the fraction probability. In this case, the two chaotic maps contribute cooperatively to the performance of CCS. In this section, therefore, we discuss the contribution of each chaotic map to the performance of CCS. To analyze each contribution, we consider two derived algorithms: CCS1 and CCS2. The former uses a chaotic map to define the scaling factor while keeping the original BSRW, while the latter utilizes a chaotic map to define the fraction probability while keeping the original LFRW. CCS1 and CCS2 are run on the 20 benchmark functions, and the results are listed in Table 12.

Table 12: Error obtained by CS, CCS1, CCS2, and CCS for 20 functions.

It can be observed from Table 12 that a single chaotic map makes a different contribution to the performance of CCS on different functions. Compared with CS, CCS1 alone brings solutions with higher accuracy on several functions, while CCS2 alone achieves more accurate solutions on others. Owing to these more accurate solutions, CCS yields better performance. Moreover, Table 12 suggests that CCS1 and CCS2 both achieve better performance individually and contribute cooperatively to the performance of CCS. For example, on most of the rotated and/or shifted functions, CCS1 and CCS2 obtain slightly more accurate solutions, but CCS performs better still due to their cooperative contribution.

5. Conclusion and Future Work

In CS, the scaling factor and the fraction probability are used as constant values, making the solution quality and convergence speed sensitive to the problem. In this paper, we employed chaotic maps to define the scaling factor and the fraction probability in a variable value schema and proposed the chaotic cuckoo search algorithm, called CCS. Comprehensive experiments were carried out on 20 benchmark functions to test the performance of CCS. The results show that chaotic maps can improve the performance of CS effectively and efficiently. The scalability study reveals that the advantage of CCS over CS remains stable overall as the dimensionality of the problems increases. The comparison with another study on the scaling factor and the fraction probability verifies that chaotic maps are a better choice for defining the variable value schema.

There are several interesting directions for future work. First, it is interesting to test the different combinations of chaotic maps to find the optimal one. Second, we plan to integrate chaotic maps into improved CS algorithms to further verify their efficiency and effectiveness. Last but not least, we also plan to apply CCS to some real-world optimization problems for further examinations.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors are very grateful to the editor and the anonymous reviewers for their constructive comments and suggestions to this paper. This work was supported by the Natural Science Foundation of Fujian Province of China under Grant no. 2013J01216.

References

  1. X. S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, 2009.
  2. X.-S. Yang and S. Deb, “Engineering optimisation by cuckoo search,” International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
  3. P. Civicioglu and E. Besdok, “A conceptual comparison of the Cuckoo-search, particle swarm optimization, differential evolution and artificial bee colony algorithms,” Artificial Intelligence Review, vol. 39, no. 4, pp. 315–346, 2013.
  4. S. Walton, O. Hassan, K. Morgan, and M. R. Brown, “Modified cuckoo search: a new gradient free optimisation algorithm,” Chaos, Solitons & Fractals, vol. 44, no. 9, pp. 710–718, 2011.
  5. P. Nasa-Ngium, K. Sunat, and S. Chiewchanwattana, “Enhancing modified cuckoo search by using Mantegna Lévy flights and chaotic sequences,” in Proceedings of the 10th International Joint Conference on Computer Science and Software Engineering (JCSSE '13), pp. 53–57, IEEE, Maha Sarakham, Thailand, May 2013.
  6. S. Das, P. Dasgupta, and B. K. Panigrahi, “Inter-species Cuckoo search via different Lévy flights,” in Proceedings of the 4th International Conference on Swarm, Evolutionary, and Memetic Computing (SEMCCO '13), B. K. Panigrahi, P. N. Suganthan, S. Das, and S. S. Dash, Eds., vol. 8297 of Lecture Notes in Computer Science, part I, pp. 515–526, Springer International Publishing, Cham, Switzerland, 2013.
  7. X. M. Ding, Z. K. Xu, N. J. Cheung, and X. H. Liu, “Parameter estimation of Takagi-Sugeno fuzzy system using heterogeneous cuckoo search algorithm,” Neurocomputing, vol. 151, no. 3, pp. 1332–1342, 2015.
  8. Z. X. Zhang and Y. J. Chen, “An improved cuckoo search algorithm with adaptive method,” in Proceedings of the 7th International Joint Conference on Computational Sciences and Optimization (CSO '14), pp. 204–207, Beijing, China, July 2014.
  9. E. Valian, S. Mohanna, and S. Tavakoli, “Improved cuckoo search algorithm for global optimization,” International Journal of Communications and Information Technology, vol. 1, no. 1, pp. 31–44, 2011.
  10. L. J. Wang, Y. L. Yin, and Y. W. Zhong, “Cuckoo search with varied scaling factor,” Frontiers of Computer Science, 2015.
  11. M. Tuba, M. Subotic, and N. Stanarevic, “Performance of a modified cuckoo search algorithm for unconstrained optimization problems,” WSEAS Transactions on Systems, vol. 2, no. 1, pp. 263–268, 2012.
  12. L. J. Wang, Y. L. Yin, and Y. W. Zhong, “Cuckoo search algorithm with dimension by dimension improvement,” Journal of Software, vol. 24, no. 11, pp. 2687–2698, 2013.
  13. S. K. Fateen and A. Bonilla-Petriciolet, “Gradient-based cuckoo search for global optimization,” Mathematical Problems in Engineering, vol. 2014, Article ID 493740, 12 pages, 2014.
  14. S.-E. K. Fateen and A. Bonilla-Petriciolet, “A note on effective phase stability calculations using gradient-based cuckoo search algorithm,” Fluid Phase Equilibria, vol. 375, pp. 360–366, 2014.
  15. Y. Q. Zhou and H. Q. Zheng, “A novel complex valued cuckoo search algorithm,” The Scientific World Journal, vol. 2013, Article ID 597803, 6 pages, 2013.
  16. F. Wang, X. S. He, L. G. Luo, and Y. Wang, “Hybrid optimization algorithm of PSO and cuckoo search,” in Proceedings of the 2nd International Conference on Artificial Intelligence, Management Science and Electric Commerce (AIMSEC '11), pp. 1172–1175, Zhengzhou, China, 2011.
  17. A. Ghodrati and S. Lotfi, “A hybrid CS/PSO algorithm for global optimization,” in Intelligent Information and Database Systems: 4th Asian Conference, ACIIDS 2012, Kaohsiung, Taiwan, March 19-21, 2012, Proceedings, Part III, vol. 7198 of Lecture Notes in Computer Science, pp. 89–98, Springer, Berlin, Germany, 2012.
  18. P. R. Srivastava, R. Khandelwal, S. Khandelwal, S. Kumar, and S. S. Ranganatha, “Automated test data generation using cuckoo search and tabu search (CSTS) algorithm,” Journal of Intelligent Systems, vol. 21, no. 2, pp. 195–224, 2012.
  19. G. G. Wang, L. H. Guo, H. Duan, and L. Liu, “A hybrid meta-heuristic DE/CS algorithm for UCAV path planning,” Journal of Information & Computational Science, vol. 9, no. 16, pp. 4811–4818, 2012.
  20. R. G. Babukartik and P. Dhavachelvan, “Hybrid algorithm using the advantage of ACO and Cuckoo Search for Job Scheduling,” International Journal of Information Technology Convergence and Services, vol. 2, no. 4, pp. 25–34, 2012.
  21. X.-X. Hu and Y.-L. Yin, “Cooperative co-evolutionary cuckoo search algorithm for continuous function optimization problems,” Pattern Recognition and Artificial Intelligence, vol. 26, no. 11, pp. 1041–1049, 2013.
  22. H. Zheng and Y. Zhou, “A cooperative coevolutionary cuckoo search algorithm for optimization problem,” Journal of Applied Mathematics, vol. 2013, Article ID 912056, 9 pages, 2013.
  23. S. Saremi, S. Mirjalili, and A. Lewis, “Biogeography-based optimisation with chaos,” Neural Computing & Applications, vol. 25, no. 5, pp. 1077–1097, 2014.
  24. R. Ebrahimzadeh and M. Jampour, “Chaotic genetic algorithm based on Lorenz chaotic system for optimization problems,” International Journal of Intelligent Systems and Applications, vol. 5, no. 5, pp. 19–24, 2013.
  25. Z. Y. Guo, B. Chen, M. Ye, and B. Cao, “Self-adaptive chaos differential evolution,” in Advances in Natural Computation: Second International Conference, ICNC 2006, Xi'an, China, September 24-28, 2006. Proceedings, Part I, L. Jiao, L. Wang, X.-B. Gao, J. Liu, and F. Wu, Eds., vol. 4221 of Lecture Notes in Computer Science, pp. 972–975, Springer, Berlin, Germany, 2006.
  26. A. H. Gandomi, X.-S. Yang, S. Talatahari, and A. H. Alavi, “Firefly algorithm with chaos,” Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.
  27. S. Saremi, S. M. Mirjalili, and S. Mirjalili, “Chaotic krill herd optimization algorithm,” Procedia Technology, vol. 12, pp. 180–185, 2014.
  28. G.-G. Wang, L. H. Guo, A. H. Gandomi, G.-S. Hao, and H. Wang, “Chaotic krill herd algorithm,” Information Sciences, vol. 274, pp. 17–34, 2014.
  29. S. Saremi and S. Mirjalili, “Integrating chaos to biogeography-based optimization algorithm,” International Journal of Computer and Communication Engineering, vol. 2, no. 6, pp. 655–658, 2013.
  30. R. C. Hilborn, Chaos and Nonlinear Dynamics: An Introduction for Scientists and Engineers, Oxford University Press, New York, NY, USA, 2nd edition, 2004.
  31. Y. Li, S. Deng, and D. Xiao, “A novel hash algorithm construction based on chaotic neural network,” Neural Computing and Applications, vol. 20, no. 1, pp. 133–141, 2011.
  32. A. G. Tomida, “Matlab toolbox and GUI for analyzing one-dimensional chaotic maps,” in Proceedings of the International Conference on Computational Sciences and Its Applications (ICCSA '08), pp. 321–330, IEEE Press, Perugia, Italy, July 2008.
  33. R. L. Devaney, An Introduction to Chaotic Dynamical Systems, Addison-Wesley, Redwood City, Calif, USA, 1986.
  34. H.-O. Peitgen, H. Jurgens, and D. Saupe, Chaos and Fractals, Springer, Berlin, Germany, 1992.
  35. E. Ott, Chaos in Dynamical Systems, Cambridge University Press, Cambridge, UK, 2002.
  36. N. Noman and H. Iba, “Accelerating differential evolution using an adaptive local search,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 107–125, 2008.
  37. P. N. Suganthan, N. Hansen, J. J. Liang et al., “Problem definitions and evaluation criteria for the CEC2005 special session on real-parameter optimization,” Tech. Rep., Nanyang Technological University, Singapore, 2005.
  38. W. Y. Gong and Z. H. Cai, “Differential evolution with ranking-based mutation operators,” IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 2066–2081, 2013.
  39. S. García, D. Molina, M. Lozano, and F. Herrera, “A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization,” Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.
  40. X. T. Li, J. N. Wang, and M. H. Yin, “Enhancing the performance of cuckoo search algorithm using orthogonal learning method,” Neural Computing & Applications, vol. 24, no. 6, pp. 1233–1247, 2014.