Cuckoo Search Algorithm with Chaotic Maps
Cuckoo search algorithm is a novel nature-inspired optimization technique based on the obligate brood parasitic behavior of some cuckoo species. It iteratively employs a Lévy flights random walk with a scaling factor and a biased/selective random walk with a fraction probability. Unfortunately, both parameters are kept constant, which makes solution quality and convergence speed sensitive to the problem at hand. In this paper, we propose a variable-value-schema cuckoo search algorithm with chaotic maps, called CCS. In CCS, chaotic maps are utilized to define the scaling factor and the fraction probability, respectively, so as to enhance solution quality and convergence speed. Extensive experiments with different chaotic maps demonstrate the improvement in efficiency and effectiveness.
Cuckoo search algorithm (CS) is a novel nature-inspired approach based on the obligate brood parasitic behavior of some cuckoo species in combination with the Lévy flights behavior of some birds and fruit flies [1, 2]. Subsequent investigations [2, 3] have demonstrated that CS is a simple yet very promising population-based stochastic search technique that uses Lévy flights random walk (LFRW) and biased/selective random walk (BSRW). LFRW, governed by a scaling factor parameter, uses a mutation operator to generate new solutions based on the best solution obtained so far, while BSRW, governed by a fraction probability parameter, employs a crossover operator to search for new solutions. After each random walk, a greedy strategy selects the better of the current and newly generated solutions according to their fitness.
Due to its promising performance, CS has received much attention. Some studies have focused on improving LFRW [4–10] and BSRW [11–15]. Other attempts have been made to combine CS with optimization techniques such as particle swarm optimization [16, 17], Tabu search, differential evolution, ant colony optimization, and the cooperative coevolutionary framework [21, 22]. The above studies have contributed to the research on CS. With the exception of [9, 10], however, they all kept the scaling factor and the fraction probability constant, which makes CS sensitive to the optimization problem. This motivates us to study the scaling factor and the fraction probability under a variable-value schema.
One of the mathematical approaches to such a variable-value schema is chaos. Chaos theory is concerned with chaotic dynamical systems that are highly sensitive to initial conditions. Recently, chaos theory has been integrated into the genetic algorithm, differential evolution, the firefly algorithm, krill herd [27, 28], and biogeography-based optimization [23, 29], and these studies have shown its effectiveness and efficiency. In light of the above investigations, we propose the chaotic cuckoo search algorithm, called CCS, which utilizes chaotic maps to define the scaling factor and the fraction probability. Comprehensive experiments are carried out on 20 benchmark functions, and the results show that chaotic maps can improve the solution quality and convergence speed of CS effectively and efficiently.
The main contribution of this paper is to define variable values for the scaling factor and the fraction probability using chaotic maps. This leads to the following major advantages of our approach: (i) whereas the scaling factor and the fraction probability are conventionally kept constant, a variable-value schema for the two parameters is generally better suited to the optimization problems, resulting in better performance; (ii) owing to the simplicity of chaotic maps, our approach does not increase the overall complexity of CS; (iii) our approach does not alter the structure of CS, so CS remains very simple.
The remainder of this paper is organized as follows. Section 2 describes the standard cuckoo search algorithm. Section 3 presents the cuckoo search algorithm with chaos. Section 4 reports the experimental results. Section 5 concludes the paper.
2. Cuckoo Search Algorithm
CS, developed recently by Yang and Deb [1, 2], is a simple yet very promising population-based stochastic search technique. In general, when CS is used to optimize an objective function over a bounded solution space, a nest represents a candidate solution.
In the initialization phase, CS initializes NP solutions that are randomly sampled from the solution space by

x_{i,j} = lb_j + rand(0, 1) · (ub_j − lb_j),  i = 1, …, NP,  j = 1, …, D,  (1)

where rand(0, 1) represents a uniformly distributed random variable on the range [0, 1], NP is the population size, D is the problem dimension, and lb_j and ub_j are the lower and upper bounds of the j-th dimension.
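The initialization step above can be sketched in a few lines of NumPy; the bound values and population size below are illustrative choices rather than the paper's settings:

```python
import numpy as np

def init_population(pop_size, dim, lb, ub, rng):
    """Sample pop_size candidate nests uniformly from [lb, ub] in every dimension."""
    return lb + rng.random((pop_size, dim)) * (ub - lb)

rng = np.random.default_rng(0)
nests = init_population(25, 10, -5.0, 5.0, rng)  # 25 nests in a 10-dimensional box
```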
After initialization, CS enters an iterative phase in which two random walks, Lévy flights random walk and biased/selective random walk, are employed to search for new solutions. After each random walk, CS selects the better solution from the newly generated and current solutions by fitness, using the greedy strategy. At the end of each iteration, the best solution is updated.
2.1. Lévy Flights Random Walk
Broadly speaking, LFRW is a random walk whose step size is drawn from a Lévy distribution. At generation t, LFRW can be formulated as follows:

x_i^{t+1} = x_i^t + α ⊕ Lévy(β),  (2)

where α is a step size related to the scales of the problem. In CS, LFRW is employed to search for new solutions around the best solution obtained so far. Therefore, the step size can be obtained by the following equation:

α = α_0 (x_i^t − x_best^t),  (3)

where α_0 is a scaling factor (generally, α_0 = 0.01) and x_best^t represents the best solution obtained so far.
The product ⊕ means entry-wise multiplication. Lévy(β) is a random number drawn from a Lévy distribution, whose tail for large steps s behaves as

Lévy(β) ∼ s^{−1−β},  0 < β ≤ 2.  (4)
In implementation, Lévy(β) can be calculated as follows:

Lévy(β) = u / |v|^{1/β},  u ∼ N(0, σ_u^2),  v ∼ N(0, 1),
σ_u = { Γ(1 + β) sin(πβ/2) / [ Γ((1 + β)/2) β 2^{(β−1)/2} ] }^{1/β},  (5)

where β is a constant set to 1.5 in the standard software implementation of CS, u and v are random numbers drawn from normal distributions with mean 0 and the indicated standard deviations, and Γ is the gamma function.
Obviously, (2) can be reformulated as

x_i^{t+1} = x_i^t + α_0 (x_i^t − x_best^t) ⊕ Lévy(β).  (6)
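The reformulated Lévy-flights step can be sketched with Mantegna's algorithm; the function names below are illustrative, with β = 1.5 and α₀ = 0.01 as in the standard implementation:

```python
import math
import numpy as np

def levy_step(beta, size, rng):
    """Mantegna's algorithm: draw Levy(beta)-distributed step lengths."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, size)  # numerator sample, N(0, sigma^2)
    v = rng.normal(0.0, 1.0, size)    # denominator sample, N(0, 1)
    return u / np.abs(v) ** (1 / beta)

def lfrw(nest, best, alpha0, beta, rng):
    """One Levy flights random walk step around the best solution obtained so far."""
    return nest + alpha0 * levy_step(beta, nest.shape, rng) * (nest - best)

rng = np.random.default_rng(1)
new_nest = lfrw(np.ones(5), np.zeros(5), 0.01, 1.5, rng)
```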
2.2. Biased/Selective Random Walk
BSRW is used to discover new solutions far enough away from the current best solution by far-field randomization. First, a trial solution is built by mutation, with the current solution as the base vector and two randomly selected solutions as perturbed vectors. Second, a new solution is generated by a crossover operator from the current and trial solutions. BSRW can be formulated as follows:

x_{i,j}^{t+1} = x_{i,j}^t + r (x_{m,j}^t − x_{n,j}^t) if rand_j > p_a, and x_{i,j}^{t+1} = x_{i,j}^t otherwise,  (7)

where the random indexes m and n denote the m-th and n-th solutions in the population, respectively, j denotes the j-th dimension of the solution, r and rand_j are random numbers on the range [0, 1], and p_a is a fraction probability.
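A vectorized sketch of this random walk, assuming the common implementation in which the two perturbed vectors come from row permutations of the population (names are illustrative):

```python
import numpy as np

def bsrw(nests, pa, rng):
    """Biased/selective random walk: each dimension is either kept or moved
    by a scaled difference of two randomly permuted solutions."""
    n, d = nests.shape
    diff = nests[rng.permutation(n)] - nests[rng.permutation(n)]
    r = rng.random((n, d))             # per-entry step-scaling random numbers
    change = rng.random((n, d)) > pa   # dimensions selected for change
    return np.where(change, nests + r * diff, nests)

rng = np.random.default_rng(2)
pop = rng.random((6, 4))
trial = bsrw(pop, 0.25, rng)
```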
3. Chaotic Cuckoo Search Algorithm
In this section, we first present different chaotic maps. Then, we apply them to define the scaling factor and the fraction probability. Finally, we propose the framework of the cuckoo search algorithm with chaotic maps, called CCS.
3.1. Chaotic Maps
Chaos theory is a field of study in mathematics, with applications in several disciplines including physics, engineering, economics, biology, and philosophy. It studies the behavior of dynamical systems that are highly sensitive to initial conditions, an effect popularly referred to as the butterfly effect. One way to make quantitative statements about the behavior of chaotic systems is through chaotic maps such as the Circle map, Gauss map, Logistic map, Piecewise map, Sine map, Singer map, Sinusoidal map, and Tent map, shown in Table 1. Additionally, the visualization of these chaotic maps with the initial point at 0.7 is plotted in Figure 1. Other chaotic maps can be found in [26, 28].
Figure 1: Visualization of the chaotic maps with the initial point at 0.7: (a) Circle map, (b) Gauss map, (c) Logistic map, (d) Piecewise map, (e) Sine map, (f) Singer map, (g) Sinusoidal map, (h) Tent map.
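To make the idea concrete, here are two canonical textbook maps iterated from the initial point 0.7; the exact parameterizations in Table 1 may differ from these common forms:

```python
def logistic(x, a=4.0):
    """Logistic map: x_{k+1} = a * x_k * (1 - x_k)."""
    return a * x * (1.0 - x)

def tent(x):
    """Tent map (canonical form): x_{k+1} = 2x for x < 0.5, else 2(1 - x)."""
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

def chaotic_sequence(map_fn, x0=0.7, n=5):
    """Iterate a map n times from x0 and collect the visited values."""
    seq, x = [], x0
    for _ in range(n):
        x = map_fn(x)
        seq.append(x)
    return seq

logistic_seq = chaotic_sequence(logistic)  # first value is 4 * 0.7 * 0.3 = 0.84
```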
3.2. Chaotic Maps for the Scaling Factor
As seen from (6), a large scaling factor does not suit problems with a narrow search space, because it may make the Lévy flights random walk too aggressive and cause jumps outside the search domain, wasting function evaluations. Conversely, for a wide search space, a small scaling factor contributes little to search efficiency. Clearly, a constant scaling factor is not optimal across problems. Therefore, we employ chaotic maps to provide chaotic behavior for cuckoo search, defining the scaling factor and rewriting (6) as follows:

x_i^{t+1} = x_i^t + c_t (x_i^t − x_best^t) ⊕ Lévy(β),  (8)

where c_t is a chaotic sequence.
3.3. Chaotic Maps for the Fraction Probability
In (7), the fraction probability p_a controls how many dimensions of a solution are expected to change. For low values of p_a, a large number of dimensions are changed in each generation, which favors the exploration of CS. On the other hand, high values of p_a cause most dimensions of the new solution to be inherited from the current one, which benefits the exploitation of CS. Apparently, a variable value can dynamically balance exploration and exploitation. Thus, we utilize chaotic maps to define the fraction probability and rewrite (7) as follows:

x_{i,j}^{t+1} = x_{i,j}^t + r (x_{m,j}^t − x_{n,j}^t) if rand_j > c′_t, and x_{i,j}^{t+1} = x_{i,j}^t otherwise,  (9)

where c′_t is a chaotic sequence.
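A sketch of driving the fraction probability with a chaotic sequence, one value per generation; the Gauss (mouse) map below uses one common textbook form, which may differ from the definition in the paper's Table 1:

```python
def gauss_map(x):
    """Gauss map (one common form): G(x) = 1/x mod 1, with G(0) = 0."""
    if x == 0.0:
        return 0.0
    return (1.0 / x) % 1.0

pa_values, pa = [], 0.7   # initial chaotic state 0.7, as in the paper's plots
for _ in range(5):
    pa = gauss_map(pa)    # chaotic fraction probability used in this generation
    pa_values.append(pa)
```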
3.4. Framework of CCS
According to the above descriptions, we give the framework of CCS in Algorithm 1.
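A compact, self-contained sketch of the CCS loop is given below. It follows the structure described above (initialization, chaotic LFRW with greedy selection, chaotic BSRW with greedy selection), but substitutes the logistic map for both parameters, where the paper uses the Circle and Gauss maps, and uses illustrative settings throughout; it is not the paper's reference implementation.

```python
import math
import numpy as np

def levy_step(beta, size, rng):
    """Mantegna's algorithm for Levy(beta)-distributed steps."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0.0, sigma, size) / np.abs(rng.normal(0.0, 1.0, size)) ** (1 / beta)

def logistic(x):
    """Logistic map, standing in for the paper's two chaotic maps."""
    return 4.0 * x * (1.0 - x)

def ccs(fn, dim=5, pop=15, lb=-5.0, ub=5.0, iters=200, beta=1.5, seed=0):
    rng = np.random.default_rng(seed)
    nests = lb + rng.random((pop, dim)) * (ub - lb)   # initialization
    fit = np.array([fn(x) for x in nests])
    alpha0 = pa = 0.7                                 # chaotic states, initial point 0.7
    for _ in range(iters):
        alpha0, pa = logistic(alpha0), logistic(pa)   # variable-value parameters
        best = nests[np.argmin(fit)]
        # LFRW around the best solution, then greedy selection
        cand = np.clip(nests + alpha0 * levy_step(beta, nests.shape, rng)
                       * (nests - best), lb, ub)
        cfit = np.array([fn(x) for x in cand])
        win = cfit < fit
        nests[win], fit[win] = cand[win], cfit[win]
        # BSRW with the chaotic fraction probability, then greedy selection
        diff = nests[rng.permutation(pop)] - nests[rng.permutation(pop)]
        cand = np.where(rng.random((pop, dim)) > pa,
                        nests + rng.random((pop, dim)) * diff, nests)
        cand = np.clip(cand, lb, ub)
        cfit = np.array([fn(x) for x in cand])
        win = cfit < fit
        nests[win], fit[win] = cand[win], cfit[win]
    return nests[np.argmin(fit)], float(fit.min())

best_x, best_f = ccs(lambda x: float(np.sum(x * x)))  # sphere function
```

The greedy selection after each walk makes the best fitness non-increasing over iterations, mirroring the selection step of standard CS.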
4. Simulation and Results
In this section, a suite of 20 benchmark functions is utilized to verify the performance of the proposed approach. These 20 benchmark functions can be divided into three groups: (i) unimodal functions; (ii) multimodal functions; and (iii) rotated and/or shifted functions. A more detailed description of them can be found in [36, 37]. Additionally, we use Error, Evaluation, and Convergence graphs as performance evaluation criteria.
Error is the function error, defined as the difference between the fitness of the best solution obtained by the algorithm in a given run and the global optimum of the function. Error is recorded over different runs, and its average and standard deviation are calculated and reported in the tables. Moreover, the Wilcoxon signed-rank test at the 0.05 significance level is used to assess significance between two algorithms. The "−" symbol shows that the null hypothesis is rejected and the first algorithm outperforms the second one. The "+" symbol means that the null hypothesis is rejected and the first algorithm is inferior to the second one. Otherwise, the null hypothesis is accepted and the first algorithm ties the second one. Additionally, the total counts of wins, ties, and losses are summarized at the bottom of the tables.
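This per-function comparison can be reproduced with `scipy.stats.wilcoxon`; the per-run Error values below are made-up numbers purely for illustration:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-run Error values of two algorithms on one function (10 runs).
err_first = np.array([1.0, 1.2, 0.9, 1.1, 1.3, 0.8, 1.05, 0.95, 1.15, 1.25]) * 1e-3
err_second = err_first + np.array([1.0, 2.0, 1.5, 0.5, 2.5, 3.0, 0.8, 1.2, 1.7, 2.2]) * 1e-3

stat, p = wilcoxon(err_first, err_second)
if p < 0.05:
    # null hypothesis rejected: mark which algorithm is better
    symbol = '-' if err_first.mean() < err_second.mean() else '+'
else:
    symbol = '='  # null hypothesis accepted: the two algorithms tie
```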
Evaluation is the number of function evaluations needed to reach the accuracy level ε (10^−2 for some functions) within the maximum number of fitness evaluations, which is set in proportion to the dimension D of the function. Furthermore, we also record Evaluation over different runs and calculate its average and standard deviation, together with the number of successful runs in which an algorithm reached the accuracy level ε within the maximum number of fitness evaluations.
Convergence graphs are the convergence curves of each algorithm on the problems within the maximum number of fitness evaluations. These graphs show the average Error over all runs in the respective experiments.
4.1. Sensitivities to Chaotic Maps
It can be observed from Figure 1 that different chaotic maps show different chaotic behaviors. In this section, therefore, we analyze how different chaotic maps affect the performance of CCS. To verify this sensitivity, we utilize a simple combination in which different chaotic maps are employed to define the scaling factor, while the fraction probability is defined by the Gauss map, in keeping with the low constant value used in CS. In this case, we have cCCS with the Circle map, gCCS with the Gauss map, lgCCS with the Logistic map, pCCS with the Piecewise map, seCCS with the Sine map, srCCS with the Singer map, slCCS with the Sinusoidal map, and tCCS with the Tent map. Table 2 lists the average Error of CCS with the different chaotic maps, and Table 3 gives the results of the Friedman test.
As observed from Table 2, for most functions, CCS with different chaotic maps shows similar average Error. However, Table 3 shows that cCCS is the best, followed by srCCS, pCCS, tCCS, seCCS, slCCS, lgCCS, and gCCS. This suggests that the performance of CCS on some functions is slightly sensitive to the chaotic maps, and that the combination of the Circle map and the Gauss map is the better selection for the cuckoo search algorithm. It is worth noting that many other combinations of chaotic maps could be used. Thus, in future work, we will comprehensively test different combinations in CCS.
4.2. Comparison with CS via Random Value
Note that a random value can also be regarded as a variable-value schema. To show the advantage of CS with chaotic maps, CS with random values, called rCS, is tested on the 20 benchmark functions. In rCS, a random strategy is used to define the scaling factor and the fraction probability, whose values are sampled from a uniform distribution on the range [0, 1]. Table 4 lists the statistical Error, and Table 5 reports the multiple-problem statistical analysis between CCS and rCS for all functions based on the Wilcoxon test, as in [38, 39].
We can find from Table 4 that rCS and CCS each show their advantages on different functions. The two algorithms have the same performance on a handful of functions. Moreover, rCS performs better on five functions, while CCS gains better performance on the others, especially on the rotated and/or shifted functions. According to the statistical results, CCS is superior to rCS on 8 out of 20 functions, equal to rCS on 10, and inferior to rCS on 2.
Additionally, it can be seen from Table 5 that CCS obtains a higher R+ value than R− value. The above suggests that chaotic sequences make a greater and more stable contribution to the performance of CS than random sampling sequences. This is because chaotic sequences are generated deterministically from a dynamical system, while random sampling sequences are nondeterministic and differ from run to run even if the initial state is the same.
4.3. Effect of Chaotic Maps on CS
To show how chaotic maps can improve the performance of CS, we carry out experiments on the 20 benchmark functions at three dimensionalities with correspondingly chosen population sizes, since some of the benchmark functions are defined only up to a maximum dimension. CS and CCS are tested 25 times for each function. The fraction probability of CS is 0.25, while the Circle map and the Gauss map, whose initial values are set to 0.7 as in [23, 26], are used to define the scaling factor and the fraction probability, respectively. Table 6 shows the Error of the two algorithms at the different dimensions.
Table 6 clearly shows that chaotic maps can overall significantly improve the performance of CS according to the average Error at all three dimensionalities.
At the lowest dimensionality, as observed from Table 6, CCS gains solutions with higher accuracy on all functions but one. In terms of the Wilcoxon signed-rank test, CCS performs better on 19 out of 20 functions and shows performance equivalent to CS on the remaining one.
At the middle dimensionality, CCS outperforms CS significantly on the unimodal functions. On the multimodal functions, CCS clearly achieves more accurate solutions than CS and obtains the global optimum on one of them. On the rotated and/or shifted functions, CCS is not significantly inferior to CS on one function and is equivalent to CS on another, but performs better than CS on the other 8 out of 10 functions, reaching the global optimum on one of them. In all, compared with CS, CCS shows better and equivalent performance on 17 and 3 out of the 20 benchmark functions, respectively.
At the highest dimensionality, the accuracy of the solutions of both algorithms is reduced on most functions. However, compared with CS, CCS still achieves more accurate solutions on all functions but one and reaches the global optimum on one function. According to the statistical results, CCS outperforms CS on 18 out of 20 benchmark functions.
Furthermore, to show the convergence speed of CCS in reaching the accuracy level ε, Table 7 lists the Evaluation performance of the two algorithms. Table 7 clearly shows that CCS converges to the accuracy level ε more stably overall. For example, on several functions both CS and CCS reach the accuracy level ε steadily, but CCS converges faster. Moreover, CCS converges more stably on three further functions. In addition, on one function on which CS also converges steadily to the accuracy level, CCS still has the faster convergence speed.
Additionally, the convergence graphs of CS and CCS for some functions are plotted in Figure 2. It can be observed that CCS apparently converges faster than CS in terms of the convergence curves.
According to the Error, Evaluation, and Convergence graphs, CCS overall significantly improves the solution quality and convergence speed of CS. This is because chaotic maps provide varied search step sizes and more diverse learning from other solutions, which benefits the search ability of CS. Additionally, the scalability analysis suggests that the advantage of CCS over CS is overall stable as the dimensionality of the problems increases.
4.4. Sensitivities to Initial Value of Chaotic Maps
It is worth pointing out that chaotic sequences are highly sensitive to the initial condition. To show how the initial value affects the performance of CCS, we perform experiments on the chaotic maps with different initial values. The results are listed in Table 8, where the initial values are 0.25 and 0.5, giving CCS25 and CCS5, respectively. The other parameters are kept unchanged.
As seen from Table 8, the performance of CCS is only weakly influenced by the initial value of the chaotic maps in terms of Error. CCS25 obtains the highest accuracy on three functions, while CCS5 brings the most accurate solutions on five. However, CCS achieves the most accurate solutions on most functions. According to the statistical results, CCS shows better performance than CCS25 and CCS5 on 6 out of 20 functions and ties CCS25 and CCS5 on 11 and 12 out of 20 functions, respectively. This suggests that the default initial value of 0.7 is the better selection.
4.5. Comparison with Other Improved CS Algorithms
To show the competitiveness of CCS against other improved CS algorithms, we compare it with three improved versions, called ICS, CSPSO, and OLCS. Note that ICS defines the scaling factor and the fraction probability in a variable-value schema based on maximum and minimum parameters. The results are reported in Tables 9, 10, and 11, respectively.
As observed from Table 9, each algorithm shows its advantage on part of the functions. For example, ICS performs better on five functions. CSPSO gains the most accurate solution on one function. OLCS obtains more accurate solutions on four functions and reaches the global optima on two. CCS achieves the global solutions on two functions and shows its advantage on the rotated or shifted functions. Nevertheless, according to the statistical results, CCS outperforms ICS, CSPSO, and OLCS on 9, 16, and 13 out of the 20 functions, respectively. Moreover, Table 10 shows that CCS yields higher R+ than R− values in all cases. In addition, it can be seen clearly from Table 11 that CCS gains the first average ranking, followed by ICS, OLCS, and CSPSO.
4.6. Contribution of Each Chaotic Map
CCS shows its promising performance by using two chaotic maps simultaneously to define the scaling factor and the fraction probability. In this case, the two chaotic maps make a cooperative contribution to the performance of CCS. In this section, therefore, we discuss the contribution of each chaotic map to the performance of CCS. To analyze it, we consider two derived algorithms: CCS1 and CCS2. The former uses a chaotic map to define the scaling factor and keeps the original BSRW, while the latter utilizes a chaotic map to define the fraction probability and keeps the original LFRW. CCS1 and CCS2 are run on the 20 benchmark functions, and the results are listed in Table 12.
It can be observed from Table 12 that a single chaotic map makes a different contribution to the performance of CCS on different functions. Compared with CS, CCS1 alone brings solutions with higher accuracy on eight functions, while CCS2 alone achieves more accurate solutions on one. Owing to these more accurate solutions, CCS yields better performance. Moreover, Table 12 suggests that CCS1 and CCS2 both achieve better performance and contribute cooperatively to the performance of CCS. For example, on most of the rotated and/or shifted functions, CCS1 and CCS2 obtain slightly more accurate solutions, but CCS performs better still owing to their cooperative contribution.
5. Conclusion and Future Work
In CS, the scaling factor and the fraction probability are kept constant, which makes solution quality and convergence speed sensitive to the problem at hand. In this paper, we employed chaotic maps to define the scaling factor and the fraction probability under a variable-value schema and proposed the chaotic cuckoo search algorithm, called CCS. Comprehensive experiments were carried out on 20 benchmark functions to test the performance of CCS. The results show that chaotic maps can improve the performance of CS effectively and efficiently. The scalability study reveals that the advantage of CCS over CS is overall stable as the dimensionality of the problems increases. The comparison with another study on the scaling factor and the fraction probability verifies that chaotic maps are a better choice for defining the variable-value schema.
There are several interesting directions for future work. First, it would be interesting to test different combinations of chaotic maps to find the optimal one. Second, we plan to integrate chaotic maps into other improved CS algorithms to further verify their efficiency and effectiveness. Last but not least, we plan to apply CCS to some real-world optimization problems for further examination.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors are very grateful to the editor and the anonymous reviewers for their constructive comments and suggestions. This work was supported by the Natural Science Foundation of Fujian Province of China under Grant no. 2013J01216.
References
P. Nasa-Ngium, K. Sunat, and S. Chiewchanwattana, "Enhancing modified cuckoo search by using Mantegna Lévy flights and chaotic sequences," in Proceedings of the 10th International Joint Conference on Computer Science and Software Engineering (JCSSE '13), pp. 53–57, IEEE, Maha Sarakham, Thailand, May 2013.
S. Das, P. Dasgupta, and B. K. Panigrahi, "Inter-species cuckoo search via different Lévy flights," in Proceedings of the 4th International Conference on Swarm, Evolutionary, and Memetic Computing (SEMCCO '13), B. K. Panigrahi, P. N. Suganthan, S. Das, and S. S. Dash, Eds., vol. 8297 of Lecture Notes in Computer Science, part I, pp. 515–526, Springer International Publishing, Cham, Switzerland, 2013.
E. Valian, S. Mohanna, and S. Tavakoli, "Improved cuckoo search algorithm for global optimization," International Journal of Communications and Information Technology, vol. 1, no. 1, pp. 31–44, 2011.
M. Tuba, M. Subotic, and N. Stanarevic, "Performance of a modified cuckoo search algorithm for unconstrained optimization problems," WSEAS Transactions on Systems, vol. 2, no. 1, pp. 263–268, 2012.
F. Wang, X. S. He, L. G. Luo, and Y. Wang, "Hybrid optimization algorithm of PSO and cuckoo search," in Proceedings of the 2nd International Conference on Artificial Intelligence, Management Science and Electric Commerce (AIMSEC '11), pp. 1172–1175, Zhengzhou, China, 2011.
A. Ghodrati and S. Lotfi, "A hybrid CS/PSO algorithm for global optimization," in Intelligent Information and Database Systems: 4th Asian Conference, ACIIDS 2012, Kaohsiung, Taiwan, March 19–21, 2012, Proceedings, Part III, vol. 7198 of Lecture Notes in Computer Science, pp. 89–98, Springer, Berlin, Germany, 2012.
G. G. Wang, L. H. Guo, H. Duan, and L. Liu, "A hybrid meta-heuristic DE/CS algorithm for UCAV path planning," Journal of Information & Computational Science, vol. 9, no. 16, pp. 4811–4818, 2012.
X.-X. Hu and Y.-L. Yin, "Cooperative co-evolutionary cuckoo search algorithm for continuous function optimization problems," Pattern Recognition and Artificial Intelligence, vol. 26, no. 11, pp. 1041–1049, 2013.
Z. Y. Guo, B. Chen, M. Ye, and B. Cao, "Self-adaptive chaos differential evolution," in Advances in Natural Computation: Second International Conference, ICNC 2006, Xi'an, China, September 24–28, 2006, Proceedings, Part I, L. Jiao, L. Wang, X.-B. Gao, J. Liu, and F. Wu, Eds., vol. 4221 of Lecture Notes in Computer Science, pp. 972–975, Springer, Berlin, Germany, 2006.
R. C. Hilborn, Chaos and Nonlinear Dynamics: An Introduction for Scientists and Engineers, Oxford University Press, New York, NY, USA, 2nd edition, 2004.
R. L. Devaney, An Introduction to Chaotic Dynamical Systems, Addison-Wesley, Redwood City, Calif, USA, 1986.
E. Ott, Chaos in Dynamical Systems, Cambridge University Press, Cambridge, UK, 2002.
P. N. Suganthan, N. Hansen, J. J. Liang et al., "Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization," Tech. Rep., Nanyang Technological University, Singapore, 2005.
S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.