Mathematical Problems in Engineering
Volume 2016, Article ID 4839763, 11 pages
http://dx.doi.org/10.1155/2016/4839763
Research Article

Cuckoo Search Algorithm with Hybrid Factor Using Dimensional Distance

1College of Computer and Information Science, Fujian Agriculture and Forestry University, Fuzhou 350002, China
2College of Management, Fujian University of Traditional Chinese Medicine, Fuzhou 350002, China

Received 15 May 2016; Accepted 6 November 2016

Academic Editor: Salvatore Alfonzetti

Copyright © 2016 Yaohua Lin et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This paper proposes a hybrid factor strategy for the cuckoo search algorithm that combines a constant factor and a varied factor. The constant factor is applied to the dimensions of each solution that are closer to the corresponding dimensions of the best solution, while the varied factor, drawn from a random or a chaotic sequence, is applied to the farther dimensions. For each solution, a dimension whose distance to the corresponding dimension of the best solution is shorter than the mean of all its dimensional distances is regarded as a closer one, and otherwise as a farther one. A suite of 20 benchmark functions is employed to verify the performance of the proposed strategy, and the results show that the hybridization improves both effectiveness and efficiency.

1. Introduction

Cuckoo search algorithm (CS), proposed by Yang and Deb in 2009, is a nature-inspired method for solving real-valued numerical optimization problems [1, 2]. The method uses Lévy flights random walk (LFRW) and biased random walk (BRW) to search for new solutions and achieves promising performance on many tough problems. This has attracted many researchers, and numerous studies have been proposed. Some studies have focused on combining CS with other optimization methods [3–13]. Some attempts have been made to improve the search ability of LFRW and BRW [14–32]. Other attention has been paid to CS for combinatorial and multiobjective problems [33–41].

The above studies have made great contributions to CS. Nevertheless, according to the implementation in the literature [2], LFRW, one of the search components, is used iteratively to search for new solutions. LFRW uses a mutation operator to generate new solutions based on the best solution obtained so far. A factor in LFRW is used to keep Lévy flights from being too aggressive; thus, it is suggested to be a constant value, typically 0.01 [2]. In this case, the constant benefits the solutions that are close to the best one but is a disadvantage for those far away from it. To avoid this, Wang et al. [22] proposed a varied factor strategy for CS, named VCS, where a random sequence factor obeying a uniform distribution replaces the constant one. Wang and Zhong [23] used a chaotic sequence factor instead of the constant one, called CCS. However, these approaches assign the same factor, and hence the same scale of step size, to all dimensions of one solution. This may cause some dimensions to be too aggressive when a large factor is sampled, or too inefficient when a small factor is sampled.

In this paper, we aim to avoid the above problem by using different factors for different dimensions of each solution, and we propose a hybrid factor based cuckoo search algorithm, termed HFCS. The hybrid factor strategy (HF) combines the constant factor and the varied factor. The constant factor, typically 0.01, is used for the dimensions that are closer to the corresponding ones of the best solution. The varied factor, drawn from a random sequence or a chaotic sequence, is employed to drive the farther dimensions toward the corresponding ones of the best solution. HFCS selects the dimensions of each solution by using the dimensional distance, defined as the distance between one dimension and the corresponding one of the best solution. If the dimensional distance of a dimension is shorter than the average of all dimensional distances, this dimension is selected as a closer one, and otherwise as a farther one. Experiments are carried out on 20 benchmark functions to test HFCS, and the results show that the hybrid factor strategy improves both effectiveness and efficiency.

The remainder of this paper is organized as follows. Section 2 describes the cuckoo search algorithm and the variants. Section 3 presents the proposed algorithm. Section 4 reports the experimental results. Section 5 concludes this paper.

2. Cuckoo Search Algorithm

2.1. CS

CS, a nature-inspired algorithm based on the obligate brood parasitism of some cuckoo species in combination with the Lévy flights behavior of some birds and fruit flies [1, 2], is a simple yet very promising population-based stochastic search technique. Generally, a nest represents a candidate solution $X_i = (x_{i,1}, x_{i,2}, \ldots, x_{i,D})$ when minimizing an objective function $f(X)$ over the solution space $[L_j, U_j]$, $j = 1, 2, \ldots, D$, where $D$ is the dimensionality of the problem. Like evolutionary algorithms, the iteration process of CS includes an initial phase and an evolutional phase.

In the initial phase, the whole population of solutions is randomly sampled from the solution space by
$$x_{i,j} = L_j + \mathrm{rand}(0,1) \cdot (U_j - L_j), \quad i = 1, 2, \ldots, NP, \; j = 1, 2, \ldots, D, \tag{1}$$
where $\mathrm{rand}(0,1)$ represents a uniformly distributed random variable on the range $[0,1]$ and $NP$ is the population size.
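A minimal Python sketch of this initialization, assuming the bounds are uniform across dimensions (the function and parameter names here are ours, for illustration only):

```python
import numpy as np

def initialize(NP, D, lower, upper, rng=None):
    """Sample NP candidate solutions uniformly from [lower, upper]^D, as in (1)."""
    rng = rng or np.random.default_rng()
    return lower + rng.random((NP, D)) * (upper - lower)
```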

According to the implementation of CS shown in the literature [2], CS iteratively uses two random walks: Lévy flights random walk (LFRW) and biased random walk (BRW) to search for new solutions.

LFRW is a random walk whose step size is drawn from a Lévy distribution. At generation $t$, LFRW can be formulated as follows:
$$X_i^{t+1} = X_i^{t} + \alpha \oplus \mathrm{L\acute{e}vy}(\beta), \tag{2}$$
where $\alpha$ is a step size related to the scales of the problem and $\oplus$ means entry-wise multiplication. $\mathrm{L\acute{e}vy}(\beta)$ is drawn from a Lévy distribution for large steps:
$$\mathrm{L\acute{e}vy}(\beta) \sim u = t^{-1-\beta}, \quad 0 < \beta \le 2. \tag{3}$$
In CS, LFRW is employed to search for new solutions around the best solution obtained so far and is implemented according to the following equation [2]:
$$X_i^{t+1} = X_i^{t} + \alpha_0 \cdot \frac{\phi \cdot \mu}{|\nu|^{1/\beta}} \cdot \left(X_i^{t} - X_{\mathrm{best}}^{t}\right), \tag{4}$$
where $\alpha_0$ is a factor (generally, $\alpha_0 = 0.01$) and $X_{\mathrm{best}}^{t}$ represents the best solution obtained so far, and
$$\phi = \left( \frac{\Gamma(1+\beta) \cdot \sin(\pi\beta/2)}{\Gamma\left((1+\beta)/2\right) \cdot \beta \cdot 2^{(\beta-1)/2}} \right)^{1/\beta}, \tag{5}$$
where $\beta$ is a constant suggested to be 1.5, $\mu$ and $\nu$ are random numbers drawn from a normal distribution with mean of 0 and standard deviation of 1, and $\Gamma$ is a gamma function.
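For illustration, the scheme behind (4) and (5) and one LFRW move can be sketched in Python as follows (a minimal sketch; the identifiers `levy_step` and `lfrw` are our own):

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(D, beta=1.5, rng=None):
    """Draw a D-dimensional Levy-distributed step, following (5)."""
    rng = rng or np.random.default_rng()
    phi = (gamma(1 + beta) * sin(pi * beta / 2)
           / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = rng.normal(0.0, 1.0, D)   # mu ~ N(0, 1)
    nu = rng.normal(0.0, 1.0, D)   # nu ~ N(0, 1)
    return phi * mu / np.abs(nu) ** (1 / beta)

def lfrw(X_i, X_best, alpha0=0.01, beta=1.5, rng=None):
    """One Levy flights random walk around the best solution, as in (4)."""
    return X_i + alpha0 * levy_step(X_i.size, beta, rng) * (X_i - X_best)
```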

BRW is used to discover new solutions far enough away from the current best solution by far field randomization [1]. First, a trial solution is built by mutating the current solution as the base vector with two randomly selected solutions as perturbed vectors. Second, a new solution is generated by a crossover operator from the current and the trial solutions. BRW can be formulated as follows:
$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^{t} + r \cdot \left(x_{m,j}^{t} - x_{n,j}^{t}\right), & \text{if } \mathrm{rand}_j > p_a, \\ x_{i,j}^{t}, & \text{otherwise}, \end{cases} \tag{6}$$
where the random indexes $m$ and $n$ point to the $m$th and $n$th solutions in the population, respectively, $j$ denotes the $j$th dimension of the solution, $r$ and $\mathrm{rand}_j$ are random numbers on the range $[0,1]$, and $p_a$ is a fraction probability.
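A matching sketch of (6), under the same assumptions as the snippets above:

```python
import numpy as np

def brw(pop, i, pa, rng=None):
    """Biased random walk (6): crossover of solution i with a differential
    perturbation built from two randomly selected solutions m and n."""
    rng = rng or np.random.default_rng()
    NP, D = pop.shape
    m, n = rng.choice(NP, size=2, replace=False)
    trial = pop[i] + rng.random() * (pop[m] - pop[n])
    keep = rng.random(D) <= pa          # with probability pa, keep the old value
    return np.where(keep, pop[i], trial)
```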

After each random walk, CS selects the better solution between the newly generated and the current solution according to their fitness, using the greedy strategy. At the end of each iteration, the best solution is updated.

2.2. Variants of CS

Although CS was developed only recently, it has already been studied extensively.

Some studies attempt to combine CS with other optimization techniques. Wang et al. [3] and Ghodrati and Lotfi [4], respectively, proposed hybrids of CS with particle swarm optimization. Wang et al. [5] applied differential evolution to optimize the process of selecting cuckoos of the CS model during nest updating. Babukartik and Dhavachelvan [6] proposed a hybrid algorithm combining ant colony optimization and CS. Srivastava et al. [7] combined the CS algorithm's strength of converging to a solution in minimal time with the tabu mechanism of backtracking from local optima via Lévy flights. Liu and Fu [8] applied the local search mechanism of the frog leaping algorithm to enhance the local search ability of CS. Other techniques, such as the orthogonal learning strategy [9], the cooperative coevolutionary (CC) framework [10–12], and teaching-learning-based optimization [13], have also been hybridized to enhance the search ability of CS.

Some variants have paid attention to improving the search ability of LFRW and BRW. Walton et al. [14] modified the step size of Lévy flights to decrease as the number of generations increases, in order to increase the convergence rate. Ljouad et al. [15] modified the Lévy flights model with an adaptive step size based on the number of generations. Valian et al. [16], Wang and Zhou [17], and Mohapatra et al. [18] proposed adaptive step sizes of Lévy flights according to different equations with maximal and minimal step sizes, respectively. Wang et al. [19] and Huang et al. [20] used chaotic sequences to change the step size of Lévy flights. Jia et al. [21] proposed a variable step length of Lévy flights and a method for the discovering probability. Wang et al. [22, 23], respectively, used a random sequence and a chaotic sequence as the factor instead of the constant 0.01 in Lévy flights. Coelho et al. [24] integrated a differential operator into Lévy flights to search for new solutions. Mlakar et al. [25] proposed a hybrid algorithm using explicit control of exploration search strategies within the CS algorithm. Ding et al. [26] proposed heterogeneous search strategies based on the quantum mechanism. Wang et al. [27] employed a probabilistic mutation to enhance the Lévy flights. Wang and Zhong [28] added a crossover-like operator in the search scheme of Lévy flights using a one-position inheritance mechanism. Inspired by social learning and cognitive learning, Li and Yin [29] added these two learning parts into Lévy flights and into BRW. Wang et al. [30, 31] utilized orthogonal crossover and dimension-by-dimension improvement to enhance the search ability of BRW, respectively. Li and Yin [32] used two new mutation operators based on the rand and best individuals among the entire population to enhance the search ability of BRW.

Other versions have focused on combinatorial and multiobjective problems. Yang and Deb [33], Hanoun et al. [34], and Chandrasekaran and Simon [35] modified cuckoo search to solve multiobjective optimization problems. Ouyang et al. [36] and Ouaarab et al. [37] proposed improved CS variants to solve the travelling salesman problem. Zhou et al. [38] applied an improved CS to the planar graph coloring problem. Marichelvam et al. [39] and Dasgupta and Das [40] presented discrete versions for flow shop scheduling problems. Teymourian et al. [41] applied CS to the capacitated vehicle routing problem.

3. HFCS

According to the implementation of CS [2], the factor, for example, 0.01, is used to keep Lévy flights from being too aggressive and to try to keep the solutions from jumping outside the search space. In this case, a small step size results from this small factor. Obviously, this helps the solutions near the best one, but it is less helpful for the solutions far away from the best one, resulting in slow convergence. Wang et al. [22, 23] used a random sequence and a chaotic sequence instead of the constant factor and demonstrated improvements in convergence and solution quality. However, a factor that is constant, or drawn once per solution from a random or chaotic sequence, gives every dimension of that solution the same scale. This can mean that dimensions near the corresponding dimensions of the best solution receive too large a scale when a random or chaotic sequence is used, while dimensions far away from the corresponding dimensions of the best solution receive too small a scale under the constant factor. To remedy this problem, a hybrid factor based cuckoo search, called HFCS, is proposed and presented in Algorithm 1.

Algorithm 1: HFCS.
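Algorithm 1 appears as a figure in the original layout; the following Python sketch reconstructs its main loop from the description above and from (7) and (8) below, reusing `levy_step` and `brw` from the sketches in Section 2.1. It is a minimal sketch under our own naming and structural assumptions, not the authors' reference implementation:

```python
def hfcs(f, lower, upper, NP, D, pa=0.25, max_fes=150000,
         factor_kind="random", rng=None):
    """Hybrid factor CS: the LFRW factor is 0.01 on dimensions close to the
    best solution and a random/chaotic value on the farther dimensions."""
    rng = rng or np.random.default_rng()
    pop = lower + rng.random((NP, D)) * (upper - lower)
    fit = np.array([f(x) for x in pop])
    fes = NP
    c = rng.random()                                # logistic-map state (cHFCS)
    while fes < max_fes:
        best = pop[fit.argmin()].copy()
        for i in range(NP):                         # LFRW phase, hybrid factor
            d = np.abs(pop[i] - best)               # dimensional distances, (7)
            md = d.mean()                           # mean dimensional distance, (8)
            if factor_kind == "chaotic":
                c = 4.0 * c * (1.0 - c)             # logistic map
                varied = c
            else:
                varied = rng.random()
            alpha = np.where(d < md, 0.01, varied)  # constant vs. varied factor
            new = pop[i] + alpha * levy_step(D, rng=rng) * (pop[i] - best)
            new = np.clip(new, lower, upper)
            fn = f(new); fes += 1
            if fn < fit[i]:                         # greedy selection
                pop[i], fit[i] = new, fn
        for i in range(NP):                         # biased random walk phase
            new = np.clip(brw(pop, i, pa, rng), lower, upper)
            fn = f(new); fes += 1
            if fn < fit[i]:
                pop[i], fit[i] = new, fn
    i_best = fit.argmin()
    return pop[i_best], fit[i_best]
```

The only change relative to plain CS is the per-dimension `alpha`; everything else follows the standard LFRW/BRW loop with greedy selection.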

It is worth pointing out that the key part of HFCS is selecting the dimensions where the constant factor or the varied factor is used. Herein, the dimensional distance and the mean dimensional distance, similar to [42], are used and shown in (7) and (8):
$$d_{i,j} = \left| x_{i,j} - x_{\mathrm{best},j} \right|, \tag{7}$$
where $d_{i,j}$ denotes the $j$th dimensional distance from the $i$th solution to the best one, and
$$md_i = \frac{1}{D} \sum_{j=1}^{D} d_{i,j}, \tag{8}$$
where $md_i$ is the average dimensional distance of the $i$th solution.

For each dimension of each solution, the $j$th dimension is regarded as closer to the corresponding dimension of the best solution, called a closer dimension, when the $j$th dimensional distance is shorter than the mean dimensional distance; otherwise the $j$th dimension is called a farther one. For a closer dimension, the constant factor 0.01 is used in (4), while the random sequence or the chaotic sequence is employed in (4) for a farther one.
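As a usage illustration, the sketch above could be driven as follows (the sphere objective and the bounds are illustrative placeholders, not one of the benchmark functions of Section 4):

```python
def sphere(x):
    """Illustrative objective: f(x) = sum(x_j^2), global optimum 0 at the origin."""
    return float(np.sum(x * x))

best_x, best_f = hfcs(sphere, lower=-100.0, upper=100.0, NP=20, D=30,
                      factor_kind="chaotic", max_fes=100000)
print(best_f)   # should approach 0 as the evaluation budget grows
```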

4. Experiments and Results

In this section, HFCS is tested on 20 benchmark functions [43]. These 20 benchmark functions include 2 unimodal functions, 8 multimodal functions, and 10 rotated and/or shifted functions. More details about the 20 functions can be found in [43, 44].

HFCS has the same parameters as CS, and we use the same settings for them unless a change is mentioned. The parameter $p_a$ is 0.25. Each algorithm is run 25 times for each function with the dimension $D = 10$, 30, and 50, respectively. The population size of each algorithm is the same for $D = 10$ and $D = 30$, while it is 30 in the case of $D = 50$. The maximum number of function evaluations is fixed and shared by all algorithms. Note that HFCS can hybridize the constant factor with either the random sequence or the chaotic sequence, resulting in two algorithms: HFCS combining the constant factor with the random sequence generated as in [22], termed rHFCS, and HFCS combining the constant factor with the chaotic sequence generated as in [23], called cHFCS.

Error and convergence speed are employed to analyze HFCS. Error, the function fitness error of the solution obtained by an algorithm, is defined as $\mathrm{Error} = f(X) - f(X^{*})$, where $X^{*}$ is the known global optimum of the function. Moreover, the average and standard deviation of the best error values, presented as “mean ± std,” are used in the different tables. Additionally, the Wilcoxon signed-rank test at the 5% significance level is used to show the differences in Error between two algorithms. The “+” symbol shows that HFCS outperforms the compared algorithm at the 5% significance level, the “−” symbol shows that the compared algorithm does better than HFCS, and the “=” symbol means that HFCS is statistically equal to the compared algorithm. We give the total number of statistically significant cases at the bottom of each table.
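For readers reproducing the statistics, the pairwise test can be computed with SciPy along these lines (a hedged sketch; the error arrays below are synthetic placeholders, not the paper's data):

```python
import numpy as np
from scipy.stats import wilcoxon

# Synthetic per-function mean Errors over 20 functions, for illustration only.
rng = np.random.default_rng(0)
err_cs = rng.random(20)
err_hfcs = err_cs * rng.random(20)        # pretend HFCS errors are smaller

stat, p = wilcoxon(err_cs, err_hfcs)      # paired multiple-problem Wilcoxon test
print(f"W = {stat:.1f}, p = {p:.4f}")     # p < 0.05 suggests a significant difference
```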

Convergence speed is measured by the number of function evaluations (FES) spent until the algorithm reaches the given Error, $10^{-6}$ or $10^{-2}$ as suggested in [43], within the maximum function evaluations. The average and standard deviation of FES and the number of successful runs (SR) to the given Error over 25 runs, presented as “mean ± std (SR),” are used in Table 3. The convergence speed is also shown graphically by convergence graphs, which plot the mean Error of the best solution over the iteration process, averaged over all runs.

4.1. Performance of HFCS

To show the effect of the proposed algorithm, Table 1 lists the Error obtained by CS, rHFCS, and cHFCS, and Table 2 lists the results of the multiple-problem Wilcoxon’s test, performed similarly to [45, 46].

Table 1: Error obtained by CS, rHFCS, and cHFCS for 30-dimensional functions.
Table 2: Results of the multiple-problem Wilcoxon’s test for HFCS and CS for 20 functions at $D = 30$.
Table 3: FES obtained by CS, rHFCS, and cHFCS at $D = 30$.

It can be seen from Table 1 that the hybrid factor strategy overall improves solution quality in terms of Error. For the 2 unimodal functions, rHFCS and cHFCS produce more accurate solutions than CS. For the 8 multimodal functions, rHFCS obtains solutions with higher accuracy, and cHFCS does the same. For the 10 rotated and/or shifted functions, rHFCS and cHFCS both greatly improve the accuracy of solutions except for two of them. According to the summary of “+,” “=,” and “−,” rHFCS wins against CS on 16 out of 20 functions, ties CS on 2 out of 20 functions, and loses to CS on 2 out of 20 functions. cHFCS is superior to CS on 16 out of 20 functions, equal to CS on 2 out of 20 functions, and inferior to CS on 2 out of 20 functions. In addition, in terms of the results in Table 2, rHFCS and cHFCS both obtain a clearly higher $R^{+}$ value than $R^{-}$ value. This suggests that rHFCS and cHFCS clearly outperform CS.

Moreover, Table 3 lists the function evaluations (FES) required to reach the given Error within the maximum function evaluations. It can be observed from Table 3 that rHFCS and cHFCS overall converge more quickly to the given Error. For instance, on three functions CS, rHFCS, and cHFCS share the same stable convergence to the given Error in terms of SR; however, rHFCS and cHFCS converge more quickly according to FES. On three other functions, rHFCS and cHFCS show better stability and quicker convergence. On a further three, rHFCS and cHFCS still converge more quickly than CS, although these two algorithms do not achieve stable convergence in terms of SR. On one function, CS converges stably to the given Error, but rHFCS and cHFCS converge more quickly.

To further show the convergence performance, the convergence curves obtained by CS, rHFCS, and cHFCS for parts of functions are plotted in Figure 1.

Figure 1: Convergence curves of CS, rHFCS, and cHFCS.

It can be observed from Figure 1 that rHFCS and cHFCS achieve an overall better convergence speed in terms of the curves. For the functions solved better by rHFCS and cHFCS, shown in Figures 1(a), 1(c), 1(d), 1(e), and 1(f), rHFCS and cHFCS converge more quickly than CS. For one function, cHFCS attains the same solution accuracy as CS, but cHFCS converges more quickly at the beginning of the iteration process. It is interesting that CS does better than rHFCS and cHFCS in converging to the given Error, as shown in Table 3; however, because of its lack of convergence stability, rHFCS and cHFCS show better convergence curves; see Figure 1(f).

According to Error, FES, and the convergence curves, HFCS overall improves solution quality and convergence speed. This is because different factors provide different step size information, which improves the search ability.

4.2. Scalability of HFCS

The scalability study investigates the performance of HFCS as the dimensionality of the problem changes. The experiments are carried out on the 20 functions at $D = 10$ and $D = 50$, since they are defined for up to 50 dimensions [44]. The results are tabulated in Tables 4 and 5.

Table 4: Error obtained by CS and HFCS for 10- and 50-dimensional benchmark functions.
Table 5: Results of the multiple-problem Wilcoxon’s test for HFCS and CS for 20 functions at $D = 10$ and $D = 50$.

In the case of $D = 10$, according to Error, HFCS exhibits a great improvement on most functions. For example, rHFCS produces more accurate solutions on all but three functions, while cHFCS gains solutions with higher accuracy on all but two. Furthermore, in terms of the totals of “+,” “=,” and “−,” rHFCS performs better than CS on 16 out of 20 functions, is equivalent to CS on 3 out of 20 functions, and performs worse than CS on 1 out of 20. cHFCS wins against CS on 17 out of 20 functions, ties CS on 2 out of 20, and loses to CS on 1 out of 20. Additionally, rHFCS and cHFCS obtain a markedly higher $R^{+}$ value than $R^{-}$ value, as listed in Table 5.

When $D = 50$, HFCS still produces higher-quality solutions on most functions. For instance, rHFCS achieves better solutions on all but four functions, and cHFCS likewise does well on all but four. rHFCS outperforms, ties, and loses to CS on 14, 4, and 2 out of 20 functions, respectively. cHFCS is superior to CS on 15 out of 20 functions, equal to CS on 3 out of 20 functions, and inferior to CS on 2 out of 20. Moreover, as reported in Table 5, rHFCS and cHFCS also obtain a remarkably higher $R^{+}$ value than $R^{-}$ value.

In summary, these results suggest that the improvement of HFCS is stable as the dimensionality of the problems increases.

4.3. Comparison with VCS and CCS

The comparison is made between HFCS and both VCS and CCS, since the random sequence and the chaotic sequence are used in these two algorithms, respectively. Table 6 lists the Error obtained by VCS and rHFCS, while Table 7 shows the Error achieved by cHFCS and by CCS with the chaotic sequence used as the factor in (4) and the parameter $p_a$ set to 0.25, termed fCCS. Table 8 reports the results of the multiple-problem Wilcoxon’s test between HFCS and VCS/fCCS for all functions.

Table 6: Error obtained by VCS and rHFCS at $D = 30$.
Table 7: Error obtained by fCCS and cHFCS at $D = 30$.
Table 8: Results of the multiple-problem Wilcoxon’s test for HFCS, VCS, and fCCS for 20 functions at $D = 30$.

It can be observed from Table 6 that the hybrid factor combining the constant and the random sequence overall performs better than the random sequence alone. For unimodal functions, compared with VCS, rHFCS produces solutions with higher accuracy. Moreover, for multimodal functions, rHFCS not only keeps the same Error as VCS on three functions but also enhances the accuracy on all but two. Additionally, for rotated and/or shifted functions, the same conclusion as for multimodal functions can be drawn for rHFCS. In all, rHFCS outperforms VCS on 7 out of 20 functions, ties VCS on 11 out of 20 functions, and loses to VCS on 2 out of 20. Additionally, rHFCS gets a higher $R^{+}$ value than $R^{-}$ value, and rHFCS and VCS are significantly different at both significance levels. This suggests that rHFCS is overall better than VCS.

Table 7 shows that the hybrid factor combining the constant and the chaotic sequence overall performs better than the chaotic sequence alone. Compared with fCCS, cHFCS works better on unimodal functions. For multimodal functions, cHFCS attains the same performance as fCCS on three functions and increases the accuracy of the solutions on three others. For rotated and/or shifted functions, cHFCS performs better on five of them. In summary, cHFCS is superior to fCCS on 6 out of 20 functions, equal to fCCS on 12 out of 20 functions, and inferior to fCCS on 2 out of 20. In addition, cHFCS gains a higher $R^{+}$ value than $R^{-}$ value, although cHFCS and fCCS are not significantly different at either significance level. This indicates that cHFCS is overall better.

4.4. Effect of Integration into Improved Algorithms

In this section, we investigate the performance of the hybrid factor when integrated into improved algorithms, to analyze its suitability. Considering the implementation of the improved algorithm, we choose the one-position inheritance cuckoo search algorithm [28], called OPICS, for integration, resulting in rHFOPICS and cHFOPICS. rHFOPICS denotes OPICS enhanced with the hybrid factor of the constant and the random sequence, while cHFOPICS denotes OPICS combined with the hybrid factor of the constant and the chaotic sequence. Table 9 lists the Error obtained by rHFOPICS, cHFOPICS, and OPICS, where the better Error between rHFOPICS and OPICS is highlighted in boldface, and the better Error between cHFOPICS and OPICS is marked with an asterisk. Table 10 lists the results of the multiple-problem Wilcoxon’s test between OPICS and HFOPICS for all functions.

Table 9: Error obtained by OPICS and HFOPICS at $D = 30$.
Table 10: Results of the multiple-problem Wilcoxon’s test for OPICS and HFOPICS for 20 functions at $D = 30$.

It can be seen from Table 9 that the hybrid factor enables OPICS to enhance the accuracy of solutions on most functions. In terms of Error, the results of OPICS are retained on five functions when rHFOPICS and cHFOPICS are applied. For unimodal functions, rHFOPICS and cHFOPICS bring more accurate solutions to one of them. For multimodal functions, rHFOPICS achieves solutions with higher accuracy on all but one, while cHFOPICS does the same on all but two. For rotated and/or shifted functions, rHFOPICS and cHFOPICS perform better than OPICS on all but one. Moreover, Table 10 shows that rHFOPICS gains a higher $R^{+}$ value than $R^{-}$ value, and rHFOPICS and OPICS are significantly different at both significance levels. This suggests that rHFOPICS is significantly better than OPICS. We can also see from Table 10 that cHFOPICS achieves a higher $R^{+}$ value than $R^{-}$ value, and OPICS is significantly inferior to cHFOPICS at one significance level.

5. Conclusion

In this paper, we presented a hybrid factor strategy for CS, called HFCS. The hybrid factor strategy combines the constant factor with a random sequence or a chaotic sequence factor. HFCS employs the dimensional distance to select the dimensions that use the constant factor and those that use the random or chaotic sequence. HFCS was tested on a suite of 20 benchmark functions. The results show that the hybrid factor strategy can effectively improve the performance of CS on most functions, in both solution quality and convergence speed. The results also show that the hybrid factor strategy remains effective and stable as the dimensionality of the problems increases. In addition, the hybrid factor is easy to integrate into other improved algorithms.

There are several interesting directions for future work. First, since the mean dimensional distance is used to select dimensions, the next step is to experiment with different distance measures, for example, the median dimensional distance. Secondly, we plan to investigate the hybrid factor strategy for other improved algorithms. Last but not least, we plan to apply HFCS to some real-world optimization problems.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

This work was supported by the Natural Science Foundation of Fujian Province of China under Grant no. 2016J01280 and the Project of Science and Technology of Fujian Department of Education of China under Grant no. 14156.

References

  1. X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.
  2. X.-S. Yang and S. Deb, “Engineering optimisation by cuckoo search,” International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
  3. F. Wang, L. G. Luo, X.-S. He, and Y. Wang, “Hybrid optimization algorithm of PSO and Cuckoo Search,” in Proceedings of the 2nd International Conference on Artificial Intelligence, Management Science and Electronic Commerce (AIMSEC '11), pp. 1172–1175, Dengfeng, China, August 2011.
  4. A. Ghodrati and S. Lotfi, “A hybrid CS/PSO algorithm for global optimization,” in Intelligent Information and Database Systems, vol. 7198 of Lecture Notes in Computer Science, pp. 89–98, Springer, Berlin, Heidelberg, 2012.
  5. G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and J. Wang, “A hybrid meta-heuristic DE/CS algorithm for UCAV path planning,” Journal of Information and Computational Science, vol. 9, no. 16, pp. 4811–4818, 2012.
  6. R. G. Babukartik and P. Dhavachelvan, “Hybrid algorithm using the advantage of ACO and cuckoo search for job scheduling,” International Journal of Information Technology Convergence and Services, vol. 2, no. 4, pp. 25–34, 2012.
  7. P. R. Srivastava, R. Khandelwal, S. Khandelwal, S. Kumar, and S. S. Ranganatha, “Automated test data generation using cuckoo search and tabu search (CSTS) algorithm,” Journal of Intelligent Systems, vol. 21, no. 2, pp. 195–224, 2012.
  8. X. Y. Liu and M. L. Fu, “Cuckoo search algorithm based on frog leaping local search and chaos theory,” Applied Mathematics and Computation, vol. 266, pp. 1083–1092, 2015.
  9. X. Li, J. N. Wang, and M. H. Yin, “Enhancing the performance of cuckoo search algorithm using orthogonal learning method,” Neural Computing and Applications, vol. 24, no. 6, pp. 1233–1247, 2014.
  10. H. Q. Zheng and Y. Q. Zhou, “A cooperative coevolutionary cuckoo search algorithm for optimization problem,” Journal of Applied Mathematics, vol. 2013, Article ID 912056, 9 pages, 2013.
  11. X.-X. Hu and Y.-L. Yin, “Cooperative co-evolutionary cuckoo search algorithm for continuous function optimization problems,” Pattern Recognition and Artificial Intelligence, vol. 26, no. 11, pp. 1041–1049, 2013.
  12. L. J. Wang, Y. W. Zhong, and Y. L. Yin, “A hybrid cooperative cuckoo search algorithm with particle swarm optimisation,” International Journal of Computing Science and Mathematics, vol. 6, no. 1, pp. 18–29, 2015.
  13. J. D. Huang, L. Gao, and X. Y. Li, “An effective teaching-learning-based cuckoo search algorithm for parameter optimization problems in structure designing and machining processes,” Applied Soft Computing, vol. 36, pp. 349–356, 2015.
  14. S. Walton, O. Hassan, K. Morgan, and M. R. Brown, “Modified cuckoo search: a new gradient free optimisation algorithm,” Chaos, Solitons & Fractals, vol. 44, no. 9, pp. 710–718, 2011.
  15. T. Ljouad, A. Amine, and M. Rziza, “A hybrid mobile object tracker based on the modified Cuckoo Search algorithm and the Kalman Filter,” Pattern Recognition, vol. 47, no. 11, pp. 3597–3613, 2014.
  16. E. Valian, S. Mohanna, and S. Tavakoli, “Improved cuckoo search algorithm for global optimization,” International Journal of Communications and Information Technology, vol. 1, no. 1, pp. 31–44, 2011.
  17. J. Wang and B. H. Zhou, “A hybrid adaptive cuckoo search optimization algorithm for the problem of chaotic systems parameter estimation,” Neural Computing & Applications, vol. 27, no. 6, pp. 1511–1517, 2016.
  18. P. Mohapatra, S. Chakravarty, and P. K. Dash, “An improved cuckoo search based extreme learning machine for medical data classification,” Swarm and Evolutionary Computation, vol. 24, pp. 25–49, 2015.
  19. G.-G. Wang, S. Deb, A. H. Gandomi, Z. Zhang, and A. H. Alavi, “Chaotic cuckoo search,” Soft Computing, vol. 20, no. 9, pp. 3349–3362, 2015.
  20. L. Huang, S. Ding, S. H. Yu, J. Wang, and K. Lu, “Chaos-enhanced Cuckoo search optimization algorithms for global optimization,” Applied Mathematical Modelling, vol. 40, no. 5-6, pp. 3860–3875, 2016.
  21. B. Jia, B. Yu, Q. Wu, C. Wei, and R. Law, “Adaptive affinity propagation method based on improved cuckoo search,” Knowledge-Based Systems, vol. 111, pp. 27–35, 2016.
  22. L. Wang, Y. Yin, and Y. Zhong, “Cuckoo search with varied scaling factor,” Frontiers of Computer Science, vol. 9, no. 4, pp. 623–635, 2015.
  23. L. J. Wang and Y. W. Zhong, “Cuckoo search algorithm with chaotic maps,” Mathematical Problems in Engineering, vol. 2015, Article ID 715635, 14 pages, 2015.
  24. L. D. S. Coelho, C. E. Klein, S. L. Sabat, and V. C. Mariani, “Optimal chiller loading for energy conservation using a new differential cuckoo search approach,” Energy, vol. 75, pp. 237–243, 2014.
  25. U. Mlakar, I. Fister Jr., and I. Fister, “Hybrid self-adaptive cuckoo search for global optimization,” Swarm and Evolutionary Computation, vol. 29, pp. 47–72, 2016.
  26. X. Ding, Z. Xu, N. J. Cheung, and X. Liu, “Parameter estimation of Takagi-Sugeno fuzzy system using heterogeneous cuckoo search algorithm,” Neurocomputing, vol. 151, no. 3, pp. 1332–1342, 2015.
  27. L. Wang, Y. Zhong, and Y. Yin, “Nearest neighbour cuckoo search algorithm with probabilistic mutation,” Applied Soft Computing, vol. 49, pp. 498–509, 2016.
  28. L. J. Wang and Y. W. Zhong, “One-position inheritance based cuckoo search algorithm,” International Journal of Computing Science and Mathematics, vol. 6, no. 6, pp. 546–554, 2015.
  29. X. T. Li and M. H. Yin, “A particle swarm inspired cuckoo search algorithm for real parameter optimization,” Soft Computing, vol. 20, no. 4, pp. 1389–1413, 2016.
  30. L. J. Wang, Y. W. Zhong, and Y. L. Yin, “Orthogonal crossover cuckoo search algorithm with external archive,” Journal of Computer Research and Development, vol. 52, no. 11, pp. 2496–2507, 2015.
  31. L. J. Wang, Y. L. Yin, and Y. W. Zhong, “Cuckoo search algorithm with dimension by dimension improvement,” Journal of Software, vol. 24, no. 11, pp. 2687–2698, 2013.
  32. X. Li and M. Yin, “Modified cuckoo search algorithm with self adaptive parameter method,” Information Sciences, vol. 298, pp. 80–97, 2015.
  33. X.-S. Yang and S. Deb, “Multiobjective cuckoo search for design optimization,” Computers and Operations Research, vol. 40, no. 6, pp. 1616–1624, 2013.
  34. S. Hanoun, D. Creighton, and S. Nahavandi, “A hybrid cuckoo search and variable neighborhood descent for single and multiobjective scheduling problems,” The International Journal of Advanced Manufacturing Technology, vol. 75, no. 9–12, pp. 1501–1516, 2014.
  35. K. Chandrasekaran and S. P. Simon, “Multi-objective scheduling problem: hybrid approach using fuzzy assisted cuckoo search algorithm,” Swarm and Evolutionary Computation, vol. 5, pp. 1–16, 2012.
  36. X. X. Ouyang, Y. Q. Zhou, Q. F. Luo, and H. Chen, “A novel discrete cuckoo search algorithm for spherical traveling salesman problem,” Applied Mathematics & Information Sciences, vol. 7, no. 2, pp. 777–784, 2013.
  37. A. Ouaarab, B. Ahiod, and X.-S. Yang, “Discrete cuckoo search algorithm for the travelling salesman problem,” Neural Computing and Applications, vol. 24, no. 7-8, pp. 1659–1669, 2014.
  38. Y. Q. Zhou, H. Q. Zheng, Q. F. Luo, and J. Wu, “An improved cuckoo search algorithm for solving planar graph coloring problem,” Applied Mathematics and Information Sciences, vol. 7, no. 2, pp. 785–792, 2013.
  39. M. K. Marichelvam, T. Prabaharan, and X. S. Yang, “Improved cuckoo search algorithm for hybrid flow shop scheduling problems to minimize makespan,” Applied Soft Computing, vol. 19, pp. 93–101, 2014.
  40. P. Dasgupta and S. Das, “A discrete inter-species cuckoo search for flowshop scheduling problems,” Computers and Operations Research, vol. 60, pp. 111–120, 2015.
  41. E. Teymourian, V. Kayvanfar, G. Komaki, and M. Zandieh, “Enhanced intelligent water drops and cuckoo search algorithms for solving the capacitated vehicle routing problem,” Information Sciences, vol. 334-335, pp. 354–378, 2016.
  42. X. Jin, Y. Q. Liang, D. P. Tian, and F. Zhuang, “Particle swarm optimization using dimension selection methods,” Applied Mathematics and Computation, vol. 219, no. 10, pp. 5185–5197, 2013.
  43. N. Noman and H. Iba, “Accelerating differential evolution using an adaptive local search,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 107–125, 2008.
  44. P. N. Suganthan, N. Hansen, J. J. Liang et al., “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” KanGAL Report 2005005, Nanyang Technological University, Singapore, 2005.
  45. W. Y. Gong and Z. H. Cai, “Differential evolution with ranking-based mutation operators,” IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 2066–2081, 2013.
  46. S. García, D. Molina, M. Lozano, and F. Herrera, “A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization,” Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.