Abstract

This paper proposes a hybrid factor strategy for the cuckoo search algorithm by combining a constant factor and a varied factor. The constant factor is applied to the dimensions of each solution that are closer to the corresponding dimensions of the best solution, while the varied factor, using a random or a chaotic sequence, is applied to the farther dimensions. For each solution, a dimension whose distance to the corresponding dimension of the best solution is shorter than the mean of all dimensional distances is regarded as a closer one, and otherwise as a farther one. A suite of 20 benchmark functions is employed to verify the performance of the proposed strategy, and the results show that the hybridization improves both effectiveness and efficiency.

1. Introduction

Cuckoo search algorithm (CS), proposed by Yang and Deb in 2009, is a nature-inspired method for solving real-valued numerical optimization problems [1, 2]. The method utilizes Lévy flights random walk (LFRW) and biased random walk (BRW) to search for new solutions and achieves promising performance on many tough problems. This has attracted many researchers, and numerous studies have been proposed. Some studies have focused on combining CS with other optimization methods [3–13]. Some attempts have been made to improve the search ability of LFRW and BRW [14–32]. Other work has focused on CS for combinatorial and multiobjective problems [33–41].

The above studies have made great contributions to CS. Nevertheless, according to the implementation in the literature [2], Lévy flights random walk (LFRW), one of the search components, is used iteratively to search for new solutions. LFRW uses a mutation operator to generate new solutions based on the best solution obtained so far. A factor in LFRW is utilized to keep Lévy flights from being too aggressive; thus, it is suggested to be a constant value, typically 0.01 [2]. In this case, the constant factor benefits the solutions that are close to the best one, but it disadvantages those far away from the best one. To address this, Wang et al. [22] proposed a varied factor strategy for CS, named VCS, where a random sequence factor obeying a uniform distribution replaces the constant one. Wang and Zhong [23] used a chaotic sequence factor instead of the constant one, called CCS. However, in these approaches the step sizes of all dimensions of one solution share the same scale because they use the same factor. This may cause some dimensions to be too aggressive when a large factor is sampled, or too inefficient when the factor is small.

In this paper, we aim to avoid the above problem by using different factors for the dimensions of each solution, and we propose a hybrid factor based cuckoo search algorithm, termed HFCS. The hybrid factor strategy (HF) combines the constant factor and the varied factor. The constant factor, typically 0.01, is used to benefit the dimensions that are closer to the corresponding dimensions of the best solution. The varied factor, using a random sequence or a chaotic sequence, is employed to drive the farther dimensions toward the corresponding dimensions of the best solution. HFCS selects the dimensions of each solution by the dimensional distance, defined as the distance between one dimension and the corresponding dimension of the best solution. If the dimensional distance of a dimension is shorter than the average of all dimensional distances, this dimension is regarded as a closer one; otherwise, it is a farther one. Experiments are carried out on 20 benchmark functions to test HFCS, and the results show that the hybrid factor strategy improves both effectiveness and efficiency.

The remainder of this paper is organized as follows. Section 2 describes the cuckoo search algorithm and the variants. Section 3 presents the proposed algorithm. Section 4 reports the experimental results. Section 5 concludes this paper.

2. Cuckoo Search Algorithm

2.1. CS

CS, a new nature-inspired algorithm based on the obligate brood parasitic behavior of some cuckoo species in combination with the Lévy flights behavior of some birds and fruit flies [1, 2], is a simple yet very promising population-based stochastic search technique. Generally, a nest represents a candidate solution $x = (x_1, x_2, \ldots, x_D)$ when minimizing an objective function $f(x)$ over the solution space $x_j \in [L_j, U_j]$, $j = 1, 2, \ldots, D$, where $D$ is the dimensionality. Like evolutionary algorithms, the iteration process of CS includes an initial phase and an evolutionary phase.

In the initial phase, the whole population of solutions is randomly sampled from the solution space by

$$x_{i,j} = L_j + \text{rand}(0, 1) \cdot (U_j - L_j), \quad i = 1, 2, \ldots, NP, \; j = 1, 2, \ldots, D, \quad (1)$$

where rand(0, 1) represents a uniformly distributed random variable on the range $[0, 1]$ and $NP$ is the population size.
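For illustration, a minimal NumPy sketch of this initialization (the names `init_population`, `L`, and `U` are ours, not from the paper):

```python
import numpy as np

rng = np.random.default_rng()

def init_population(NP, L, U):
    """Sample NP solutions uniformly from the box [L, U], as in Eq. (1)."""
    L, U = np.asarray(L, dtype=float), np.asarray(U, dtype=float)
    return L + rng.random((NP, L.size)) * (U - L)
```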

According to the implementation of CS shown in the literature [2], CS iteratively uses two random walks, Lévy flights random walk (LFRW) and biased random walk (BRW), to search for new solutions.

LFRW is a random walk whose step size is drawn from a Lévy distribution. At generation $t$ ($t \geq 0$), LFRW can be formulated as follows:

$$x_i^{t+1} = x_i^t + \alpha \oplus \text{Lévy}(\beta), \quad (2)$$

where $\alpha$ is a step size related to the scales of the problem and ⊕ means entry-wise multiplication. Lévy$(\beta)$ is drawn from a Lévy distribution for large steps:

$$\text{Lévy}(\beta) \sim u = t^{-1-\beta}, \quad 0 < \beta \leq 2. \quad (3)$$

In CS, LFRW is employed to search for new solutions around the best solution obtained so far and is implemented according to the following equation [2]:

$$x_i^{t+1} = x_i^t + \alpha_0 \frac{\phi \cdot u}{|v|^{1/\beta}} \left( x_i^t - x_{\text{best}}^t \right), \quad (4)$$

where $\alpha_0$ is a factor (generally, $\alpha_0 = 0.01$) and $x_{\text{best}}^t$ represents the best solution obtained so far, with

$$\phi = \left( \frac{\Gamma(1+\beta) \cdot \sin(\pi\beta/2)}{\Gamma((1+\beta)/2) \cdot \beta \cdot 2^{(\beta-1)/2}} \right)^{1/\beta}, \quad (5)$$

where $\beta$ is a constant suggested to be 1.5, $u$ and $v$ are random numbers drawn from a normal distribution with mean 0 and standard deviation 1, and $\Gamma$ is a gamma function.
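Equations (4) and (5) amount to Mantegna's algorithm for sampling Lévy-stable step sizes; below is a sketch under that reading, with illustrative names:

```python
import numpy as np
from math import gamma, pi, sin

rng = np.random.default_rng()
BETA = 1.5  # the constant beta suggested in [2]
PHI = (gamma(1 + BETA) * sin(pi * BETA / 2)
       / (gamma((1 + BETA) / 2) * BETA * 2 ** ((BETA - 1) / 2))) ** (1 / BETA)  # Eq. (5)

def lfrw(x, x_best, alpha0=0.01):
    """Levy flights random walk around the best solution, Eq. (4)."""
    u = rng.normal(0.0, 1.0, size=x.shape)
    v = rng.normal(0.0, 1.0, size=x.shape)
    step = PHI * u / np.abs(v) ** (1 / BETA)  # Levy(beta)-distributed step
    return x + alpha0 * step * (x - x_best)
```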

BRW is used to discover new solutions far enough away from the current best solution by far field randomization [1]. First, a trial solution is built by mutation, with the current solution as the base vector and two randomly selected solutions as perturbed vectors. Second, a new solution is generated by a crossover operator from the current and trial solutions. BRW can be formulated as follows:

$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^t + r \cdot \left( x_{r_1,j}^t - x_{r_2,j}^t \right), & \text{if } rand_j > p_a, \\ x_{i,j}^t, & \text{otherwise}, \end{cases} \quad (6)$$

where the random indexes $r_1$ and $r_2$ point to the $r_1$th and $r_2$th solutions in the population, respectively, $j$ denotes the $j$th dimension of the solution, $r$ and $rand_j$ are random numbers on the range $[0, 1]$, and $p_a$ is a fraction probability.
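A sketch of BRW in the same vein, following the widely circulated reference implementation in which a dimension is perturbed when a uniform draw exceeds $p_a$ (the function name and vectorized layout are ours):

```python
import numpy as np

rng = np.random.default_rng()

def brw(pop, pa=0.25):
    """Biased random walk, Eq. (6)."""
    NP, D = pop.shape
    r1, r2 = rng.permutation(NP), rng.permutation(NP)
    r = rng.random((NP, 1))                  # step scale drawn on [0, 1)
    mask = rng.random((NP, D)) > pa          # dimensions chosen for crossover
    trial = pop + r * (pop[r1] - pop[r2])    # mutation: base + difference vector
    return np.where(mask, trial, pop)        # crossover of trial and current
```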

After each random walk, CS selects the better solution between the newly generated and the current solution according to fitness, using the greedy strategy. At the end of each iteration, the best solution is updated.
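The greedy selection can be sketched as follows, assuming minimization and array-based bookkeeping (the function name is ours):

```python
import numpy as np

def greedy_select(pop, fit, new_pop, new_fit):
    """Keep whichever of the current and new solutions has the better (lower) fitness."""
    improved = new_fit < fit
    pop = np.where(improved[:, None], new_pop, pop)
    fit = np.where(improved, new_fit, fit)
    return pop, fit
```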

2.2. Variants of CS

Although CS was developed only recently, it has already been studied extensively.

Some studies attempt to combine CS with other optimization techniques. Wang et al. [3] and Ghodrati and Lotfi [4] each proposed hybrids of CS with particle swarm optimization. Wang et al. [5] applied differential evolution to optimize the selection of cuckoos in the nest-updating step of the CS model. Babukartik and Dhavachelvan [6] proposed a hybrid algorithm combining ant colony optimization and CS. Srivastava et al. [7] combined the CS algorithm's strength of converging to a solution in minimal time with a tabu mechanism for backtracking from local optima via Lévy flights. Liu and Fu [8] applied the local search mechanism of the frog leaping algorithm to enhance the local search ability of cuckoo search. Other techniques, such as the orthogonal learning strategy [9], the cooperative coevolutionary (CC) framework [10–12], and teaching-learning-based optimization [13], have also been hybridized to enhance the search ability of cuckoo search.

Some variants have paid attention to improving the search ability of LFRW and BRW. Walton et al. [14] modified the step size of Lévy flights to decrease as the number of generations increases, in order to increase the convergence rate. Ljouad et al. [15] modified the Lévy flights model with an adaptive step size based on the number of generations. Valian et al. [16], Wang and Zhou [17], and Mohapatra et al. [18] each proposed adaptive step sizes of Lévy flights according to different equations with maximal and minimal step sizes. Wang et al. [19] and Huang et al. [20] each used a chaotic sequence to change the step size of Lévy flights. Jia et al. [21] proposed a variable step length of Lévy flights and a method for the discovery probability. Wang et al. [22, 23] used a random sequence and a chaotic sequence, respectively, as the factor instead of the constant 0.01 in Lévy flights. Coelho et al. [24] integrated the differential operator into Lévy flights to search for new solutions. Mlakar et al. [25] proposed a hybrid algorithm using explicit control of exploration search strategies within the CS algorithm. Ding et al. [26] proposed heterogeneous search strategies based on the quantum mechanism. Wang et al. [27] employed a probabilistic mutation to enhance Lévy flights. Wang and Zhong [28] added a crossover-like operator to the search scheme of Lévy flights using a one-position inheritance mechanism. Inspired by social learning and cognitive learning, Li and Yin [29] added these two learning parts into Lévy flights and BRW. Wang et al. [30, 31] utilized orthogonal crossover and dimension-by-dimension improvement, respectively, to enhance the search ability of BRW. Li and Yin [32] used two new mutation operators, based on the rand and best individuals among the entire population, to enhance the search ability of BRW.

Other versions have focused on combinatorial and multiobjective problems. Yang and Deb [33], Hanoun et al. [34], and Chandrasekaran and Simon [35] modified cuckoo search to solve multiobjective optimization problems. Ouyang et al. [36] and Ouaarab et al. [37] proposed improved CS variants to solve the travelling salesman problem. Zhou et al. [38] applied an improved CS to the planar graph coloring problem. Marichelvam et al. [39] and Dasgupta and Das [40] presented discrete versions for the flow shop scheduling problem. Teymourian et al. [41] applied CS to the capacitated vehicle routing problem.

3. HFCS

According to the implementation of CS [2], the factor, typically 0.01, is used to keep Lévy flights from being too aggressive and to keep solutions from jumping outside the search space. In this case, a small step size results from this small factor. Obviously, this benefits the solutions near the best one, but it is less helpful to the solutions far away from the best one, resulting in slow convergence. Wang et al. [22, 23] utilized a random sequence and a chaotic sequence instead of the constant factor and demonstrated improvements in convergence and solution quality. However, a factor that is constant, or drawn from a random or chaotic sequence, gives every dimension of a solution the same scale. This can mean that dimensions near the corresponding dimensions of the best solution receive too large a scale when the random or chaotic sequence is used, while dimensions far from the corresponding dimensions of the best solution receive too small a scale under the constant factor. To remedy this problem, a hybrid factor based cuckoo search, called HFCS, is proposed and presented in Algorithm 1.

Algorithm 1: The framework of HFCS.

t = 0;
Initialize NP solutions using Eq. (1);
Evaluate the fitness of each solution;
FES = NP;
Record the best solution x_best according to the fitness;
WHILE (FES < MaxFES)
 t = t + 1;
 Sample the varied factor (a random or a chaotic value) from the sequence;
 FOR (i from 1 to NP)
  FOR (j from 1 to D)
   IF the j-th dimension of the i-th solution is closer to the j-th dimension of x_best
    Use the constant factor 0.01 in Eq. (4);
   ELSE
    Use the varied factor in Eq. (4);
   ENDIF
   Generate the j-th dimension of the new solution with Eq. (4);
  ENDFOR
  Evaluate the new solution and select the better one from the new and the current solution;
  FES = FES + 1;
 ENDFOR
 FOR (i from 1 to NP)
  Generate a new solution with Eq. (6);
  Evaluate it and select the better one from the new and the current solution;
  FES = FES + 1;
 ENDFOR
 Update the best solution;
ENDWHILE

It is worth pointing out that the key part of HFCS is selecting the dimensions where the constant factor or the varied factor is used. Herein, the dimensional distance and the mean dimensional distance, similar to [42], are used and shown in (7) and (8):

$$dis_{i,j} = \left| x_{i,j} - x_{\text{best},j} \right|, \quad (7)$$

where $dis_{i,j}$ denotes the $j$th dimensional distance from the $i$th solution to the best one, and

$$mdis_i = \frac{1}{D} \sum_{j=1}^{D} dis_{i,j}, \quad (8)$$

where $mdis_i$ is the average dimensional distance of the $i$th solution.

For each dimension of each solution, the $j$th dimension is regarded as closer to the corresponding dimension of the best solution, called a closer dimension, when the $j$th dimensional distance is shorter than the mean dimensional distance; otherwise, the $j$th dimension is called a farther one. For closer dimensions, the constant factor 0.01 is used in (4), while the random sequence or the chaotic sequence is employed in (4) for farther ones.
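Combining (7) and (8), the per-dimension factor selection can be sketched as follows; this is a minimal NumPy illustration, and the names `hybrid_factor` and `varied` (the value drawn from the random or chaotic sequence at the current generation) are ours:

```python
import numpy as np

def hybrid_factor(x, x_best, varied, const=0.01):
    """Constant factor for closer dimensions, varied factor for farther ones."""
    dis = np.abs(x - x_best)   # dimensional distances, Eq. (7)
    mdis = dis.mean()          # mean dimensional distance, Eq. (8)
    return np.where(dis < mdis, const, varied)
```

The resulting vector replaces the scalar factor $\alpha_0$ in (4), so each dimension of a solution moves at its own scale.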

4. Experiments and Results

In this section, HFCS is tested on 20 benchmark functions [43]. These 20 benchmark functions include 2 unimodal functions, 8 multimodal functions, and 10 rotated and/or shifted functions. More details about the 20 functions can be found in [43, 44].

HFCS has the same parameters as CS, and we use the same settings unless a change is mentioned. The parameter $p_a$ is 0.25. Each algorithm is run 25 times for each function with dimension $D = 10$, 30, and 50, respectively. The population size of each algorithm is the same when $D = 10$ and $D = 30$, while it is 30 in the case of $D = 50$. A fixed maximum number of function evaluations is used as the stopping criterion. Note that HFCS can hybridize the constant factor with either the random sequence or the chaotic sequence, resulting in two algorithms: HFCS combining the constant factor and the random sequence generated as in [22], termed rHFCS, and HFCS combining the constant factor and the chaotic sequence generated as in [23], called cHFCS.

Error and convergence speed are employed to analyze HFCS. Error, the function fitness error of the solution $x$ obtained by an algorithm, is defined as $\text{Error} = f(x) - f(x^*)$, where $f(x^*)$ is the known global optimum of the function. Moreover, the average and standard deviation of the best error values, presented as "mean ± std," are used in the tables. Additionally, the Wilcoxon signed-rank test at the 5% significance level is used to show the differences in Error between two algorithms. The "+" symbol shows that HFCS outperforms the compared algorithm at the 5% significance level, the "−" symbol shows that the compared algorithm does better than HFCS, and the "=" symbol means that HFCS is equal to the compared algorithm. We summarize the total number of statistically significant cases at the bottom of each table.
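As an illustration of this reporting (not the authors' scripts), a sketch that computes the Error statistics and applies SciPy's signed-rank test to paired errors:

```python
import numpy as np
from scipy.stats import wilcoxon

def error_stats(best_f, f_star):
    """Mean and standard deviation of Error = f(x) - f(x*) over the runs."""
    err = np.asarray(best_f) - f_star
    return err.mean(), err.std()

# Paired errors of two algorithms on the same problems (made-up numbers)
err_a = np.array([1e-8, 3e-7, 2e-6, 5e-9, 1e-7])
err_b = np.array([2e-6, 4e-5, 3e-6, 1e-7, 2e-6])
stat, p = wilcoxon(err_a, err_b)
print('significant at the 5% level' if p < 0.05 else 'no significant difference')
```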

Convergence speed is measured by the number of function evaluations (FES) spent until the algorithm reaches the given Error, which is $10^{-6}$ or $10^{-2}$ as suggested in [43], within the maximum function evaluations. The average and standard deviation of FES and the number of successful runs (SR) reaching the given Error over the 25 runs, presented as "mean ± std (SR)," are used in Table 3. Also, convergence speed is shown graphically by convergence graphs, where the mean Error of the best solution over the iteration process, averaged across all runs, is plotted.
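A sketch of the FES bookkeeping behind this measure (the layout of `error_history` is an assumption):

```python
def fes_to_target(error_history, target=1e-6):
    """Return the first FES at which Error <= target, or None on failure.

    error_history[k] is the best Error seen after k + 1 function evaluations.
    """
    for fes, err in enumerate(error_history, start=1):
        if err <= target:
            return fes
    return None
```

A run counts toward SR exactly when this function returns a value within the maximum function evaluations.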

4.1. Performance of HFCS

To show the effect of the proposed algorithm, Table 1 lists the Error obtained by CS, rHFCS, and cHFCS, and Table 2 lists the results of the multiple-problem Wilcoxon's test, conducted as in [45, 46].

It can be seen from Table 1 that the hybrid factor strategy overall improves solution quality in terms of Error. On the 2 unimodal functions, rHFCS and cHFCS produce more accurate solutions than CS. On the 8 multimodal functions, rHFCS obtains solutions with higher accuracy, and cHFCS does likewise. On the 10 rotated and/or shifted functions, rHFCS and cHFCS both greatly improve the accuracy of solutions except on two of them. According to the summary of "+," "=," and "−," rHFCS wins against CS on 16 out of 20 functions, ties CS on 2, and loses to CS on 2. cHFCS is superior to CS on 16 out of 20 functions, equal to CS on 2, and inferior to CS on 2. In addition, from the results in Table 2, rHFCS and cHFCS both obtain an $R^+$ value apparently higher than the $R^-$ value. This suggests that rHFCS and cHFCS clearly outperform CS.

Moreover, Table 3 lists the function evaluations (FES) required to reach the given Error within the maximum function evaluations. It can be observed from Table 3 that rHFCS and cHFCS overall converge more quickly to the given Error. For instance, on three functions CS, rHFCS, and cHFCS share the same stable convergence to the given Error in terms of SR; however, rHFCS and cHFCS converge more quickly according to FES. On another three functions, rHFCS and cHFCS show better stability and quicker convergence. On three further functions, rHFCS and cHFCS still converge more quickly than CS, although the two algorithms do not attain stable convergence as judged by SR. On one function, CS stably converges to the given Error, yet rHFCS and cHFCS still obtain quicker convergence.

To further show the convergence performance, the convergence curves obtained by CS, rHFCS, and cHFCS on a subset of the functions are plotted in Figure 1.

It can be observed from Figure 1 that rHFCS and cHFCS achieve overall better convergence speed in terms of the curves. On the functions solved better by rHFCS and cHFCS, the two algorithms also converge more quickly than CS; see Figures 1(a), 1(c), 1(d), 1(e), and 1(f). On one function, cHFCS attains the same solution accuracy as CS but converges more quickly at the beginning of the iteration process. Interestingly, CS does better than rHFCS and cHFCS when converging to the given Error, as shown in Table 3; however, because of the lack of convergence stability, rHFCS and cHFCS show better convergence curves; see Figure 1(f).

According to the Error, FES, and convergence curves, HFCS overall improves both solution quality and convergence speed. This is because the different factors provide different step size information, which improves search ability.

4.2. Scalability of HFCS

The scalability study investigates the performance of HFCS as the dimensionality of the problem changes. The experiments are carried out on the 20 functions at 10 and 50 dimensions, since the functions are defined for up to 50 dimensions [44]. The results are tabulated in Tables 4 and 5.

In the case of $D = 10$, according to Error, HFCS exhibits a great improvement on most functions. For example, rHFCS produces more accurate solutions except on three functions, while cHFCS gains solutions with higher accuracy except on two. Furthermore, in terms of the totals of "+," "=," and "−," rHFCS performs better than CS on 16 out of 20 functions, is equivalent to CS on 3, and performs worse than CS on 1. cHFCS wins against CS on 17 out of 20 functions, ties CS on 2, and loses to CS on 1. Additionally, rHFCS and cHFCS obtain a markedly higher $R^+$ value than $R^-$ value, as shown by the results listed in Table 5.

When $D = 50$, HFCS still produces solutions of higher quality on most functions. For instance, rHFCS achieves better solutions except on four functions, while cHFCS does well except on four. rHFCS outperforms, ties, and loses to CS on 14, 4, and 2 out of 20 functions, respectively. cHFCS is superior to CS on 15 out of 20 functions, equal to CS on 3, and inferior to CS on 2. Moreover, according to the results reported in Table 5, rHFCS and cHFCS also obtain a remarkably higher $R^+$ value than $R^-$ value.

In summary, these results suggest that the improvement of HFCS remains stable as the dimensionality of the problems increases.

4.3. Comparison with VCS and CCS

A comparison is made between HFCS and VCS and CCS, since the random sequence and the chaotic sequence are used in these two algorithms, respectively. Table 6 lists the Error obtained by VCS and rHFCS, while Table 7 shows the Error achieved by cHFCS and CCS, where CCS employs the chaotic sequence as the factor in (4) and the parameter $p_a$ is 0.25, termed fCCS. Table 8 reports the results of the multiple-problem Wilcoxon's test between HFCS and VCS/fCCS over all functions.

It can be observed from Table 6 that the hybrid of the constant factor and the random sequence overall performs better than the random sequence alone. On the unimodal functions, rHFCS produces solutions with higher accuracy than VCS. Moreover, on the multimodal functions, rHFCS not only keeps the same Error as VCS on three functions but also enhances the accuracy except on two. Additionally, on the rotated and/or shifted functions, the same conclusion as for the multimodal functions can be drawn for rHFCS. In all, rHFCS outperforms VCS on 7 out of 20 functions, ties VCS on 11, and loses to VCS on 2. Additionally, rHFCS obtains a higher $R^+$ value than $R^-$ value, and rHFCS and VCS differ significantly at both significance levels. This suggests that rHFCS is overall better than VCS.

Table 7 shows that the hybrid of the constant factor and the chaotic sequence overall performs better than the chaotic sequence alone. Compared with fCCS, cHFCS works better on the unimodal functions. On the multimodal functions, cHFCS matches the performance of fCCS on three functions and increases the solution accuracy on another three. On the rotated and/or shifted functions, cHFCS performs better on five functions. In summary, cHFCS is superior to fCCS on 6 out of 20 functions, equal to fCCS on 12, and inferior to fCCS on 2. In addition, cHFCS obtains a higher $R^+$ value than $R^-$ value, although cHFCS and fCCS do not differ significantly at the two significance levels. This reveals that cHFCS is overall better.

4.4. Effect of Integration into Improved Algorithms

In this section, we investigate the performance of the hybrid factor when integrated into improved algorithms, to analyze its suitability. Considering the implementation of the improved algorithm, we choose the one-position inheritance cuckoo search algorithm, called OPICS [28], for integration, resulting in rHFOPICS and cHFOPICS. rHFOPICS denotes OPICS enhanced with the hybrid factor of the constant and the random sequence, while cHFOPICS denotes OPICS combined with the hybrid factor of the constant and the chaotic sequence. Table 9 lists the Error obtained by rHFOPICS, cHFOPICS, and OPICS, where the better Error between rHFOPICS and OPICS is highlighted in boldface and the better Error between cHFOPICS and OPICS is marked with an asterisk. Table 10 lists the results of the multiple-problem Wilcoxon's test between OPICS and HFOPICS over all functions.

It can be seen from Table 9 that the hybrid factor enables OPICS to enhance solution accuracy on most functions. In terms of Error, the results are maintained when rHFOPICS and cHFOPICS are applied to five of the functions. On the unimodal functions, rHFOPICS and cHFOPICS produce more accurate solutions on one of them. On the multimodal functions, rHFOPICS achieves solutions with higher accuracy except on one function, while cHFOPICS does the same except on two. On the rotated and/or shifted functions, rHFOPICS and cHFOPICS perform better than OPICS except on one. Moreover, Table 10 shows that rHFOPICS obtains a higher $R^+$ value than $R^-$ value, and rHFOPICS and OPICS differ significantly at both significance levels. This suggests that rHFOPICS is significantly better than OPICS. We can also see from Table 10 that cHFOPICS achieves a higher $R^+$ value than $R^-$ value, and OPICS is significantly inferior to cHFOPICS at one of the significance levels.

5. Conclusion

In this paper, we presented a hybrid factor strategy for CS, called HFCS. The hybrid factor strategy combines the constant factor with the random sequence or the chaotic sequence. HFCS employs the dimensional distance to select the dimensions that use the constant factor or the random/chaotic sequence. HFCS was tested on a suite of 20 benchmark functions. The results show that the hybrid factor strategy can effectively improve the performance of CS on most functions, in terms of both solution quality and convergence speed. The results also show that the hybrid factor strategy remains effective and stable as the dimension of the problems increases. In addition, the hybrid factor is easy to integrate into other improved algorithms.

There are several interesting directions for future work. First, the mean dimensional distance is currently used to select dimensions; future work will examine different distance measures, for example, the median dimensional distance. Secondly, we plan to investigate the hybrid factor strategy for other improved algorithms. Last but not least, we plan to apply HFCS to some real-world optimization problems.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

This work was supported by the Natural Science Foundation of Fujian Province of China under Grant no. 2016J01280 and the Project of Science and Technology of Fujian Department of Education of China under Grant no. 14156.