Abstract

To improve the convergence speed and optimization accuracy of the cuckoo search (CS) algorithm on function optimization problems, an improved cuckoo search algorithm based on repeat-cycle asymptotic self-learning and self-evolving disturbance (RC-SSCS) is proposed. A disturbance operation, built from a dedicated disturbance factor, is added to the algorithm so that the neighborhood of each bird's nest location is searched more carefully and thoroughly. To select a reasonable number of repeat-cycle disturbances, the choice of the disturbance count is studied further. Finally, six typical test functions are used in simulation experiments, and the proposed algorithms are compared with two typical swarm intelligence algorithms, the particle swarm optimization (PSO) algorithm and the artificial bee colony (ABC) algorithm. The results show that the improved cuckoo search algorithm achieves better convergence speed and optimization accuracy.

1. Introduction

The cuckoo search (CS) algorithm is a biologically inspired heuristic algorithm proposed by Yang and Deb in 2009. It simulates the nest-seeking and egg-laying behavior of cuckoos and incorporates the Lévy flight mechanism, which allows it to find optimal solutions quickly and efficiently [1, 2]. Studies have shown that the CS algorithm outperforms other swarm intelligence algorithms, such as the genetic algorithm (GA), the particle swarm optimization (PSO) algorithm, and the artificial bee colony (ABC) algorithm, in convergence rate and optimization accuracy [3]. Because the algorithm has few parameters and is simple and easy to implement, it has been applied successfully to a variety of engineering optimization problems, and it therefore has high research value [4, 5].

The CS algorithm is a new type of bionic algorithm, and many scholars have studied it and proposed corresponding improvement strategies. The work in [6] gains insight into the search mechanisms of the CS algorithm, analyzes why it is efficient, and discusses the essence of the algorithm and its link to self-organizing systems. In [7], in order to increase CS efficiency, several parameters of the CS algorithm, including the Lévy distribution factor β and the discovery probability p_a, are examined, and the efficiency of the CS algorithm is improved by seeking optimum values of these parameters. In [8], the algorithmic structure and behavior of CS and the Lévy distribution are studied in detail, and statistical comparisons with widely used optimization algorithms (i.e., DE and GA) verify that CS has superior problem-solving ability. For the purpose of enhancing the search ability of the CS algorithm, an improved robust approach based on harmony search (HS) has been put forward, in which a mutation operator is added to the cuckoo updating process to speed up convergence [9]. In [10], CS is extended and improved for combinatorial problems by reconstructing its population and introducing a new category of cuckoos. A cuckoo optimization algorithm based on an exchange operator and chaotic disturbance has been proposed, which introduces the exchange operator of the particle swarm optimization algorithm to improve the convergence rate and optimization accuracy [11]. A cooperative coevolutionary cuckoo search algorithm has been put forward by applying the cooperative coevolution framework, which divides the solution vectors of the population into several subvectors and constructs the corresponding subswarms [12]. A cuckoo search optimization algorithm based on the Gaussian distribution has been proposed by adding Gaussian sampling to the CS algorithm to improve its convergence rate [13]. A self-adaptive cuckoo search algorithm has been proposed that uses a self-adaptive parameter control strategy to adjust the step size of CS and enhance its search ability [14]. Because the CS algorithm searches highly randomly according to the Lévy flight mechanism and exhibits strong jumps, it easily leaps from one region to another, so the search around each bird's nest location is neither careful nor thorough and the information near the nest locations cannot be fully used. Consequently, the CS algorithm suffers from weak local search ability, slow convergence, and low optimization accuracy.

To remedy this defect, an improved cuckoo search algorithm based on repeat-cycle asymptotic self-learning and self-evolving disturbance (RC-SSCS) is proposed. To obtain a better disturbance effect, the learning and updating strategy of the worst frog in the shuffled frog leaping algorithm (SFLA) and part of the differential evolution (DE) idea are introduced into the construction of the disturbance factor, so that every disturbance carries the effect of the nest's self-learning and self-evolution. Finally, six typical test functions are chosen for simulation experiments, and the results show that the improved cuckoo search algorithm has better convergence rate and optimization accuracy. The paper is organized as follows. Section 2 introduces the cuckoo search algorithm. The improved cuckoo search algorithm based on repeat-cycle asymptotic self-learning and self-evolving disturbance is presented in Section 3. Section 4 describes the simulation experiments and analyzes the results in detail. Finally, the conclusion is given in the last section.

2. Cuckoo Search Algorithm

The cuckoo search (CS) algorithm was developed by Xin-She Yang, who observed a remarkable natural phenomenon and turned it into an artificial procedure; it is a new type of heuristic search algorithm [15, 16]. The algorithm is mainly based on two aspects: the cuckoo's parasitic reproduction mechanism and the Lévy flight search principle. In nature, cuckoos seek nest locations in a random or quasirandom manner [10]. Most cuckoos lay their eggs in other birds' nests and let the host raise their young. If a host finds that an egg is not its own, it will either throw the alien egg out of the nest or abandon the nest and build a new one elsewhere. However, some cuckoos choose nests in which the color and shape of the host's eggs are similar to their own, which wins the host's acceptance, reduces the possibility of their eggs being abandoned, and increases the cuckoos' reproduction rate.

In general, each cuckoo lays only one egg, and each egg represents one solution (cuckoo). The purpose is to let new and potentially better solutions replace the less good solutions (cuckoos). To study the cuckoo search algorithm conveniently, the simplest case is adopted, namely, one egg per nest; in this case there is no difference between an egg, a nest, and a cuckoo, that is, each nest corresponds to one cuckoo egg. To describe the cuckoo search algorithm simply, Yang and Deb use the following three idealized rules to construct the algorithm [17].
(1) Each cuckoo lays one egg at a time and places it in a randomly chosen nest.
(2) The best nests carry over to the next generation.
(3) The number of available host nests is fixed, and the probability that a host discovers an alien egg is p_a. In this case, the host bird either throws the alien egg away or abandons its nest and builds a new nest in a new location.

Under the above conditions, the specific steps of the CS algorithm are described as follows.

(1) Initialization Setting. Randomly generate n bird's nest locations x_i (i = 1, 2, ..., n) and evaluate them with the test function. Through this testing, the best nest location is chosen and carried over to the next generation.

(2) Searching of Bird's Nest Locations. Equation (1) is used to update the locations and search for the nest locations of the next generation, giving a new set of nest locations, which are evaluated with the test function again. After comparison with the nest locations of the previous generation, the better nest locations are kept and passed to the next step:

x_i^(t+1) = x_i^t + α ⊕ Lévy(λ), i = 1, 2, ..., n. (1)

(3) Selection of Bird's Nest Locations. The probability p_a that a host discovers an alien egg is compared with a random number r drawn from the uniform distribution on [0, 1]. If r > p_a, the corresponding nest location x_i is changed randomly; otherwise it is left unchanged.

The changed nest locations are then evaluated with the test function and compared with the optimal positions of the previous generation, and the best nest locations are recorded. Finally, the optimal nest position x_b is chosen.

(4) Accuracy or Iteration Judgment. Calculate f(x_b) and judge whether it reaches the target accuracy or satisfies the termination condition. If the requirement is met, x_b is the global optimal solution; if not, x_b is kept for the next generation, the algorithm returns to step (2), and the next iteration of searching and updating begins.

According to the four steps described above, the cuckoo search algorithm not only uses the Lévy flight search (global search) but also introduces an elite-preservation strategy (local search), so the algorithm possesses both global and local search ability. The purpose of step (3) is to increase the diversity of the solutions so that the algorithm avoids being trapped in a local optimum and can reach the global optimum.

The search path of the CS algorithm differs from that of other swarm algorithms. The CS algorithm uses Lévy flights, which are strongly random. Broadly speaking, a Lévy flight is a random walk whose step lengths obey a Lévy distribution and whose travel directions are uniformly distributed. The step-size vector of the CS algorithm is generated by Mantegna's rule, which reproduces the characteristics of the Lévy distribution. In Mantegna's rule, the step size s is computed as

s = u / |v|^(1/β), (2)

where u and v obey normal distributions, that is,

u ~ N(0, σ_u^2), v ~ N(0, σ_v^2), (3)

where

σ_u = {Γ(1 + β) sin(πβ/2) / [Γ((1 + β)/2) β 2^((β−1)/2)]}^(1/β), σ_v = 1. (4)

However, the direction is chosen from a uniform distribution. The search pattern of the CS algorithm is the Lévy flight; for instance, the ith cuckoo of the tth generation generates the next-generation solution x_i^(t+1) by

x_i^(t+1) = x_i^t + α ⊕ Lévy(λ), (5)

where ⊕ denotes point-to-point multiplication and the step length Lévy(λ) obeys the Lévy distribution, which can be expressed as

Lévy(λ) ~ u = t^(−λ), 1 < λ ≤ 3. (6)

Here, Mantegna's rule is used to calculate the step size. In (5), α is the step-size control quantity, mainly used to control the search direction and step length; it is set according to L, the size of the search space of the optimization problem. Because its density is a power function, the Lévy distribution has infinite variance and its increments obey a heavy-tailed distribution [18, 19]. A Lévy flight resembles Brownian motion interspersed with long-distance flights; it can also be described as consisting of frequent short jumps and occasional long jumps. The long jumps help the CS algorithm escape from local optima.

Thus, some new solutions are generated by Lévy flights and random walks around the current optimal solution, which speeds up the local search. In contrast, another part of the new solutions is generated far from the current optimum, at locations deviating from it considerably; the main purpose of these solutions is to ensure that the system does not fall into a local optimum.

A large number of simulation experiments show that a nest population of n = 15–40 with a detection probability of p_a = 0.25 is sufficient for many optimization problems [16]. Once the population size is fixed, the discovery probability p_a is an important parameter for balancing global and local search and controlling the elite selection. Therefore, the CS algorithm has few parameters, an excellent search path, strong global optimization ability, and so forth [20].
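To summarize Section 2, a minimal Python sketch of the basic CS loop is given below. It follows the commonly used Yang-Deb formulation; the objective function, bounds, and default parameter values (n, p_a, α, β) are illustrative assumptions rather than the experimental settings of Section 4.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(beta, shape, rng):
    # Mantegna's rule, equations (2)-(4): s = u / |v|^(1/beta).
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, shape)
    v = rng.normal(0.0, 1.0, shape)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, lower, upper, n=25, pa=0.25, alpha=0.01,
                  beta=1.5, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lower)
    nests = rng.uniform(lower, upper, (n, dim))          # step (1): random nests
    fit = np.apply_along_axis(f, 1, nests)
    for _ in range(iters):                               # step (4): iterate
        best = nests[np.argmin(fit)]
        # Step (2): Levy-flight search, steps scaled toward the current best nest.
        trial = np.clip(nests + alpha * levy_step(beta, (n, dim), rng)
                        * (nests - best), lower, upper)
        tfit = np.apply_along_axis(f, 1, trial)
        keep = tfit < fit                                # elite (greedy) selection
        nests[keep], fit[keep] = trial[keep], tfit[keep]
        # Step (3): components with r > pa are reset by a random walk between
        # two randomly chosen nests (the convention described in the text).
        mask = rng.random((n, dim)) > pa
        diff = nests[rng.permutation(n)] - nests[rng.permutation(n)]
        trial = np.clip(nests + rng.random((n, dim)) * mask * diff, lower, upper)
        tfit = np.apply_along_axis(f, 1, trial)
        keep = tfit < fit
        nests[keep], fit[keep] = trial[keep], tfit[keep]
    i = np.argmin(fit)
    return nests[i], fit[i]

# Example usage: minimize the Sphere function in 10 dimensions.
if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    best_x, best_f = cuckoo_search(sphere, np.full(10, -100.0), np.full(10, 100.0))
    print(best_f)
```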

3. The Improved Cuckoo Search Algorithm

At each step, the length and direction of a cuckoo's search path change highly randomly under the Lévy flight mechanism, so cuckoos easily jump from one region to another. This is beneficial for global search in the early stage of optimization and gives the CS algorithm strong global search ability [16, 21]. However, precisely because the CS algorithm jumps so strongly during the search, the local search around each nest location is neither careful nor thorough. The local optimization information near the nest locations is therefore not fully used, which leads to weak local search ability, low optimization accuracy in the later stage, and slow convergence. To improve the convergence speed and optimization accuracy of the CS algorithm, a self-learning and self-evolving disturbance operation is added to the algorithm, and a further refinement of this disturbance is also discussed.

3.1. Cuckoo Search Algorithm Based on Self-Learning and Self-Evolving Disturbance

To make the algorithm search more carefully and thoroughly near the nest locations, the set of preponderant nest locations obtained after each iteration of the CS algorithm is not passed directly to the next iteration; instead, a disturbance operation is applied to it to search the neighborhood of these locations further.

General disturbances, such as Gaussian perturbation and random perturbation, are highly random and blind. To obtain a better disturbance effect, the learning and updating strategy of the worst frog in the shuffled frog leaping algorithm (SFLA) and part of the differential evolution (DE) idea are introduced into the construction of the disturbance factor. The learning and updating strategy of the worst frog increases each nest's ability to learn from the optimal nest [22]; that is, it accelerates the movement of the other solutions toward the best solution and improves the convergence rate. The differential evolution idea increases the diversity of nest locations, so that every nest has the ability to evolve. Good learning and evolving abilities necessarily lead to high search ability.

Based on the above analysis, the disturbance factor consists of two parts: a learning factor derived from the SFLA learning and updating strategy of the worst frog, and an evolution factor based on part of the differential evolution (DE) idea. Thus, the whole disturbance factor gives every disturbance the effect of the nest's self-learning and self-evolving ability. The disturbance factor D_i is structured as

D_i = a ε_1 (x_b − x_i) + b ε_2 (x_r1 − x_r2), (8)

where x_i is the disturbed bird's nest location, x_b is the current best location, r1 and r2 are random integers from {1, 2, ..., n}, n is the size of the nest population, x_r1 and x_r2 are the nest locations corresponding to the random indices r1 and r2, ε_1 and ε_2 obey the Gaussian distribution, a is the learning scale, and b is the evolution scale.

To better control the disturbance range, a control quantity K of the disturbance scope is introduced to control the size of the search region around a nest. After being disturbed, the nest location is expressed as

x_i' = x_i + K ⊕ D_i, (9)

where x_i' is the nest location after the disturbance, K is the control quantity of the disturbance range, and ⊕ is point-to-point multiplication.

According to (8) and (9), after the searching and selection operations yield a nest location x_i, this nest is not passed directly to the next generation. Instead, a disturbance is applied that takes x_i as its basis and D_i as its disturbance factor, within a range controlled by K. The disturbance finally yields a new nest location x_i', which then goes into the next generation.
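As an illustration of (8) and (9), a minimal Python sketch of one self-learning and self-evolving disturbance over the whole population is given below. The function and variable names are illustrative, and the greedy acceptance of improved nests is an assumption consistent with Step 4 of Section 3.3.

```python
import numpy as np

def ss_disturb(nests, fit, f, K, a=0.5, b=0.5, rng=None):
    """One self-learning / self-evolving disturbance of every nest.

    nests : (n, dim) array of current nest locations
    fit   : (n,) array of their fitness values
    K     : disturbance-scope control quantity (scalar or (dim,) vector)
    a, b  : learning and evolution scales
    """
    rng = rng or np.random.default_rng()
    n, dim = nests.shape
    best = nests[np.argmin(fit)]
    r1, r2 = rng.integers(0, n, n), rng.integers(0, n, n)
    eps1 = rng.normal(0.0, 1.0, (n, dim))     # Gaussian random coefficients
    eps2 = rng.normal(0.0, 1.0, (n, dim))
    # Learning part (toward the best nest) + evolution part (DE-style difference), Eq. (8).
    D = a * eps1 * (best - nests) + b * eps2 * (nests[r1] - nests[r2])
    trial = nests + K * D                     # disturbed locations, Eq. (9)
    tfit = np.apply_along_axis(f, 1, trial)
    keep = tfit < fit                         # keep a disturbed nest only if it improves
    nests = np.where(keep[:, None], trial, nests)
    fit = np.where(keep, tfit, fit)
    return nests, fit
```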

In different stages, the nest's learning ability, evolving ability, and disturbance scope can be controlled by adjusting the values of a, b, and K. For example, for a given K, if a = 0 and b is a nonzero constant, then within this search scope the nest has only evolution ability and no learning ability; by the same token, if b = 0 and a is a nonzero constant, the algorithm has only learning ability and no evolving ability within the search scope.

The disturbance range controller K not only controls the size of the search scope but also affects a nest's learning and evolution abilities. When K is large, the coefficients in front of the learning factor and the evolution factor become larger, so the learning and evolution abilities become stronger; in this case the search scope of the algorithm is larger and the search ability stronger, but the search fineness is lower. By the same token, when K is small, the learning and evolution abilities are relatively weaker, the search scope is smaller, and the search ability becomes weaker, but the search fineness is much higher.

3.2. Cuckoo Search Algorithm Based on Repeat-Cycle Asymptotic Self-Learning and Self-Evolving Disturbance

An ideal disturbance should have high search ability in the early disturbances and high search accuracy in the later ones. In this way, a better nest location can be found quickly and then searched more carefully and thoroughly in its neighborhood. Based on this analysis, a repeat-cycle asymptotic disturbance method is proposed: a dynamic adjustment gradually shrinks the search scope from large to small over successive disturbances. In other words, based on the result of the previous disturbance, the disturbance scope is narrowed and the next disturbance is carried out; the repeat-cycle asymptotic disturbance search proceeds in this manner.

To obtain a better nest position quickly, K is given a larger value at the beginning of the disturbances. As the disturbances continue, the nests gradually improve and the adjustment of the nest locations becomes more and more subtle. In particular, when the fitness value responds sensitively to parameter changes, a very small control amount is needed to fine-tune the position near the optimum. K is adjusted according to

K_k = K_max − (K_max − K_min)(k − 1)/(M − 1), (10)

where K_min is the minimum value of the disturbance-scope control quantity, K_max is its maximum value, M is the total number of repeat-cycle disturbances, and k is the index of the kth disturbance.

During the cyclic disturbances, the disturbance-range control quantity K is governed by the disturbance index k and varies between K_min and K_max. When k = 1, K takes its maximum value K_max; as the disturbance index increases, K decreases little by little; finally, when k = M, K reaches its minimum value K_min.

In the early stage of the repeat-cycle disturbances, the nests' self-learning and self-evolving functions play the leading role, which quickly finds better nests. As the disturbance index increases, the search scope gradually shrinks and the fineness of the search is gradually enhanced; in the later stage of the repeat-cycle disturbances, the fine search plays the leading role.
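A short sketch of the repeat-cycle disturbance is given below, assuming the linear schedule for K in (10) and reusing the ss_disturb helper sketched in Section 3.1; any monotonically decreasing schedule between K_max and K_min would fit the description above equally well, and the default values are illustrative.

```python
import numpy as np

def repeat_cycle_disturb(nests, fit, f, K_min=0.25, K_max=1.5, M=10, rng=None):
    """Apply M asymptotic disturbance cycles, shrinking the scope from K_max to K_min."""
    rng = rng or np.random.default_rng()
    for k in range(1, M + 1):
        # Wide search in early cycles, fine search in later cycles, Eq. (10).
        K = K_max - (K_max - K_min) * (k - 1) / max(M - 1, 1)
        nests, fit = ss_disturb(nests, fit, f, K, rng=rng)   # sketch from Section 3.1
    return nests, fit
```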

3.3. Algorithm Procedure

The procedure of the improved cuckoo search algorithm (RC-SSCS) is shown in Figure 1. The specific steps of the RC-SSCS algorithm are described as follows.

Step 1 (initialization). Randomly generate n nest locations x_i (i = 1, 2, ..., n), select the best nest location, and carry it over to the next generation.

Step 2 (searching operation). Use the location update (1) to search for the nest positions of the next generation, obtain a new set of nests, and evaluate them. Then compare the results with the nest positions of the previous generation and keep the set of better positions.

Step 3 (selection operation). Generate a uniformly distributed random number r and compare it with the detection probability p_a; if r > p_a, the nest location x_i is changed randomly, otherwise it is left unchanged. Evaluate the changed nest positions, compare them with the locations from the previous step, and keep the better nest locations.

Step 4 (repeat-cycle disturbance operation). Apply a self-learning and self-evolving disturbance to each nest location, evaluate the disturbed nest locations, compare the results with the locations from the previous disturbance, and keep the better nest locations. After repeating the disturbance several times, obtain a set of best nest locations and choose the best location x_b from this set.

Step 5 (judgment operation). Calculate the fitness value of x_b and judge whether the termination condition is reached. If it is satisfied, x_b is the optimal solution; otherwise, return to Step 2 and start the next iteration.
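As an illustration, Steps 1-5 can be combined into a single loop, as sketched below in Python. The sketch reuses the levy_step helper from the Section 2 listing and the repeat_cycle_disturb helper from Section 3.2; all names and default parameter values are illustrative assumptions rather than the exact settings of Section 4.

```python
import numpy as np

def rc_sscs(f, lower, upper, n=25, pa=0.25, alpha=0.01, beta=1.5,
            K_min=0.25, K_max=1.5, M=10, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lower)
    nests = rng.uniform(lower, upper, (n, dim))                     # Step 1
    fit = np.apply_along_axis(f, 1, nests)
    for _ in range(iters):                                          # Step 5 budget
        best = nests[np.argmin(fit)]
        # Step 2: Levy-flight search (levy_step as in the Section 2 sketch).
        trial = np.clip(nests + alpha * levy_step(beta, (n, dim), rng)
                        * (nests - best), lower, upper)
        tfit = np.apply_along_axis(f, 1, trial)
        keep = tfit < fit
        nests[keep], fit[keep] = trial[keep], tfit[keep]
        # Step 3: components with r > pa are rebuilt by a random walk.
        mask = rng.random((n, dim)) > pa
        diff = nests[rng.permutation(n)] - nests[rng.permutation(n)]
        trial = np.clip(nests + rng.random((n, dim)) * mask * diff, lower, upper)
        tfit = np.apply_along_axis(f, 1, trial)
        keep = tfit < fit
        nests[keep], fit[keep] = trial[keep], tfit[keep]
        # Step 4: repeat-cycle self-learning / self-evolving disturbance.
        nests, fit = repeat_cycle_disturb(nests, fit, f, K_min, K_max, M, rng)
    i = np.argmin(fit)
    return nests[i], fit[i]
```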

4. Simulation Results and Analysis

To verify the performance of the improved CS algorithm, six typical continuous test functions are chosen for the simulation study, and the simulation results are compared with those of ABC, PSO, CS, and GCS. The six test functions are listed in Table 1, and their 3D surface plots are shown in Figures 2-7.

The Sphere function is a simple unimodal function. The Rosenbrock function is a nonseparable unimodal function whose global extremum lies in a steep, narrow valley; for most search algorithms it is difficult to determine the right search direction inside the valley. The Griewank function is a multimodal function with many local optima, and because of the correlation between its variables it is very hard to obtain the global optimum. The Rastrigin function is a typical multimodal function whose search domain contains a large number of local minima, which makes it difficult to reach the global optimum. The Michalewicz function also has many local extrema.
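Since the exact expressions in Table 1 are not reproduced in the text, the commonly used forms of the five benchmarks named above (for a D-dimensional input x) are listed below for reference; they may differ slightly from the precise formulations and search ranges used in the experiments.

```python
import numpy as np

def sphere(x):            # unimodal
    return float(np.sum(x ** 2))

def rosenbrock(x):        # unimodal, nonseparable, narrow curved valley
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def griewank(x):          # multimodal, many regularly spaced local optima
    i = np.arange(1, x.size + 1)
    return float(np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)

def rastrigin(x):         # multimodal, many local minima
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def michalewicz(x, m=10): # multimodal with steep local valleys
    i = np.arange(1, x.size + 1)
    return float(-np.sum(np.sin(x) * np.sin(i * x ** 2 / np.pi) ** (2 * m)))
```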

The experimental parameters are set as follows. For the particle swarm optimization (PSO) algorithm, the particle number, the learning factors c1 and c2, and the inertia weight ω are fixed. For the artificial bee colony (ABC) algorithm, the total colony size is fixed, and the numbers of following bees and leading bees are equal. For the cuckoo search (CS) algorithm and its three improved variants, the same nest population n, detection probability p_a, and step-length control parameter α are used. The cuckoo search algorithm based on Gaussian disturbance (GCS) and the cuckoo search algorithm based on self-learning and self-evolving disturbance (SSCS) adopt the same disturbance-scope control quantity K. For the cuckoo search algorithm based on repeat-cycle asymptotic self-learning and self-evolving disturbance (RC-SSCS), K varies in [0.25, 1.5], the learning scale a and the evolution scale b are fixed, and the number of repeat-cycle disturbances M is fixed. For all algorithms, the six test functions have the same dimension and the number of iterations is 500. Each algorithm is run 30 times independently.

The performance of each algorithm is evaluated through the best, average, and worst values over the 30 runs and through the convergence curves of each function. The convergence curves of the six functions are shown in Figures 8-13, and the numerical results of each algorithm are given in Table 2.

After 500 iterations and 30 independent runs, the convergence curves and numerical results of the six functions show that the RC-SSCS algorithm has the best convergence rate and optimization accuracy. Moreover, the convergence rate and optimization accuracy of the two algorithms proposed in this paper, SSCS and RC-SSCS, are clearly better than those of the original CS algorithm, the GCS algorithm, the ABC algorithm, and the PSO algorithm.

The six convergence curves show that the convergence rate is clearly improved for all six functions. For one function, the optimization accuracy achieved by the RC-SSCS algorithm after 20 iterations equals that achieved by the GCS and CS algorithms after 200 iterations. For another, the accuracy reached by the RC-SSCS algorithm after 100 iterations equals that reached by the original CS algorithm after 400 iterations. A third convergence curve shows that the accuracy achieved by the RC-SSCS algorithm after 25 iterations equals that achieved by the SSCS algorithm after 175 iterations and by the CS algorithm after 300 iterations. The convergence rates of the remaining functions also change markedly. These results show that the improved algorithm greatly increases the convergence rate.

The numerical results in Table 2 show that the improved algorithm raises the optimization accuracy on the six typical functions. Compared with the CS algorithm, the SSCS and RC-SSCS algorithms improve the best value of the unimodal function by 5 and 10 orders of magnitude, respectively, and the average value by 4 and 7 orders of magnitude, respectively. For the multimodal function with multiple local optima, compared with the original CS algorithm, the SSCS and RC-SSCS algorithms improve its best and average values by 5 and 9 orders of magnitude, respectively. The optimization accuracy of the other functions is also improved. This shows that the improved algorithm raises the optimization accuracy.

Based on the above analysis of the improved algorithm, within a given disturbance range the number of repeat-cycle disturbances affects the balance between search capability and optimization accuracy in the disturbance process. An appropriate number of disturbances gives a good balance between search ability and search accuracy and yields the best optimization effect. In the disturbance search, an algorithm with strong search ability but low search precision will not achieve an ideal search effect; similarly, very high precision combined with weak search ability will not be ideal either. To select a reasonable number of disturbances, we further study the relationship between the number of repeat-cycle disturbances and the convergence rate and optimization precision. The parameter settings are the same as described above, except for the number of iterations. Each function is tested with 5, 10, and 20 cyclic disturbances, respectively. The results of the six functions under the different disturbance counts are shown in Figures 14-19 and Table 3.

The convergence curves in Figures 14-19 show that the convergence rate of the six functions improves gradually as the number of cycles increases. However, a closer look at each curve reveals that the change in convergence rate when the number of loops increases from 5 to 10 is larger than the change when it increases from 10 to 20; that is, the first additional 5 cycles of the repeat-cycle disturbance produce a larger improvement than the later additional 10 cycles. The numerical results in Table 3 also show that the optimization accuracy is highest when the number of cyclic disturbances is 10: the accuracy of all six functions improves when the number of disturbances is raised from 5 to 10, but decreases rather than increases when it is raised from 10 to 20.

In conclusion, more disturbances do not necessarily produce better results. Within the same disturbance scope, once the number of disturbances reaches a certain level, further increases do not bring an obvious improvement in convergence speed, and the optimization accuracy even decreases. Considering the convergence rate, the optimization accuracy, and the optimization time together, it is better to set the number of cyclic disturbances to about 10.

5. Conclusion

To improve the convergence rate and optimization accuracy of the cuckoo search (CS) algorithm for function optimization problems, a cuckoo search algorithm based on repeat-cycle asymptotic self-learning and self-evolving disturbance (RC-SSCS) is proposed. Six typical test functions are chosen for simulation experiments, and the simulation results demonstrate the effectiveness of the proposed improved cuckoo search algorithm in terms of convergence rate and optimization accuracy.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is partially supported by the China Postdoctoral Science Foundation (Grant no. 20110491510), the Program for Liaoning Excellent Talents in University (Grant no. LR2014008), and the Liaoning Provincial Natural Science Foundation of China (Grant no. 2014020177).