Computational Intelligence and Neuroscience


Research Article | Open Access

Volume 2016 | Article ID 2959370 | 8 pages | https://doi.org/10.1155/2016/2959370

An Improved Cuckoo Search Optimization Algorithm for the Problem of Chaotic Systems Parameter Estimation

Academic Editor: Jens Christian Claussen
Received: 07 Jun 2015
Revised: 27 Aug 2015
Accepted: 01 Oct 2015
Published: 12 Jan 2016

Abstract

This paper proposes an improved cuckoo search (ICS) algorithm for estimating the parameters of chaotic systems. To improve the optimization capability of the basic cuckoo search (CS) algorithm, the orthogonal design and a simulated annealing operation are incorporated into the CS algorithm to enhance its exploitation ability. The proposed algorithm is then used to estimate the parameters of the Lorenz chaotic system and the Chen chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the algorithm can estimate the parameters with high accuracy and reliability. Finally, the results are compared with those of the CS algorithm, the genetic algorithm, and the particle swarm optimization algorithm, and the comparison demonstrates that the method is efficient and superior.

1. Introduction

Chaos is a universal and complex dynamical phenomenon lurking in many nonlinear systems, such as communication systems and meteorological systems. The control and synchronization of chaos have been widely studied [1–4], and parameter estimation is a prerequisite for accomplishing them. In recent years many parameter estimation methods have been proposed, such as particle swarm optimization (PSO) [5–8], the genetic algorithm (GA) [9–12], and the mathematical method of multiple shooting [13]. However, GA and PSO are easily trapped in local optima, which degrades the quality of their solutions, and the precision of PSO, GA, and multiple shooting is not high enough. Recently, a novel and robust metaheuristic called the cuckoo search (CS) algorithm was proposed by Yang and Deb [14–16]. The algorithm proved to be very promising and could outperform existing algorithms such as GA and PSO [14]. However, its relatively poor local search ability is a drawback, and it is necessary to further improve the performance of the CS algorithm to obtain higher-quality solutions. The basic principle of the ICS algorithm is to integrate the orthogonal design and a simulated annealing operation into CS to enhance its exploitation capacity.

The remaining sections of this paper are organized as follows. Section 2 gives a brief formulation of chaotic system parameter estimation. Section 3 elaborates the ICS algorithm, and the results obtained with the proposed algorithm and several compared algorithms are given in Section 4. The paper ends with conclusions in Section 5.

2. Problem Formulation

A parameter estimation problem can be converted into a multidimensional optimization problem by constructing a proper fitness function.

Let the following equation describe a continuous nonlinear \(n\)-dimensional chaotic system:

  \(\dot{X} = F(X, X_0, \theta_0),\)  (1)

where \(X = (x_1, x_2, \ldots, x_n)^T \in \mathbb{R}^n\) denotes the state vector of the chaotic system, \(\dot{X}\) is the derivative of \(X\), \(X_0\) denotes the initial state of the system, and \(\theta_0\) is the set of original parameters. Suppose the structure of system (1) is known; then the estimated system can be written as

  \(\dot{\hat{X}} = F(\hat{X}, X_0, \hat{\theta}),\)  (2)

where \(\hat{X}\) denotes the state vector of the estimated system and \(\hat{\theta}\) is the set of estimated parameters. To convert the parameter estimation problem into an optimization problem, the following objective (fitness) function is defined:

  \(J = \sum_{k=1}^{M} \lVert X_k - \hat{X}_k \rVert^2,\)  (3)

where \(k\) is the sampling time point and \(M\) denotes the length of the data used for parameter estimation. The parameters of system (1) are estimated by searching for the values of \(\hat{\theta}\) that globally minimize the objective function (3).
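To make the formulation concrete, the objective function can be sketched in a few lines of Python; the function names and the mean-squared variant of (3) are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def make_fitness(sample_states, simulate, x0, sample_times):
    """Build the objective J: the squared distance between the sampled
    states of the true system and those produced by a candidate
    parameter vector theta_hat (a mean over samples is used for scale)."""
    def fitness(theta_hat):
        est_states = simulate(theta_hat, x0, sample_times)
        return float(np.mean(np.sum((sample_states - est_states) ** 2, axis=1)))
    return fitness
```

Any black-box minimizer, CS included, can then be applied directly to the returned `fitness` callable.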

It can be seen that (3) is a multidimensional nonlinear function with multiple local optima; a search is easily trapped in a local optimum and the computational cost is great, so it is not easy to find the globally optimal solution effectively and accurately with traditional methods. In this paper an improved CS algorithm is proposed to solve this complex optimization problem.

3. Improved CS Algorithm

3.1. Basic CS Algorithm

The basic CS algorithm is based on the brood parasitism of some cuckoo species, which lay their eggs in the nests of other host birds. For simplicity in describing the basic CS, the following three idealized rules are used [14]: (1) each cuckoo lays one egg at a time and dumps it in a randomly chosen nest; (2) the best nests with high-quality eggs are carried over to the next generations; (3) the number of available host nests is fixed, and the egg laid by a cuckoo is discovered by the host bird with a probability p_a. In this case, the host bird can either throw the egg away or simply abandon the nest and build a completely new one. Based on the above rules, the basic CS algorithm is described in Algorithm 1 [14].

Begin
 Objective function f(x), x = (x_1, ..., x_d)^T; set the generation counter t = 0
 Initialize a population of n host nests with random vector values and parameters
 Evaluate the fitness of each individual (nest) and determine the best individual with the best fitness value
While (stopping criterion is not met or t < MaxGeneration)
  Get a cuckoo i randomly by local random walk or Lévy flights
  Evaluate its fitness F_i
  Choose a nest among n (say, j) randomly
  If (F_i > F_j)
   Replace j by the new nest
  End If
  A fraction (p_a) of worse nests are abandoned and new ones are built
  Keep the best nests with quality solutions
  Rank the solutions and find the current best
  Update the generation number t = t + 1
End while
End

Furthermore, the algorithm uses a balanced combination of a local random walk and a global explorative random walk, controlled by a switching parameter p_a. The local random walk can be written as

  \(x_i^{t+1} = x_i^t + \alpha s \otimes H(p_a - \epsilon) \otimes (x_j^t - x_k^t),\)  (4)

where \(x_j^t\) and \(x_k^t\) are two different solutions selected randomly by random permutation, \(H(u)\) is a Heaviside function, \(\epsilon\) is a random number drawn from a uniform distribution, and \(s\) is the step size.

On the other hand, the global random walk is carried out by using Lévy flights [14–16]:

  \(x_i^{t+1} = x_i^t + \alpha L(s, \lambda).\)  (5)

Here, \(\alpha > 0\) is the step size scaling factor and \(s\) is the step length, distributed according to the following probability distribution shown in (6), which has an infinite variance with an infinite mean:

  \(L(s, \lambda) = \dfrac{\lambda \Gamma(\lambda) \sin(\pi\lambda/2)}{\pi} \dfrac{1}{s^{1+\lambda}}, \quad s \gg s_0 > 0.\)  (6)
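The two walks described above can be sketched in Python; the per-element random step sizes and the Mantegna exponent beta = 1.5 for generating Lévy steps are common assumptions, not values fixed by the paper:

```python
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(0)

def local_walk(nests, pa=0.25, alpha=1.0):
    """Local random walk: mix pairs of permuted solutions, gated
    elementwise by the Heaviside term H(pa - eps)."""
    n, d = nests.shape
    j = rng.permutation(n)
    k = rng.permutation(n)
    mask = (rng.random((n, d)) < pa).astype(float)  # H(pa - eps)
    step = rng.random((n, d))                       # step size s
    return nests + alpha * step * mask * (nests[j] - nests[k])

def levy_step(d, beta=1.5):
    """Draw a d-dimensional heavy-tailed step via Mantegna's algorithm
    (beta is an assumed tail exponent)."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, d)
    v = rng.normal(0.0, 1.0, d)
    return u / np.abs(v) ** (1 / beta)
```

Mantegna's algorithm is a standard practical way to sample approximately Lévy-distributed step lengths without inverting the heavy-tailed distribution directly.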

3.2. ICS Algorithm

In order to further improve the searching ability of the algorithm, the orthogonal design and the simulated annealing operation are integrated into the CS algorithm. The basic idea of the orthogonal design is to utilize the properties of the fractional experiment to efficiently determine the best combination of levels [17]. An orthogonal array of \(N\) factors with \(Q\) levels and \(M\) combinations is denoted \(L_M(Q^N)\), where \(Q\) is a prime number, \(M = Q^J\), and \(J\) is a positive integer satisfying \(N = (Q^J - 1)/(Q - 1)\). The brief procedure for constructing the orthogonal array is described in Procedure 1.

Step  1. Construct the basic columns
 For k = 1 to J
  j = (Q^(k-1) - 1)/(Q - 1) + 1
  For i = 1 to Q^J
   a_{i,j} = floor((i - 1)/Q^(J-k)) mod Q
  End for
 End for
Step  2. Construct the non-basic columns
 For k = 2 to J
  j = (Q^(k-1) - 1)/(Q - 1) + 1
  For s = 1 to j - 1
   For t = 1 to Q - 1
    a_{j+(s-1)(Q-1)+t} = (a_s × t + a_j) mod Q
   End for
  End for
 End for
Step  3. Increment a_{i,j} by one for all 1 ≤ i ≤ Q^J, 1 ≤ j ≤ N
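The column construction above can be sketched directly in Python; indices are 0-based internally, and the output levels are 1..Q as in Step 3:

```python
import numpy as np

def orthogonal_array(Q, J):
    """Construct L_M(Q^N) with M = Q**J rows and N = (Q**J - 1)//(Q - 1)
    columns: basic columns first, then non-basic linear combinations."""
    M = Q ** J
    N = (Q ** J - 1) // (Q - 1)
    A = np.zeros((M, N), dtype=int)
    # Step 1: basic columns
    for k in range(1, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1)          # 0-based column index
        for i in range(M):
            A[i, j] = (i // Q ** (J - k)) % Q
    # Step 2: non-basic columns as combinations of earlier columns
    for k in range(2, J + 1):
        j = (Q ** (k - 1) - 1) // (Q - 1)
        for s in range(j):
            for t in range(1, Q):
                A[:, j + s * (Q - 1) + t] = (A[:, s] * t + A[:, j]) % Q
    # Step 3: shift levels from 0..Q-1 to 1..Q
    return A + 1
```

For Q = 3 and J = 2 this yields the familiar L9(3^4) array: nine rows covering every pair of levels in any two columns exactly once.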

The procedure of the orthogonal design algorithm is elaborated in Algorithm 2; for more detailed information on the orthogonal design strategy, please refer to [17–19].

Begin
(1) Construct the orthogonal array L_M(Q^N) following the above steps
(2) Randomly select two solutions from the population
(3) Quantize the domain formed by the two solutions
(4) Randomly generate integers
(5) Use L_M(Q^N) to generate potential offspring
(6) Randomly select a solution from the population
(7) Compare that solution with the best solution from the orthogonal offspring and keep the better one
End
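Steps (2)–(5) can be sketched as follows for up to four parameters, using the fixed array L9(3^4); the quantization into evenly spaced levels between the two parents is an assumption about the unstated details:

```python
import numpy as np

# L9(3^4): nine rows, four 3-level factors (levels 1..3)
L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
               [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
               [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])

def orthogonal_crossover(p1, p2, fitness, Q=3):
    """Quantize the box spanned by two parents into Q levels per
    dimension, evaluate the OA-selected level combinations, and
    return the best offspring (fitness is minimized)."""
    d = len(p1)
    assert d <= L9.shape[1], "this sketch supports up to 4 dimensions"
    lo, hi = np.minimum(p1, p2), np.maximum(p1, p2)
    levels = np.stack([np.linspace(l, h, Q) for l, h in zip(lo, hi)])  # (d, Q)
    offspring = [np.array([levels[dim, row[dim] - 1] for dim in range(d)])
                 for row in L9]
    return min(offspring, key=fitness)
```

Only nine of the 3^d possible level combinations are evaluated, yet the array guarantees that every pair of levels in any two dimensions is tried once, which is the point of the fractional experiment.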

The procedure of the simulated annealing algorithm is stated briefly in Algorithm 3 [20]; for more detailed information on simulated annealing, please refer to [20–22].

Begin
(1) Given a configuration of the elements of the system, randomly displace one element at a time
   by a small amount and calculate the resulting change in the energy, ΔE
(2) If ΔE ≤ 0
    Then accept the displacement and use the resulting configuration as the starting point for the next iteration
    Else
    Accept the displacement with probability P(ΔE) = exp(−ΔE/(k_B T)), where T is the current temperature and
     k_B is Boltzmann’s constant
(3) Repeat this step until equilibrium is achieved
End
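The acceptance rule in step (2) is the classical Metropolis criterion; a minimal sketch, with Boltzmann's constant folded into the temperature scale (a common simplification):

```python
import math
import random

def metropolis_accept(delta_e, temperature, rng=random.random):
    """Accept a move that changes the energy by delta_e: always when the
    energy does not increase, otherwise with probability exp(-delta_e/T)."""
    if delta_e <= 0:
        return True
    return rng() < math.exp(-delta_e / temperature)
```

Injecting `rng` makes the rule deterministic under test; in the annealing loop the temperature is lowered over the iterations so that uphill moves become progressively rarer.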

Based on the above description of the orthogonal design strategy and simulated annealing operation, the detailed procedures for parameter estimation with the ICS algorithm can be summarized as shown in Algorithm 4.

Begin
(1) Initialize the parameter values of the algorithm, generate the random initial vector values, and set the iteration number t = 0.
(2) Evaluate the fitness value of each individual and determine the current best individual with the best objective value.
   Check whether the stopping criterion is met. If the stopping criterion is met, then output the best solution;
   otherwise update the iteration number and continue the iteration process until the stopping criterion is met.
(3) Keep the best solution of the last iteration, and get a set of new solutions by
   Lévy flight; the Lévy flight is performed according to (5).
(4) Evaluate the fitness value of the new solution x_i', and compare it with f(x_i^t), which represents the fitness of the solution of
    the t-th iteration. If x_i' is better than x_i^t, then replace x_i^t by x_i'; otherwise, do not abandon the solution
    at once but accept it with probability P = exp(−Δf/(k_B T)), where Δf is the change in the fitness value,
    Δf = f(x_i') − f(x_i^t), k_B is Boltzmann’s constant, and T is the current temperature. Select a random variable r, r ∈ [0, 1];
    if r < P, then accept the new solution and use it as the starting point for the next iteration. Otherwise,
    abandon the solution. A set of solutions is thus obtained.
(5) A fraction (p_a) of the worse nests are abandoned and new ones are built.
(6) Implement the orthogonal design strategy procedures.
(7) Go to step (2)
End

4. Simulation Results

To demonstrate the effectiveness of the improved algorithm, it is used to estimate the parameters of the Lorenz chaotic system [23] and the Chen chaotic system [24].

4.1. Lorenz Chaotic System

The Lorenz chaotic system [23] is expressed as follows:

  \(\dot{x} = a(y - x), \quad \dot{y} = bx - xz - y, \quad \dot{z} = xy - cz,\)  (7)

where \(x\), \(y\), and \(z\) are the state variables and \(a\), \(b\), and \(c\) are the unknown chaotic system parameters that need to be estimated. The true parameters of the system are \(a = 10\), \(b = 28\), and \(c = 8/3\), which ensure chaotic behavior. To obtain the values of the state variables, the fourth-order Runge-Kutta algorithm with a fixed integral step is used to solve (7). A series of state variable values is thus obtained, and 100 states at different times (\(k = 1, 2, \ldots, 100\)) are chosen as the sample data. The parameters of the algorithm are set as follows: the maximum iteration number, the sample (nest) size, and the initial temperature \(T_0\) are fixed, and the annealing schedule is given in (8), where \(t\) is the iteration number. The objective (fitness) function is shown in (9), where \(X_k\) is the \(k\)th state variable that corresponds to the true system parameters and \(\hat{X}_k\) is the \(k\)th state variable that corresponds to the estimated system parameters:

  \(J = \sum_{k=1}^{100} \lVert X_k - \hat{X}_k \rVert^2.\)  (9)
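For reference, the Runge-Kutta generation of the sample data can be sketched as follows; the step size and initial state in the usage example are illustrative, since the paper's exact values are not reproduced here, and the sign conventions are reconstructed from the true parameter values (10, 28, 8/3):

```python
import numpy as np

def lorenz(state, theta):
    """Lorenz right-hand side with theta = (a, b, c)."""
    x, y, z = state
    a, b, c = theta
    return np.array([a * (y - x), b * x - x * z - y, x * y - c * z])

def rk4_trajectory(f, x0, theta, h, n_steps):
    """Fourth-order Runge-Kutta integration with a fixed step h."""
    traj = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        x = traj[-1]
        k1 = f(x, theta)
        k2 = f(x + 0.5 * h * k1, theta)
        k3 = f(x + 0.5 * h * k2, theta)
        k4 = f(x + h * k3, theta)
        traj.append(x + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6)
    return np.array(traj)
```

Sampling 100 rows of such a trajectory gives the \(X_k\) used in (9); running the same integrator with a candidate \(\hat{\theta}\) gives the \(\hat{X}_k\).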

Figure 1 shows the convergence process of the fitness value and the three parameters (a, b, c) during the iterations in a single experiment.

To eliminate the run-to-run differences between experiments, the algorithm is executed 50 times, and the mean value of the 50 experiments is taken as the final estimate; the mean and best values of the 50 experiments are listed in Table 1. The results based on CS, PSO (where w is the inertia weight and c1, c2 are the acceleration factors), and GA (where pc is the crossover rate and pm is the mutation rate), each run with its best-performing parameter settings, are also listed in Table 1.


Table 1

           Mean value                                             Best value
           ICS          CS           PSO          GA             ICS          CS           PSO          GA
a = 10     10.000000    9.998736     9.985012     10.082051      10.000000    9.999927     9.995510     10.026911
b = 28     28.000000    28.000005    28.014411    27.881034      28.000000    28.000002    28.001304    28.004702
c = 8/3    2.666667     2.666661     2.668102     2.681882       2.666667     2.666665     2.666802     2.669018
J          1.1822e-010  3.7614e-004  0.069517     0.331901       1.2933e-011  2.9556e-005  0.011377     0.113969

It can be seen from Table 1 that the best fitness values obtained by the ICS algorithm are considerably better than those of the other algorithms. The mean values of the estimated parameters also have higher precision than the others, and the estimates are virtually identical to the true values. In general, it can be concluded that the ICS algorithm delivers superior performance, CS performs next-best, PSO is better than GA, and GA performs worst.

As actual chaotic systems are always accompanied by noise, noise sequences are added to the original sample data to test the performance of parameter estimation under noisy conditions. White noise in the range from −0.1 to 0.1 is added to the state variables (x, y, z). Figure 2 shows the convergence process of the fitness value and the three parameters (a, b, c) during the iterations in a single experiment under the noisy condition.
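Bounded white noise over [−0.1, 0.1] can be realized, for example, as a uniform draw; the uniform distribution is an assumption here, since the paper only specifies the range:

```python
import numpy as np

rng = np.random.default_rng(42)

def add_uniform_noise(samples, amplitude=0.1):
    """Corrupt sampled state variables with zero-mean uniform noise
    drawn independently per element from [-amplitude, amplitude)."""
    return samples + rng.uniform(-amplitude, amplitude, size=samples.shape)
```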

To eliminate the run-to-run differences between experiments, the algorithm is again executed 50 times, the mean value of the 50 experiments is taken as the final estimate, and the corresponding results are listed in Table 2.


Table 2

           Mean value                                             Best value
           ICS          CS           PSO          GA             ICS          CS           PSO          GA
a = 10     9.996110     10.080014    9.844606     10.217998      9.998941     10.001565    9.881002     10.044011
b = 28     28.002272    27.980212    27.860013    27.661201      28.000099    28.001995    28.022441    27.900189
c = 8/3    2.666590     2.658890     2.700198     2.659880       2.666675     2.667704     2.675596     2.670228
J          0.008402     0.0390389    0.221401     0.500227       0.000908     0.001228     0.050931     0.255996

It can be seen from Table 2 that all four algorithms retain a certain capability of identifying the parameters, but the performance of ICS is much better than that of the other algorithms: it supplies more robust and precise results. Although the precision of the estimated parameters declines compared with the noiseless condition, it is still satisfactory. It can therefore be concluded that the ICS algorithm possesses a powerful capability for parameter identification under noisy conditions.

4.2. Chen Chaotic System

The Chen chaotic system [24] is expressed as follows:

  \(\dot{x} = a(y - x), \quad \dot{y} = (c - a)x - xz + cy, \quad \dot{z} = xy - bz,\)  (10)

where \(x\), \(y\), and \(z\) are the state variables and \(a\), \(b\), and \(c\) are the unknown chaotic system parameters that need to be estimated. The true parameters of the system are \(a = 35\), \(b = 3\), and \(c = 28\), which ensure chaotic behavior. The fourth-order Runge-Kutta algorithm with a fixed integral step is used to solve (10). A series of state variable values is thus obtained, and 100 states at different times (\(k = 1, 2, \ldots, 100\)) are chosen as the sample data. The parameters of the algorithm are set as in Section 4.1: the maximum iteration number, the sample size, and the initial temperature are fixed, and the annealing schedule is given in (8), where \(t\) is the iteration number. The convergence process of the fitness value and the three parameters (a, b, c) during the iterations in a single experiment is shown in Figure 3. To eliminate the run-to-run differences, the algorithm is executed 50 times, the mean value of the 50 experiments is taken as the final estimate, and the corresponding results are listed in Table 3.
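The right-hand side of the system above can be sketched as follows, using the conventional Chen formulation consistent with the true parameter values a = 35, b = 3, c = 28; the same Runge-Kutta integrator as for the Lorenz system then generates the sample data:

```python
import numpy as np

def chen(state, theta):
    """Chen system right-hand side with theta = (a, b, c)."""
    x, y, z = state
    a, b, c = theta
    return np.array([a * (y - x), (c - a) * x - x * z + c * y, x * y - b * z])
```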

It can be seen from Table 3 that the best fitness values obtained by the ICS algorithm are considerably better than those of the other algorithms. The mean values of the estimated parameters also have higher precision than the others, and the estimates are very close to the true values. In general, it can be concluded that the ICS algorithm delivers superior performance, CS performs next-best, PSO is better than GA, and GA performs worst.


Table 3

           Mean value                                             Best value
           ICS          CS           PSO          GA             ICS          CS           PSO          GA
a = 35     34.999438    35.089675    34.844278    33.535396      34.999945    34.996661    34.782290    35.102699
b = 3      2.999951     2.999081     3.012977     3.005031       2.999999     2.999907     2.998694     2.991955
c = 28     27.999757    28.043810    27.917648    27.291109      27.999974    27.998427    27.895888    28.053975
J          4.6438e-007  4.6628e-004  0.027312     0.115259       2.9658e-010  3.2051e-006  0.003312     0.010562

As actual chaotic systems are always accompanied by noise, noise sequences are again added to the original sample data to test the performance of parameter estimation under noisy conditions. White noise in the range from −0.1 to 0.1 is added to the state variables (x, y, z). Figure 4 shows the convergence process of the fitness value and the three parameters (a, b, c) during the iterations in a single experiment under the noisy condition.

It can be seen from Table 4 that all four algorithms retain a certain capability of identifying the parameters, but the performance of ICS is much better than that of the other algorithms: it supplies more robust and precise results. Although the precision of the estimated parameters declines compared with the noiseless condition, it is still satisfactory. It can therefore be concluded that the ICS algorithm possesses a powerful capability for parameter identification under noisy conditions.


Table 4

           Mean value                                             Best value
           ICS          CS           PSO          GA             ICS          CS           PSO          GA
a = 35     35.421272    35.859540    34.108398    35.962547      34.970096    35.112205    35.313896    10.044011
b = 3      2.996796     2.993281     2.970277     3.052984       2.999311     2.997309     2.980455     27.900189
c = 28     28.204815    28.418425    27.588450    28.434651      27.985940    28.055617    28.162608    2.670228
J          0.009334     0.038112     0.244276     0.591957       1.5986e-004  0.001495     0.062901     0.255996

5. Conclusion

In this paper, an improved cuckoo search (ICS) algorithm is proposed to estimate the parameters of chaotic systems. The estimation results demonstrate the strong capability and effectiveness of the proposed algorithm: compared with the CS, PSO, and GA algorithms, the ICS algorithm supplies more robust and precise results. Besides, the algorithm also has a more powerful capability of noise immunity. In general, the proposed ICS algorithm is a feasible, efficient, and promising method for parameter estimation of chaotic systems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgment

This work was supported by the National Natural Science Foundation of China (61271106).

References

1. A. Sitz, U. Schwarz, J. Kurths, and H. U. Voss, “Estimation of parameters and unobserved components for nonlinear systems from noisy time series,” Physical Review E, vol. 66, no. 1, Article ID 016210, 2002.
2. L. X. Guo, M. F. Hu, Z. Y. Xu, and A. Hu, “Synchronization and chaos control by quorum sensing mechanism,” Nonlinear Dynamics, vol. 73, no. 3, pp. 1253–1269, 2013.
3. U. Parlitz, “Estimating model parameters from time series by autosynchronization,” Physical Review Letters, vol. 76, no. 8, pp. 1232–1244, 1996.
4. H. N. Agiza and M. T. Yassen, “Synchronization of Rossler and Chen chaotic dynamical systems using active control,” Physics Letters A, vol. 278, no. 4, pp. 191–197, 2001.
5. F. Gao and H.-Q. Tong, “Parameter estimation for chaotic system based on particle swarm optimization,” Acta Physica Sinica, vol. 55, no. 2, pp. 577–582, 2006.
6. Q. He, L. Wang, and B. Liu, “Parameter estimation for chaotic systems by particle swarm optimization,” Chaos, Solitons and Fractals, vol. 34, no. 2, pp. 654–661, 2007.
7. J. Garcia-Nieto, A. C. Olivera, and E. Alba, “Optimal cycle program of traffic lights with particle swarm optimization,” IEEE Transactions on Evolutionary Computation, vol. 17, no. 6, pp. 823–839, 2013.
8. M. Salahi, A. Jamalian, and A. Taati, “Global minimization of multi-funnel functions using particle swarm optimization,” Neural Computing and Applications, vol. 23, no. 7-8, pp. 2101–2106, 2013.
9. C. Tao, Y. Zhang, and J. J. Jiang, “Estimating system parameters from chaotic time series with synchronization optimized by a genetic algorithm,” Physical Review E, vol. 76, no. 1, Article ID 016209, 2007.
10. J. Wang, Z. Sheng, B. H. Zhou, and S. D. Zhou, “Lightning potential forecast over Nanjing with denoised sounding derived indices based on SSA and CS-BP neural network,” Atmospheric Research, vol. 137, pp. 245–256, 2014.
11. J. Wang, B. H. Zhou, S. D. Zhou, and Z. Sheng, “Forecasting nonlinear chaotic time series with function expression method based on an improved genetic-simulated annealing algorithm,” Computational Intelligence and Neuroscience, vol. 2015, Article ID 341031, 10 pages, 2015.
12. G. G. Szpiro, “Forecasting chaotic time series with genetic algorithms,” Physical Review E, vol. 55, no. 3, Article ID 2557, 1997.
13. M. Peifer and J. Timmer, “Parameter estimation in ordinary differential equations for biochemical processes using the method of multiple shooting,” IET Systems Biology, vol. 1, no. 2, pp. 78–88, 2007.
14. X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.
15. X.-S. Yang and S. Deb, “Engineering optimisation by cuckoo search,” International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
16. A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems,” Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
17. C. R. Hicks, Fundamental Concepts in the Design of Experiments, Saunders College Publishing, Austin, Tex, USA, 1993.
18. D. C. Montgomery, Design and Analysis of Experiments, Wiley, New York, NY, USA, 1991.
19. Y.-W. Leung and Y. Wang, “An orthogonal genetic algorithm with quantization for global numerical optimization,” IEEE Transactions on Evolutionary Computation, vol. 5, no. 1, pp. 41–53, 2001.
20. S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, “Optimization by simulated annealing,” Science, vol. 220, no. 4598, pp. 671–680, 1983.
21. M. S. Kim, M. R. Feldman, and C. C. Guest, “Optimum encoding of binary phase-only filters with a simulated annealing algorithm,” Optics Letters, vol. 14, no. 11, pp. 545–547, 1989.
22. B. L. Golden and C. C. Skiscim, “Using simulated annealing to solve routing and location problems,” Naval Research Logistics Quarterly, vol. 33, no. 2, pp. 261–279, 1986.
23. E. N. Lorenz, “Deterministic nonperiodic flow,” Journal of the Atmospheric Sciences, vol. 20, no. 2, pp. 130–141, 1963.
24. J. Lü and G. Chen, “A new chaotic attractor coined,” International Journal of Bifurcation and Chaos, vol. 12, no. 3, pp. 659–661, 2002.

Copyright © 2016 Jun Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

