Gradient-Based Cuckoo Search for Global Optimization

Seif-Eddeen K. Fateen and Adrián Bonilla-Petriciolet

Research Article | Open Access
Mathematical Problems in Engineering, vol. 2014, Article ID 493740, 12 pages, 2014. https://doi.org/10.1155/2014/493740
Special Issue: Hybrid Intelligent Techniques for Benchmark Functions and Real-World Optimization Problems


Academic Editor: P. Karthigaikumar
Received: 30 Dec 2013
Accepted: 07 Apr 2014
Published: 12 May 2014

Abstract

One of the major advantages of stochastic global optimization methods is that they do not require the gradient of the objective function. In some cases, however, this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we propose a gradient-based modification to the cuckoo search algorithm, a nature-inspired swarm-based stochastic global optimization method. We introduce the gradient-based cuckoo search (GBCS) and evaluate its performance against the original algorithm on twenty-four benchmark functions. The use of GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems. GBCS proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.

1. Introduction

The use of stochastic global optimization methods has gained popularity in a wide variety of scientific and engineering applications because those methods have some advantages over deterministic optimization methods [1]. These advantages include the lack of need for a good initial guess and the ability to handle multimodal and nonconvex objective functions without assumptions of continuity and differentiability. Another important advantage is that no information about the gradient of the objective function is needed. Gradient-free optimization methods can be either deterministic or stochastic, and their applications can be found in many disciplines [2]. In many applications, however, the gradient of the objective function is already available or easily obtainable, yet this valuable piece of information is entirely ignored by traditional stochastic optimization methods. For functions whose gradient is available, using the gradient may improve the reliability and efficiency of the stochastic search algorithm. In particular, the quality and precision of solutions obtained with gradient-based optimization methods exceed those obtained with traditional stochastic optimization methods. A wide variety of engineering applications require solutions with high precision, and conventional stochastic optimization methods may fail to satisfy this requirement.

To date, several stochastic methods have been proposed and investigated on challenging optimization problems with continuous variables; they include simulated annealing, genetic algorithms, differential evolution, particle swarm optimization, harmony search, and ant colony optimization. In general, these methods may show different numerical performances and, consequently, the search for more effective and reliable stochastic global optimization methods remains an active area of research. In particular, cuckoo search (CS) [3] is a novel nature-inspired stochastic optimization method. This relatively new method is gaining popularity in finding the global minimum of diverse science and engineering problems [4–8]. For example, it was recently used for the design of integrated power systems [5] and for solving reliability-redundancy allocation [6], phase equilibrium [7], and mobile-robot navigation [8] problems. CS was selected here to test the concept of using the gradient as a source of new information that guides cuckoos in their search.

Therefore, in this study, a simple modification was made to the CS algorithm to make use of the gradient information and enhance the reliability and efficiency of the algorithm. The aim of this work is to present a modification of the existing CS algorithm based on the gradient of the objective function and to evaluate its performance in comparison with the original algorithm. The remainder of this paper is organized as follows. Section 2 introduces the cuckoo search algorithm. Section 3 introduces the proposed modification and our new gradient-based cuckoo search (GBCS) algorithm. Section 4 describes the numerical experiments performed to evaluate the modification. Section 5 presents and discusses the results of those experiments. Section 6 summarizes the conclusions of the work.

2. Cuckoo Search (CS) Algorithm

CS is a nature-inspired stochastic global optimization method that was developed by Yang and Deb [3, 9]. Its concept comes from the brood parasitism behavior of the cuckoo bird. Specifically, brood parasitism is a reproductive strategy in which cuckoos lay their eggs in the nests of other birds, usually of other species. If these eggs are discovered by the host bird, it may abandon the nest completely or throw away the alien eggs. This natural phenomenon has led cuckoo eggs to evolve to mimic the appearance of the eggs of local host birds. The following rules have been employed in the search algorithm to implement these concepts: (1) each cuckoo lays one egg in a random nest, and the egg represents a set of solution coordinates; (2) the best eggs (i.e., solutions) are contained in a fraction of the nests and will carry over to the next generation; and (3) the number of nests is fixed, and a host bird can find an alien egg with a specified probability p_a. If this condition occurs, the host bird can discard the egg or abandon the nest, and a new nest will be built elsewhere. For algorithmic simplicity, this condition has been implemented in CS by assuming that a fraction (1 − p_a) of the nests is replaced by new nests.

The pseudocode of CS is reported in Algorithm 1, and details of this metaheuristic are reported in [3]. Note that Lévy flights are used in CS to perform both local and global searches in the solution domain effectively. A Lévy flight is a random walk (i.e., a trajectory that consists of taking successive random steps) characterized by a sequence of sudden jumps, which are chosen from a probability density function that has a power-law tail. In fact, the Lévy flight is considered an optimal random search pattern and has been useful in stochastic simulations of random natural phenomena, with applications in astronomy, physics, and biology. To generate a new egg in CS, a Lévy flight is performed from the coordinates of a randomly selected egg. This step can be represented by

x_i^(t+1) = x_i^(t) + α ⊕ Lévy(λ),   (1)

where ⊕ denotes entry-wise multiplication, α is the step size, and Lévy(λ) is the Lévy distribution. Egg i is displaced to this new position if its objective function value is found to be better than that of another randomly selected egg. The step size α controls the scale of the random search and depends on the scales of the optimization problem under analysis.
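As an illustration of how the Lévy step in (1) is generated in many public CS implementations, the sketch below uses Mantegna's algorithm; the exponent beta = 1.5, the step size alpha, and the sample point are illustrative assumptions, not settings taken from this paper.

import numpy as np
from scipy.special import gamma

def levy_step(dim, beta=1.5):
    # Mantegna's algorithm: draw one Lévy-distributed step of the given
    # dimension. beta = 1.5 is a common choice in CS codes (an assumption
    # here, not a value prescribed by the paper).
    sigma_u = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

# Eq. (1): displace a randomly selected egg by an entry-wise scaled Lévy step.
alpha = 0.01                           # step size; depends on the problem scale
egg = np.random.uniform(-5.0, 5.0, 2)  # coordinates of a randomly selected egg
new_egg = egg + alpha * levy_step(egg.size)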

begin
 Objective function f(x), x = (x_1, ..., x_d)^T
 Generate initial population of n host nests x_i (i = 1, 2, ..., n)
 while (t < MaxGeneration)
  Get a cuckoo (say, i) randomly by Lévy flights
  Evaluate its quality/fitness F_i
  Choose a nest among n (say, j) randomly
  If (F_i > F_j)
    Replace j by the new solution
  end
  A fraction (1 − p_a) of the worse nests are abandoned at random and new ones are built via random walk
  Keep the best solutions
  Rank the solutions and find the current best
 end while
 Postprocess results
end

Algorithm 1: Pseudocode of CS.

A fraction (1 − p_a) of the nests selected at random is abandoned and replaced by new ones at new locations via local random walks. The local random walk can be written as

x_i^(t+1) = x_i^(t) + r (x_j^(t) − x_k^(t)),   (2)

where x_j^(t) and x_k^(t) are two different solutions selected randomly by random permutation and r is a random number drawn from a uniform distribution. An advantage of CS over genetic algorithms, particle swarm optimization, and other stochastic optimization methods is that there is only one parameter to be tuned, namely, the fraction of nests to be abandoned (1 − p_a). However, Yang and Deb [3, 9] found that the results obtained for a variety of optimization problems were not strongly dependent on this value and suggested using 0.25 for this fraction.
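A minimal sketch of the local random walk in (2), written in the vectorized style of common public CS codes; the function and parameter names are illustrative, not from the paper.

import numpy as np

def abandon_nests(nests, p_a, rng=None):
    # Eq. (2): rebuild a fraction (1 - p_a) of the nests with the local
    # random walk x_i + r * (x_j - x_k), where x_j and x_k come from
    # independent random permutations of the population and r ~ U(0, 1).
    rng = rng or np.random.default_rng()
    n, _ = nests.shape
    x_j = nests[rng.permutation(n)]
    x_k = nests[rng.permutation(n)]
    r = rng.random((n, 1))
    # Each nest is rebuilt with probability (1 - p_a) and kept otherwise.
    rebuild = rng.random((n, 1)) < (1.0 - p_a)
    return np.where(rebuild, nests + r * (x_j - x_k), nests)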

3. Gradient-Based Cuckoo Search (GBCS) Algorithm

The purpose of this section is to introduce a simple modification of the original CS algorithm that incorporates information about the gradient of the objective function. Any modification to the algorithm should not alter its stochastic nature, so as not to affect its performance negatively. The modification was made to the local random walk of (2), in which a fraction (1 − p_a) of the nests is replaced. In the original algorithm, when new nests are generated from the replaced nests via a random step, both the magnitude and the direction of the step are random. In the modified algorithm, the randomness of the magnitude of the step is preserved, but the direction is determined by the sign of the gradient of the function: if the gradient is negative, the step direction is made positive; if the gradient is positive, the step direction is made negative. Thus, new nests are generated randomly from the worse nests but in the direction of the minimum as seen from the point of view of the old nests. Accordingly, (2) is replaced by

x_i^(t+1) = x_i^(t) − sign(f_i′) |r (x_j^(t) − x_k^(t))|,   (3)

where the sign function extracts the sign of its argument and f_i′ is the gradient of the objective function with respect to each variable, that is, f_i′ = ∂f/∂x_i.

This simple modification does not change the structure of the CS algorithm but makes important use of the available information about the gradient of the objective function. No additional parameter is needed to implement this change.
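A sketch of how (3) replaces the walk in (2), assuming the caller supplies a gradient function grad(x) that returns ∂f/∂x_i for one nest; all names here are illustrative.

import numpy as np

def gbcs_abandon_nests(nests, p_a, grad, rng=None):
    # Eq. (3): keep the random magnitude |r * (x_j - x_k)| of the step,
    # but point each component downhill, opposite to the sign of the
    # gradient evaluated at the old nest.
    rng = rng or np.random.default_rng()
    n, _ = nests.shape
    x_j = nests[rng.permutation(n)]
    x_k = nests[rng.permutation(n)]
    step = np.abs(rng.random((n, 1)) * (x_j - x_k))
    downhill = -np.sign(np.apply_along_axis(grad, 1, nests))
    rebuild = rng.random((n, 1)) < (1.0 - p_a)
    return np.where(rebuild, nests + downhill * step, nests)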

4. Numerical Experiments

Twenty-four classical benchmark functions were used to compare the performance of GBCS with that of the original CS. These problems were chosen from a list of forty-one functions. All forty-one functions were first screened by performing five optimization runs with both CS and GBCS. Functions for which both CS and GBCS performed extremely well, with no significant difference in the results, were deemed unsuitable for comparison and were excluded from the evaluation; since CS is already a high-performance global optimizer, such functions could not reveal differences between the two algorithms. Table 1 lists the twenty-four benchmark functions used for the evaluation of the two algorithms, along with their derivatives and search domains. The number of variables, the number of iterations used, and the value at the global minimum for each problem are shown with the results in Table 2.


Table 1: Benchmark functions used for the evaluation of the two algorithms.

1. Ackley
2. Beale
3. Booth
4. Cross-leg table
5. Himmelblau
6. Levy 13
7. Matyas
8. Schaffer
9. Powell
10. Power sum
11. Shekel 5
12. Wood
13. Cube
14. Stochastic Cube
15. Sphere
16. Hartmann
17. Dixon-Price
18. Griewank
19. Stochastic Griewank
20. Michalewicz
21. Rosenbrock
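As a concrete example of the function/derivative pairs that Table 1 provides, here are the two-dimensional Rosenbrock function (number 21) and its analytical gradient in their standard textbook forms; how the pair would be handed to the optimizer is an assumption for illustration.

import numpy as np

def rosenbrock(x):
    # f(x, y) = 100 * (y - x^2)^2 + (1 - x)^2, global minimum f(1, 1) = 0.
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    # Analytical gradient: the extra information GBCS exploits in Eq. (3).
    dfdx = -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0])
    dfdy = 200.0 * (x[1] - x[0] ** 2)
    return np.array([dfdx, dfdy])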