Journal of Optimization
Volume 2016, Article ID 3260940, 14 pages
http://dx.doi.org/10.1155/2016/3260940
Research Article

Hybridization of Adaptive Differential Evolution with an Expensive Local Search Method

1Department of Mathematics, Jinnah College for Women, University of Peshawar, Khyber Pakhtunkhwa 25000, Pakistan
2Department of Mathematics, Kohat University of Science & Technology (KUST), Kohat, Khyber Pakhtunkhwa 26000, Pakistan
3College of Computer Science, King Khalid University, Abha 61321, Saudi Arabia

Received 27 December 2015; Revised 9 June 2016; Accepted 14 June 2016

Academic Editor: Manlio Gaudioso

Copyright © 2016 Rashida Adeeb Khanum et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Differential evolution (DE) is an effective and efficient heuristic for global optimization problems. However, it has difficulty exploiting the local region around an approximate solution. To handle this issue, local search (LS) techniques can be hybridized with DE to improve its local search capability. In this work, we hybridize an adaptive version of DE, adaptive differential evolution with optional external archive (JADE), with an expensive LS method, Broyden-Fletcher-Goldfarb-Shanno (BFGS), for solving continuous unconstrained global optimization problems. The new hybrid algorithm is denoted DEELS. To validate the performance of DEELS, we carried out extensive experiments on the well known CEC2005 and CEC2010 test suites. The experimental results, in terms of function error values, success rate, and other statistics, are compared with those of the state-of-the-art algorithms self-adaptive control parameters in differential evolution (jDE), sequential DE enhanced by neighborhood search for large-scale global optimization (SDENS), and the differential ant-stigmergy algorithm (DASA). These comparisons reveal that DEELS outperforms jDE and SDENS, though not DASA, on the majority of test instances.

1. Introduction

Optimization is concerned with finding the best solution for an objective function. In general, an unconstrained optimization problem can be stated as follows: find the global optimum of an objective function f(x), x in R^D, where D is the dimension of the problem.

Evolutionary algorithms (EAs) are inspired by the Darwinian theory of evolution [1]. They are very efficient at finding the global optimum of many real world problems, including problems from mathematics, engineering, economics, business, and medicine. The EA family consists of a variety of stochastic algorithms, such as Genetic Algorithms (GAs) [2], Particle Swarm Optimization (PSO) [3, 4], Evolution Strategies (ES) [5], and the differential evolution algorithm (DE) [6, 7].

Among EAs, DE is the most recent algorithm and is efficient at solving many optimization problems. DE has many advantages: for example, it is simple to understand and implement, has few control parameters, and is robust [8]. There is no doubt that DE is a remarkable optimizer for many optimization problems, but it has a few limitations, such as stagnation, premature convergence, and loss of population diversity [9, 10]. Being a global optimizer, DE struggles to search the neighborhood of the approximate solution to a given problem. This makes room for hybridizing DE with other techniques to improve its poor exploitation (exploring the neighborhood of approximate solutions). On the other hand, the role of LS methods is to stabilize the search, especially in the environs of a local optimum. Thus, they can be combined with global search algorithms to enhance their local searching.

The main aim of this paper is to experiment with and validate the performance of our newly proposed hybrid algorithm, DEELS, which combines JADE [11, 12] and BFGS [13]. In particular, we want to see whether this hybridization further improves the performance of JADE. In contrast to our published preliminary work [14], this paper presents DEELS in full depth. It also comments on the performance of DEELS on large-scale global optimization problems with dimension 1000. Moreover, unlike our previously published comparison with JADE only [14], this time DEELS is compared with jDE [15], SDENS [16], and DASA [17] on problems from the CEC2005 and CEC2010 test suites to further explore its capabilities for handling both small and large dimension problems.

The rest of this paper is organized as follows. Section 2 describes the basic DE, JADE, and BFGS algorithms. Section 3 presents a literature review. Section 4 presents the proposed algorithm. Section 5 gives the experimental results, and finally Section 6 concludes this paper and discusses future research directions.

2. Some Relevant Existing Methods

As mentioned earlier, DEELS depends upon JADE and BFGS. Thus, this section presents the basic operators of DE, JADE, and BFGS.

2.1. Basic DE

Differential evolution (DE) [6, 7] is a relatively recent bioinspired scheme for finding the global optimum of an optimization problem. This section briefly reviews the DE algorithm; more details about it can be found in [18–22]. The working of DE can be described as follows.

2.1.1. Parent Selection

For each member x_i, i = 1, ..., NP, of the current generation G, three other members, x_r1, x_r2, and x_r3, are randomly selected, where r1, r2, and r3 are randomly chosen indices from {1, ..., NP} such that r1, r2, and r3 are mutually distinct and different from i. Thus, for each individual x_i, a mating pool of four individuals is formed in which an individual breeds against three individuals and produces an offspring.

2.1.2. Reproduction

To generate an offspring, DE incorporates two genetic operators, mutation and crossover. They are detailed as follows:

(1) Mutation. After selection, mutation is applied to produce a mutant vector v_i by adding a scaled difference of the two already chosen vectors to the third chosen vector; that is,

v_i = x_r1 + F (x_r2 - x_r3), (1)

where F is the scaling factor.

(2) Crossover. After mutation, the parameters of the parent vector x_i and the mutant vector v_i are mixed by a crossover operator, and a trial member u_i is generated as follows:

u_ij = v_ij if rand_j <= CR or j = j_rand, and u_ij = x_ij otherwise, (2)

where CR in [0, 1] is the crossover rate and j_rand is a randomly chosen coordinate index that guarantees that u_i differs from x_i in at least one component.

2.1.3. Survival Selection

At the end, the trial vector u_i generated in (2) is compared with its parent vector x_i on the basis of objective function value. The better of the two gets a chance to become a member of the new generation; that is,

x_i(G+1) = u_i if f(u_i) <= f(x_i(G)), and x_i(G+1) = x_i(G) otherwise. (3)
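The full DE generation described in Sections 2.1.1–2.1.3 can be sketched as follows. This is a minimal illustration on the sphere function (an assumed stand-in objective, not one of the benchmark instances), with illustrative settings for NP, F, and CR rather than the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))

NP, D, F, CR, GENS = 20, 5, 0.5, 0.9, 100
pop = rng.uniform(-5, 5, (NP, D))
fit = np.array([sphere(x) for x in pop])
init_best = fit.min()

for _ in range(GENS):
    for i in range(NP):
        # parent selection: three distinct indices, all different from i
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        # mutation: v = x_r1 + F * (x_r2 - x_r3)
        v = pop[r1] + F * (pop[r2] - pop[r3])
        # binomial crossover with one guaranteed mutant component j_rand
        mask = rng.random(D) <= CR
        mask[rng.integers(D)] = True
        u = np.where(mask, v, pop[i])
        # survival selection: greedy one-to-one replacement
        fu = sphere(u)
        if fu <= fit[i]:
            pop[i], fit[i] = u, fu

best = fit.min()
```

Because survival selection is greedy, the best fitness is monotonically nonincreasing over generations.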

2.2. JADE

JADE [11] is an adaptive version of DE which modifies it in three aspects.

2.2.1. DE/current-to-pbest Mutation Strategy

JADE utilizes two mutation strategies: one with an external archive and the other without it. These strategies can be expressed as follows [11]:

v_i = x_i + F_i (x_pbest - x_i) + F_i (x_r1 - x~_r2), (4)

v_i = x_i + F_i (x_pbest - x_i) + F_i (x_r1 - x_r2), (5)

where x_pbest is a vector chosen randomly from the top 100p% of individuals, x_i, x_r1, and x_r2 are chosen from the current population P, while x~_r2 is chosen randomly from the union of P and A, where A denotes the archive of JADE and p in (0, 1] is a small constant. In DEELS, we utilize the strategy given in (4).
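The current-to-pbest mutation with archive can be sketched as follows. This is a simplified illustration: the bookkeeping that keeps r1 and r2 mutually distinct is omitted, and the function name and the default p are our own choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def current_to_pbest_mutation(pop, fit, archive, i, F, p=0.1):
    """DE/current-to-pbest/1 with optional external archive (sketch).

    x_pbest is drawn from the top 100*p % of the current population;
    r1 comes from the population, r2 from population-union-archive.
    """
    NP = len(pop)
    n_top = max(1, int(round(p * NP)))
    top = np.argsort(fit)[:n_top]          # indices of the best 100*p %
    pbest = pop[rng.choice(top)]
    r1 = rng.choice([j for j in range(NP) if j != i])
    union = np.vstack([pop] + ([np.asarray(archive)] if len(archive) else []))
    r2 = rng.integers(len(union))
    return pop[i] + F * (pbest - pop[i]) + F * (pop[r1] - union[r2])
```

With an empty archive this reduces to the archive-free strategy, since the union is then just the current population.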

2.2.2. Control Parameters Adaptation

For each individual x_i, the scaling factor F_i and the crossover probability CR_i are generated independently from Cauchy and normal distributions, respectively, as follows [11]:

F_i = rand_c(mu_F, 0.1), (6)

CR_i = rand_n(mu_CR, 0.1), (7)

These are then truncated to (0, 1] and [0, 1], respectively. Initially, both mu_F and mu_CR are set to 0.5. They are then updated at the end of each generation as follows:

mu_F = (1 - c) mu_F + c mean_L(S_F), (8)

mu_CR = (1 - c) mu_CR + c mean_A(S_CR), (9)

where mean_L denotes the Lehmer mean, mean_A denotes the arithmetic mean, c in (0, 1) is an adaptation rate, S_F is the set of successful F_i's, and S_CR is the set of successful CR_i's at generation G.
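This adaptation scheme can be sketched as follows. The inverse-CDF Cauchy sampling, the regenerate-while-nonpositive rule for F, and the default adaptation rate c = 0.1 are assumptions of our sketch (the text above does not state them):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_F(mu_F):
    """Cauchy(mu_F, 0.1) sample, regenerated while <= 0, truncated to (0, 1]."""
    F = mu_F + 0.1 * np.tan(np.pi * (rng.random() - 0.5))
    while F <= 0:
        F = mu_F + 0.1 * np.tan(np.pi * (rng.random() - 0.5))
    return min(F, 1.0)

def sample_CR(mu_CR):
    """Normal(mu_CR, 0.1) sample, truncated to [0, 1]."""
    return float(np.clip(rng.normal(mu_CR, 0.1), 0.0, 1.0))

def update_means(mu_F, mu_CR, S_F, S_CR, c=0.1):
    """End-of-generation update: Lehmer mean for F, arithmetic mean for CR."""
    if S_F:
        lehmer = sum(f * f for f in S_F) / sum(S_F)
        mu_F = (1 - c) * mu_F + c * lehmer
    if S_CR:
        mu_CR = (1 - c) * mu_CR + c * (sum(S_CR) / len(S_CR))
    return mu_F, mu_CR
```

The Lehmer mean weights larger successful F values more heavily than the arithmetic mean would, which counteracts the downward bias of the truncation.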

2.2.3. Optional External Archive

At each generation, the defeated parents are sent to the archive. If the archive size exceeds its limit, some solutions are randomly deleted from it to keep its size within that limit. These archived inferior solutions play a role in JADE's mutation strategy with archive. The archive not only provides information about promising search directions but also improves population diversity.
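The archive maintenance rule just described can be sketched as follows (the function name and signature are ours, for illustration):

```python
import random

def update_archive(archive, failed_parents, max_size):
    """Append defeated parents; randomly evict entries until size <= max_size."""
    archive.extend(failed_parents)
    while len(archive) > max_size:
        archive.pop(random.randrange(len(archive)))
    return archive
```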

2.3. BFGS

The BFGS method, a quasi-Newton algorithm, employs the gradient and an approximation of the Hessian to find a suitable search direction. BFGS is considered a good LS method due to its efficiency. The detailed algorithm of BFGS is presented in Algorithm 1.
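Since the listing of Algorithm 1 is not reproduced here, the following is a minimal, self-contained sketch of BFGS with a forward-difference gradient and Armijo backtracking. It illustrates the inverse-Hessian update; it is not the exact line search prescribed in [23], and all parameter defaults are illustrative:

```python
import numpy as np

def num_grad(f, x, h=1e-6):
    """Forward-difference gradient: costs one extra evaluation per coordinate."""
    g = np.zeros_like(x)
    fx = f(x)
    for j in range(len(x)):
        e = np.zeros_like(x)
        e[j] = h
        g[j] = (f(x + e) - fx) / h
    return g

def bfgs(f, x0, iters=50, tol=1e-8):
    """Minimal BFGS with Armijo backtracking line search (sketch)."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                     # inverse-Hessian approximation
    g = num_grad(f, x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                         # quasi-Newton search direction
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):   # Armijo condition
            t *= 0.5
            if t < 1e-12:
                break
        x_new = x + t * d
        g_new = num_grad(f, x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                     # curvature check keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

On a convex quadratic this converges to finite-difference accuracy in a handful of iterations, illustrating the superlinear behavior discussed in Section 4.2.2.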

Algorithm 1: Pseudocode of BFGS method [23].

3. Brief Review of Variants of DE and Hybridization of DE with Local Search Methods

To improve the performance of DE, many researchers devised modifications to the classic DE and proposed different variants. Some researchers modified the selection scheme [24], while others varied the mutation and crossover operators [25]. Recently, in [26], orthogonal crossover was used instead of binomial and exponential crossover. Some introduced new variants such as opposition based DE (ODE) [27], centroid based initialization (ciJADE) [28], jDE [15], and genDE [8], while others introduced adaptation and self-adaptation of the control parameters, as in [29, 30], SaDE [31], JADE [11, 12], SHADE [32], and EWMA-DECrF [33]. Some introduced cooperative coevolution into DE for large-scale optimization [34]. A group of researchers applied DE to discrete problems [35, 36], while others took advantage of its global search ability in continuous domains [26, 37–40].

In recent years, the hybridization of DE with LS methods has gained much attraction due to their individual merits. Many hybrid algorithms have shown significant performance improvement. Here, we review some of the methods in this category.

A new differential evolution algorithm with localization around the best point (DELB) is proposed in [41]. In DELB, the initial steps are the same as those in DE, except that the mutation scale factor is chosen randomly from a given interval for each mutant vector. DELB also modifies the selection step by introducing reflection and contraction. The trial vector is compared with the current best and the parent vector. If the parent is worse than the trial vector, it is replaced by a new contracted or reflected vector. Thus, in DELB the parent can be replaced by the trial vector, its reflected vector, or its contracted vector, while in classic DE only the trial vector replaces the parent.

Recently, in [42], DE was hybridized with the nonlinear simplex method; the resulting method is known as NSDE. The authors of [42] applied the nonlinear simplex method, together with uniform random numbers, to initialize the DE population. Initially, individuals are generated uniformly, and further points are then generated from them by application of the Nelder-Mead Simplex (NMS) method. From this enlarged population, the fittest individuals are selected as DE's initial population; the rest of DE is unaltered in NSDE. Thus, NSDE modifies DE in the initialization step only. It has shown good performance in reducing function evaluations and CPU time.

In another experiment, Brest et al. [43] hybridized DE with Sequential Quadratic Programming (SQP), an efficient but expensive gradient-based LS method. Their hybrid applies the DE algorithm until the number of function evaluations reaches a prescribed fraction of the maximum; it then applies SQP for the first time to the best point thus obtained. Afterwards, SQP is applied every 100 generations to the best solution of the current search, with a fixed number of expensive LS iterations. In their hybrid, the population size keeps reducing, and the process ends with a minimum population size. DE provides users with flexible offspring generation strategies [44]. Hence, hybridization of DE will remain an active field of multidisciplinary research in the years to come.

Thus, we present a new algorithm, DEELS, which utilizes an expensive local search for refining the solutions. The details of DEELS are presented in the following section.

4. A New Hybrid Algorithm: DEELS

In this section, we present our new proposed algorithm, DEELS, which is the combination of two methods with contrasting features. First, we will discuss the main features of the algorithm. Then, we will describe it explicitly.

4.1. Main Idea

Although JADE, thanks to its adaptive parameter control strategy, performs better than classic DE on many optimization problems, its performance deteriorates as the dimension increases. BFGS is a LS technique with a strong self-correcting ability [45] in searching for the optimal solution, but as a global optimizer it is not as good as JADE. The important question is how to reconcile these two different aspects to solve the minimization problem.

A very natural way is to hybridize these two techniques, JADE and BFGS, for solving unconstrained optimization problems. The issue is how to combine them in a way that is easy to understand and implement. Many hybrid approaches incorporate expensive methods only to refine the best solution. Here, in contrast, the new algorithm uses the robust but costly method not only to refine good solutions but also to locate them in the population during the search process.

DEELS begins with JADE and allows it to search for a prescribed number of generations. It then selects the best individuals from the population and applies the expensive LS, that is, BFGS, to them for the first time. The objective of applying this efficient search is to turn them into potential individuals that produce better offspring and lead the search in promising directions. The refined individuals are then introduced into the population, and the same number of worse solutions are removed from it.

The purpose of calling BFGS after a fixed interval of generations is to concentrate the population and add local search ability to the overall scheme, thus helping it avoid getting trapped in local optima. For these reasons, BFGS is invoked two more times during the evolution, at fixed intervals of generations. If the best function value falls below a threshold, the search is in the neighborhood of the value to reach, and the current best solution might lead to the desired optimum. Hence, it is desirable to apply the efficient LS for more than one iteration to this best solution. Thus, BFGS is run for several iterations when the best solution is in the vicinity of a local optimum. If the output solution of BFGS is the best known solution, the algorithm stops; otherwise, it continues until the allowed maximum number of function evaluations is reached.
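The interleaving schedule described above (run the global search, periodically refine the elites and replace the worst members with them) can be sketched as follows. For self-containment, this sketch substitutes plain DE/rand/1 for JADE and a numerical gradient-descent `refine` for BFGS, on an assumed sphere objective; the interval, elite count, and other settings are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):                                  # assumed test objective (sphere)
    return float(np.sum(x ** 2))

def refine(x, steps=20, h=1e-6, lr=0.2):
    """Stand-in for BFGS: gradient descent with a forward-difference gradient."""
    x = x.copy()
    for _ in range(steps):
        g = np.array([(f(x + h * e) - f(x)) / h for e in np.eye(len(x))])
        x -= lr * g
    return x

NP, D, F, CR, T_LS, N_ELITE, GENS = 20, 5, 0.5, 0.9, 10, 3, 40
pop = rng.uniform(-5, 5, (NP, D))
fit = np.array([f(x) for x in pop])

for gen in range(1, GENS + 1):
    for i in range(NP):                    # one plain DE generation (JADE stand-in)
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        v = pop[r1] + F * (pop[r2] - pop[r3])
        mask = rng.random(D) <= CR
        mask[rng.integers(D)] = True
        u = np.where(mask, v, pop[i])
        fu = f(u)
        if fu <= fit[i]:
            pop[i], fit[i] = u, fu
    if gen % T_LS == 0:                    # every T_LS generations: refine the elites,
        elite = np.argsort(fit)[:N_ELITE]  # then replace the worst members with them
        refined = np.array([refine(pop[e]) for e in elite])
        worst = np.argsort(fit)[-N_ELITE:]
        for w, x in zip(worst, refined):
            pop[w], fit[w] = x, f(x)

best = fit.min()
```

Keeping the population size fixed while swapping refined points for the worst members mirrors the design choice discussed below: diversity is preserved, yet the population is progressively concentrated around good basins.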

In [43], the population size is reduced dynamically, while in our hybrid algorithm we keep the population size fixed, since reducing it might cost population diversity, which is very important for DE. DEELS draws much inspiration from [46]. We apply an expensive LS in combination with an EA (DE) instead of their inexpensive LS. In [46], both combined methods are LSs, while DEELS combines BFGS with JADE to investigate the effect of combining an EA with a LS method. In [46], a restart is also incorporated, which is not necessary in DEELS.

4.2. Algorithmic Framework of DEELS

The details of DEELS are given in Algorithm 2. Here, we explain the different strategies used in DEELS.

Algorithm 2: Pseudocode of DEELS.
4.2.1. Global Search

JADE improves the population of solutions by updating it from generation to generation with the help of the genetic operators, mutation and crossover. These operators help the search by producing promising solutions. JADE possesses global search ability and thus contributes it to DEELS. Moreover, being a population based method, JADE can maintain the diversity of the population and thus decreases the chances of DEELS getting trapped in local optima.

4.2.2. LS

The BFGS method has very strong self-correcting properties (when the right line search is used). If, at some iteration, the Hessian matrix contains bad curvature information, it can correct these inaccuracies within only a few updates [45]. For this reason, BFGS generally performs very well, and once in the neighborhood of a minimizer it can attain superlinear convergence [45]. Though BFGS is efficient, it is a costly method: it computes the gradient at the given point, which in DEELS requires a number of function evaluations proportional to the problem dimension n per gradient. Further, it maintains an n x n approximation built from second-order partial derivative information [47], at a computational cost of O(n^2) per iteration [47]. BFGS needs O(n) function evaluations per iteration [45]. Thus, the overall overhead of BFGS is O(n^2) per iteration.

The BFGS method plays two roles in DEELS: first, it is employed to generate promising solutions in the population at specified intervals of the evolution; second, it improves the quality of the best solution found so far by JADE and BFGS together.

Next, we explain what we mean by the terms concentration and refinement. As said earlier, the aim is an easy-to-understand and easy-to-implement search process. To achieve this, we rely on the fact that the problem is to distinguish between ordinary points, of which we have a lot, good points (local optima), of which we have relatively few, and the best points (global optima), of which we may have only one or none.

Consider the diagram (see Figure 1) of the main process, which relies on LS to do a coarse clustering (i.e., to bring the majority of the good points in the population towards the basins of local optima) and a refinement step in which, hopefully, the local optimum will be identified. Clearly, this process will initially be rather ineffective because of the sheer randomness of the population of solutions, as shown in Figure 1(a); unless we are very lucky, good points are unlikely to appear in the first population. The important thing, however, is that the process becomes more and more effective as concentration takes hold of the population (see Figure 1(c)).

Figure 1: Concentration and refinement of solutions in a population.
4.2.3. Updating the Population

Adding promising solutions to the population of DEELS and removing the worst points from it can improve the quality of offspring in subsequent generations. Just as good parents can produce good offspring, bad parents have a good chance of producing bad solutions; hence, their removal can benefit the entire population. New potential solutions can also increase the convergence rate.

4.2.4. Stopping Condition

DEELS stops when one or both of the following conditions are met:
(1) The maximum number of function evaluations is reached.
(2) |f(x_best) - f(x*)| falls within the required accuracy, where x_best is the best individual found in a run and f(x*) is the known value to reach of the test instance.
The maximum number of function evaluations follows the benchmark protocols: one budget for the CEC2010 test instances with dimension 1000 and another for the 30-dimensional CEC2005 problems.

5. Comparison Studies

This section reports on two sets of experiments. In Experiment 1, DEELS is compared with jDE, while in Experiment 2, DEELS is compared with SDENS and DASA. For the comparison with SDENS and DASA, the best, median, mean, and standard deviation values are taken from [17]. All experiments are conducted in the MATLAB environment.

5.1. Experiment  1

In our preliminary work [14], DEELS was compared only with JADE, its internal optimization technique. Here, we compare DEELS with another state-of-the-art algorithm, jDE [15], a self-adaptive DE variant, on 30-dimensional problems.

5.1.1. Test Instances for Experiment  1

To study the performance of DEELS, we use the CEC2005 test suite (see Table 1). This suite was especially designed for single-objective unconstrained continuous optimization and was developed for low dimensions, for example, 30 and 50. That is why we selected these instances for our experimental study. More details about them can be found in [48]. The instances of CEC2005 can be divided into the following:
(i) Unimodal test instances.
(ii) Multimodal test instances:
(1) basic multimodal test instances,
(2) expanded multimodal test instances.
(iii) A hybrid composition test instance.
The 15th test instance is designed by combining ten different benchmark functions: two Rastrigin functions, two Weierstrass functions, two Griewank functions, two Ackley functions, and two Sphere functions. Its value to reach is 120.

Table 1: CEC2005 test instances.
5.2. Parameter Settings for Experiment  1

The population size is set to 75, because it should lie between the bounds suggested in [49]. The problem dimension is set to 30 for all the test instances in both jDE and DEELS. The other two parameters, mu_F and mu_CR, are initially set to 0.5, since this initial setting works well for all the test instances [11]. Later, the parameter values used in JADE are adopted. A small number of elite solutions undergo LS. The intensity of LS for concentration is set to 1 iteration, and the number of LS iterations for refining the solution is set to 3. The interval between the LS calls is 300 generations, which is equivalent to 300 x NP function evaluations.

5.3. Evaluation Metrics

Thirty independent runs were conducted for DEELS and jDE. The mean and standard deviation of the function error values are recorded over these runs. We also record the success rate (SR) [31] for each test instance. A run is considered successful if it achieves the desired accuracy within the maximum allowed function evaluations. The SR for a particular function is calculated as follows:

SR = (number of successful runs) / (total number of runs).

5.4. Comparison of DEELS with jDE

The experimental results for the function error values and SR of jDE and DEELS are presented in Table 2. The convergence graphs of both algorithms are obtained by plotting the number of function evaluations against the objective function values. DEELS outperforms jDE on 10 out of 15 test instances, while on the remaining 5 the performance of both algorithms is comparable. In the following, we comment on the behavior of DEELS in each category of test instances.

Table 2: Experimental results of jDE and DEELS on 15 test instances of 30 variables with 3 × 105 FES. Mean error and std. dev. of the function error values obtained in 30 independent runs.
5.4.1. Unimodal Test Instances

As can be observed from Table 2, DEELS performed well on three of the five unimodal test instances in terms of function error values. On the remaining two instances, both algorithms are comparable.

Considering SR, DEELS again performed well on two unimodal test instances, while jDE showed a higher SR on only one. Overall, on the unimodal test instances, DEELS is better than jDE, as can be observed in the last column of Table 2.

5.4.2. Multimodal Test Instances

In the case of multimodal test instances, DEELS performed very well on six test instances in achieving a good solution (see Table 2). The graphs presented in Figure 2 for these multimodal test instances also show that the yellow curve (jDE) stays above the green curve (DEELS); that is, the solutions obtained by DEELS are smaller than those obtained by jDE, confirming that DEELS outperforms jDE. Both algorithms showed equal performance on the remaining multimodal test instances, as indicated in Table 2.

Figure 2: Convergence graphs of jDE and DEELS for six representative test functions, with population size 75.

On one multimodal instance, both algorithms attained the accuracy level given in Table 2. For the remaining multimodal test instances, neither algorithm reached the desired accuracy in any run, except one instance on which DEELS obtained a nonzero SR against the zero SR of jDE. Thus, one can again conclude that DEELS is better than jDE on the multimodal test instances.

5.4.3. A Hybrid Composition Test Instance

This test instance, being a combination of other test functions, is challenging; it is not easy to find its global optimum or attain a high SR on it. DEELS succeeds in finding a better local optimum for it than jDE (see the convergence graphs of Figure 2). DEELS also obtained a higher SR for this test instance than jDE. This good performance may be due to the fact that DEELS benefits from both global search and LS, while jDE is only a global search method and so may be weaker at exploiting good solutions.

In general, it is interesting to note that jDE, though comparable to DEELS on five out of 15 test instances, could not attain a better function error value than DEELS on any test instance.

5.5. Experiment  2

In this section, we compare DEELS first with SDENS [50] and then with DASA [17] on the CEC2010 test instances with problem dimension 1000.

5.5.1. Test Instances for Experiment  2

We further investigate the behavior of DEELS on ten new and complex test instances with problem dimension 1000, used in the CEC2010 Special Session and Competition on Large-Scale Global Optimization [51]. The instances used in our experiments are the first ten of CEC2010, which can be divided into two categories:
(i) unimodal test instances;
(ii) multimodal test instances.

5.5.2. Parameter Setting for Experiment  2

The parameter settings are kept the same as required in the original paper [51] for the CEC2010 instances. For this experiment, a fixed population size is chosen, and the problem dimension is set to 1000. The maximum number of function evaluations and the value to reach are as specified in [51]. Twenty-five independent runs of DEELS were performed for all test instances.

5.6. Comparison with SDENS

The best, median, mean, and standard deviation of function error values obtained in 25 runs of DEELS are presented in Table 3.

Table 3: Experimental results of SDENS, DASA, and DEELS on 10 test instances of 1000 variables with the maximum allowed FES. Best, median, mean, and std. dev. of the function error values obtained over 25 runs.

As can be seen from Table 3, DEELS overall performed well compared with SDENS, reaching a better best solution on seven out of ten test instances. Surely, this better performance is due to the additional exploitation abilities of DEELS. On the remaining three test instances, SDENS dominated the best solutions of DEELS; two of these are separable functions, while one is a single-group nonseparable multimodal function. On one of them, the mean value obtained by DEELS in Table 3 is substantially larger than that of SDENS. Therefore, it may be reasonable to think that this failure of DEELS is due to BFGS, which may get trapped at a local optimum.

It is also interesting to note from Table 3 that DEELS found consistently better median and mean error values than SDENS on seven out of ten test instances. On the remaining three, SDENS maintained its dominance over DEELS. This poor performance of DEELS might be due to one of the abovementioned reasons.

In terms of standard deviation values, the two algorithms, DEELS and SDENS, split evenly, as illustrated in Table 3: on five test instances DEELS performed better, while on the other five SDENS outperformed DEELS.

Overall, DEELS performed better than SDENS on the best, median, and mean values, while in terms of standard deviation values the performance of both algorithms is comparable.

5.7. Comparison with DASA

Table 3 also presents the best, median, mean, and standard deviation values for DASA, obtained from [17]. This table shows that DEELS is superior to DASA on four test instances in terms of best function error values; these are mainly multi-group nonseparable and, with one exception, multimodal. On the remaining six test instances, DEELS performed worse than DASA in achieving the best function error values; note that three of these six functions are separable. Table 3 further shows that DEELS outperforms DASA in the median and mean function error values on five test instances, while it is inferior to DASA on the other five.

The worst showing of DEELS against DASA is in the standard deviation values, where DEELS is superior on only three test instances, while DASA surpasses DEELS on seven.

Thus, one can conclude that, based on the median and mean function error values, DASA and DEELS have similar performance on the nonseparable test functions, while in the standard deviation and minimum objective values DASA is better than DEELS. The latter failed badly on the separable test functions and also remained poor on two nonseparable test functions.

6. Conclusion

In this paper, we described DEELS, a new hybrid algorithm that combines two well known algorithms, JADE and BFGS, to keep a balance between exploration and exploitation. DEELS showed efficient performance against jDE and SDENS, though not DASA, on the majority of the test instances. Based on the experimental results, it can be concluded that a LS method can improve the local tuning of solutions, provided that it is hybridized with the global optimizer at a proper interval; otherwise, it can cause early termination of the algorithm and result in premature convergence. It is also observed that DEELS fails on separable functions.

Competing Interests

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

  1. R. Mallipeddi, P. N. Suganthan, Q. K. Pan, and M. F. Tasgetiren, “Differential evolution algorithm with ensemble of parameters and mutation strategies,” Applied Soft Computing Journal, vol. 11, no. 2, pp. 1679–1696, 2011. View at Publisher · View at Google Scholar · View at Scopus
  2. S. Y. Yuen and C. K. Chow, “A genetic algorithm that adaptively mutates and never revisits,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 454–472, 2009. View at Publisher · View at Google Scholar · View at Scopus
  3. R. C. Eberhart and J. Kennedy, “A new optimizer using particle swarm theory,” in Proceedings of the 6th International Symposium on Micromachine and Human Science1995., pp. 39–43, Nagoya, Japan, 1995.
  4. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, Perth, Australia, December 1995. View at Publisher · View at Google Scholar · View at Scopus
  5. A. E. Eiben and J. E. Smith, Introduction to Evolutionary Computing, Natural Computing Series, Springer, Berlin, Germany, 2003. View at Publisher · View at Google Scholar · View at MathSciNet
  6. R. Storn, Differential Evolution (DE) Research-Trends and Open Questions, vol. SCI 143, Springer, Berlin, Germany, 2008.
  7. R. Storn and K. V. Price, “Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  8. N. Noman and H. Iba, “Accelerating differential evolution using an adaptive local search,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 107–125, 2008.
  9. Z.-F. Hao, G.-H. Guo, and H. Huang, “A particle swarm optimization algorithm with differential evolution,” in Proceedings of the International Conference on Machine Learning and Cybernetics, pp. 1031–1035, IEEE, Hong Kong, August 2007.
  10. J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, “Comprehensive learning particle swarm optimizer for global optimization of multimodal functions,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
  11. J. Zhang and A. C. Sanderson, “JADE: adaptive differential evolution with optional external archive,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.
  12. C. Zhang and L. Gao, “An effective improvement of JADE for real-parameter optimization,” in Proceedings of the 6th International Conference on Advanced Computational Intelligence (ICACI '13), pp. 58–63, Hangzhou, China, October 2013.
  13. R. Fletcher, Practical Methods of Optimization, John Wiley & Sons, New York, NY, USA, 2nd edition, 1987.
  14. R. A. Khanum and M. A. Jan, “Hybridization of adaptive differential evolution with BFGS,” in Research and Development in Intelligent Systems XXIX: Incorporating Applications and Innovations in Intelligent Systems XX, Proceedings of AI-2012, the Thirty-Second SGAI International Conference on Innovative Techniques and Applications of Artificial Intelligence, pp. 441–446, Springer, Berlin, Germany, 2012.
  15. J. Brest, S. Greiner, B. Bošković, M. Mernik, and V. Zumer, “Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
  16. H. Wang, Z. Wu, S. Rahnamayan, and D. Jiang, “Sequential DE enhanced by neighborhood search for large scale global optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation, pp. 1–7, IEEE, Barcelona, Spain, July 2010.
  17. P. Korosec, K. Tashkova, and J. Silc, “The differential ant-stigmergy algorithm for large scale global optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '10), pp. 1–8, IEEE, Barcelona, Spain, July 2010.
  18. R. Storn and K. Price, “Home page of differential evolution,” Tech. Rep., 2003, http://www1.icsi.berkeley.edu/~storn/code.html.
  19. S. Das and P. N. Suganthan, “Tutorial: differential evolution, foundations, perspectives and applications,” in Proceedings of the IEEE Symposium Series on Computational Intelligence (SSCI '11), pp. 1–59, Paris, France, April 2011.
  20. P. N. Suganthan and S. Das, “Tutorial: differential evolution,” in Proceedings of the IEEE Symposium Series on Computational Intelligence (SSCI '11), pp. 1–76, Paris, France, April 2011.
  21. F. Neri and V. Tirronen, “Recent advances in differential evolution: a survey and experimental analysis,” Artificial Intelligence Review, vol. 33, no. 1-2, pp. 61–106, 2010.
  22. S. Das and P. N. Suganthan, “Differential evolution: a survey of the state-of-the-art,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.
  23. P. Venkataraman, Applied Optimization with Matlab Programming, John Wiley & Sons, New York, NY, USA, 2002.
  24. V. V. D. Melo and A. C. Botazzo Delbem, “Investigating Smart Sampling as a population initialization method for differential evolution in continuous problems,” Information Sciences, vol. 193, pp. 36–53, 2012.
  25. D. Zaharie, “A comparative analysis of crossover variants in differential evolution,” in Proceedings of the International Multiconference on Computer Science and Information Technology, Wisła, Poland, October 2007.
  26. Y. Wang, Z. Cai, and Q. Zhang, “Differential evolution with composite trial vector generation strategies and control parameters,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 55–66, 2011.
  27. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, “Opposition-based differential evolution,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.
  28. R. A. Khanum and M. A. Jan, “Centroid-based initialized JADE for global optimization,” in Proceedings of the 3rd Computer Science and Electronic Engineering Conference (CEEC '11), pp. 115–120, IEEE, Colchester, UK, July 2011.
  29. J. Brest, A. Zamuda, B. Bošković, S. Greiner, and V. Žumer, “An analysis of the control parameters' adaptation in differential evolution,” in Advances in Differential Evolution, vol. SCI 143, Springer, Berlin, Germany, 2008.
  30. Z. Yang, K. Tang, and X. Yao, “Self-adaptive differential evolution with neighborhood search,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1110–1116, IEEE Press, June 2008.
  31. A. K. Qin, V. L. Huang, and P. N. Suganthan, “Differential evolution algorithm with strategy adaptation for global numerical optimization,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
  32. R. Tanabe and A. Fukunaga, “Success-history based parameter adaptation for differential evolution,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '13), pp. 71–78, Cancún, Mexico, June 2013.
  33. J. Aalto and J. Lampinen, “A mutation and crossover adaptation mechanism for differential evolution algorithm,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 451–458, Beijing, China, July 2014.
  34. Z. Yang, K. Tang, and X. Yao, “Large scale evolutionary optimization using cooperative coevolution,” Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.
  35. C.-S. Deng, B.-Y. Zhao, A.-Y. Deng, and C.-Y. Liang, “Hybrid-coding binary differential evolution algorithm with application to 0-1 knapsack problems,” in Proceedings of the International Conference on Computer Science and Software Engineering (CSSE '08), pp. 317–320, Wuhan, China, December 2008.
  36. C. S. Deng, B. Y. Zhao, and C. Y. Liang, “Hybrid binary differential evolution algorithm for 0-1 knapsack problem,” Computer Engineering and Design, vol. 31, no. 8, pp. 1795–1798, 2010.
  37. C. Segura, C. A. C. Coello, E. Segredo, and C. León, “An analysis of the automatic adaptation of the crossover rate in differential evolution,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 459–466, IEEE, July 2014.
  38. A. K. Qin, K. Tang, H. Pan, and S. Xia, “Self-adaptive differential evolution with local search chains for real-parameter single-objective optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 467–474, IEEE, Beijing, China, July 2014.
  39. F. Wei, Y. Wang, and T. Zong, “Variable grouping based differential evolution using an auxiliary function for large scale global optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '14), pp. 1293–1298, IEEE, Beijing, China, July 2014.
  40. R. A. Khanum, N. Tairan, M. A. Jan, W. K. Mashwani, and A. Salhi, “Reflected adaptive differential evolution with two external archives for large-scale global optimization,” International Journal of Advanced Computer Science and Applications, vol. 7, no. 2, pp. 675–683, 2016.
  41. P. Kaelo and M. M. Ali, “A numerical study of some modified differential evolution algorithms,” European Journal of Operational Research, vol. 169, no. 3, pp. 1176–1184, 2006.
  42. M. Ali, M. Pant, and A. Abraham, “Simplex differential evolution,” Acta Polytechnica Hungarica, vol. 6, no. 5, pp. 95–115, 2009.
  43. J. Brest, A. Zamuda, B. Bošković, S. Greiner, M. S. Maučec, and V. Žumer, “Self-adaptive differential evolution with SQP local search,” in Proceedings of the 3rd International Conference on Bioinspired Optimization Methods and their Applications (BIOMA '08), pp. 59–69, Ljubljana, Slovenia, October 2008.
  44. S. Das, S. S. Mullick, and P. Suganthan, “Recent advances in differential evolution—an updated survey,” Swarm and Evolutionary Computation, vol. 27, pp. 1–30, 2016.
  45. A. Skajaa, Limited memory BFGS for nonsmooth optimization [M.S. thesis], 2010.
  46. F. J. Hickernell and Y. Yuan, “A simple multistart algorithm for global optimization,” OR Transactions, vol. 1, no. 2, pp. 1–12, 1997.
  47. R. B. Schnabel, “Concurrent function evaluations in local and global optimization,” Computer Methods in Applied Mechanics and Engineering, vol. 64, no. 1–3, pp. 537–552, 1987.
  48. A. K. Qin and P. N. Suganthan, “Self-adaptive differential evolution algorithm for numerical optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), vol. 2, pp. 1785–1791, September 2005.
  49. J. Ronkkonen, S. Kukkonen, and K. V. Price, “Real-parameter optimization with differential evolution,” in Proceedings of the IEEE Congress on Evolutionary Computation, pp. 506–513, Edinburgh, Scotland, September 2005.
  50. H. Wang, Z. Wu, S. Rahnamayan, and D. Jiang, “Sequential DE enhanced by neighborhood search for large scale global optimization,” in Proceedings of the 6th IEEE World Congress on Computational Intelligence (WCCI '10), pp. 1–7, Barcelona, Spain, July 2010.
  51. K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, “Benchmark functions for the CEC2010 special session and competition on large scale global optimization,” Tech. Rep., Nature Inspired Computation and Application Laboratory (NICAL), University of Science and Technology of China, 2010.