Journal of Optimization

Volume 2016, Article ID 3260940, 14 pages

http://dx.doi.org/10.1155/2016/3260940

## Hybridization of Adaptive Differential Evolution with an Expensive Local Search Method

^{1}Department of Mathematics, Jinnah College for Women, University of Peshawar, Khyber Pakhtunkhwa 25000, Pakistan

^{2}Department of Mathematics, Kohat University of Science & Technology (KUST), Kohat, Khyber Pakhtunkhwa 26000, Pakistan

^{3}College of Computer Science, King Khalid University, Abha 61321, Saudi Arabia

Received 27 December 2015; Revised 9 June 2016; Accepted 14 June 2016

Academic Editor: Manlio Gaudioso

Copyright © 2016 Rashida Adeeb Khanum et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Differential evolution (DE) is an effective and efficient heuristic for global optimization problems. However, it faces difficulty in exploiting the local region around the approximate solution. To handle this issue, local search (LS) techniques can be hybridized with DE to improve its local search capability. In this work, we hybridize an updated version of DE, adaptive differential evolution with optional external archive (JADE), with an expensive LS method, Broyden-Fletcher-Goldfarb-Shanno (BFGS), for solving continuous unconstrained global optimization problems. The new hybrid algorithm is denoted by DEELS. To validate the performance of DEELS, we carried out extensive experiments on the well-known CEC2005 and CEC2010 test suites. The experimental results, in terms of function error values, success rate, and some other statistics, are compared with those of several state-of-the-art algorithms: self-adaptive control parameters in differential evolution (jDE), sequential DE enhanced by neighborhood search for large-scale global optimization (SDENS), and the differential ant-stigmergy algorithm (DASA). These comparisons reveal that DEELS outperforms jDE and SDENS, but not DASA, on the majority of test instances.

#### 1. Introduction

Optimization is concerned with finding the best solution to an objective function. In general, an unconstrained optimization problem can be stated as follows: find the global optimum of an objective function $f(x)$, where $x = (x_1, x_2, \dots, x_D) \in \mathbb{R}^D$ and $D$ is the dimension of the problem.

Evolutionary algorithms (EAs) are inspired by the Darwinian theory of evolution [1]. They are very efficient at finding the global optimum of many real-world problems, including problems from mathematics, engineering, economics, business, and medicine. The EA family consists of a variety of stochastic algorithms, like Genetic Algorithms (GAs) [2], Particle Swarm Optimization (PSO) [3, 4], Evolution Strategies (ES) [5], and the differential evolution algorithm (DE) [6, 7].

Among EAs, DE is the most recent algorithm and is efficient in solving many optimization problems. DE has many advantages. For example, it is simple to understand and implement, has few control parameters, and is robust [8]. There is no doubt that DE is a remarkable optimizer for many optimization problems, but it has a few limitations, like stagnation, premature convergence, and loss of population diversity [9, 10]. Being a global optimizer, DE struggles to search the neighborhood of the approximate solution to the given problem. This makes room for hybridizing DE with other techniques to improve its poor exploitation (exploring the neighborhood of approximate solutions). On the other hand, the role of LS methods is to stabilize the search, especially in the vicinity of a local optimum. Thus, they can be combined with global search algorithms to enhance their local searching.

The main aim of this paper is to experiment with and validate the performance of our newly proposed hybrid algorithm, DEELS, which combines JADE [11, 12] and BFGS [13]. In particular, we want to see whether this hybridization improves the performance of JADE further. In contrast to our published preliminary work [14], this paper presents DEELS in full depth. It also comments on the performance of DEELS on large-scale global optimization problems with dimension 1000. Moreover, in contrast to our previously published comparison with JADE only [14], this time DEELS is compared with jDE [15], SDENS [16], and DASA [17] on problems from the CEC2005 and CEC2010 test suites to further explore the capabilities of DEELS for handling small- and large-dimension problems.

The rest of this paper is organized as follows. Section 2 describes the basic DE, JADE, and BFGS algorithms. Section 3 presents a literature review. Section 4 presents the proposed algorithm. Section 5 gives the experimental results, and finally Section 6 concludes the paper and discusses future research directions.

#### 2. Some Relevant Existing Methods

As mentioned earlier, DEELS depends upon JADE and BFGS. Thus, this section presents the basic operators of DE, JADE, and BFGS.

##### 2.1. Basic DE

Differential evolution (DE) [6, 7] is a recently developed bioinspired scheme for finding the global optimum of an optimization problem. This section briefly reviews the DE algorithm. More details about it can be found in [18–22]. The working of DE can be described as follows.

###### 2.1.1. Parent Selection

For each member $x_{i,G}$, $i = 1, 2, \dots, NP$, of the current generation $G$, three other members, $x_{r_1,G}$, $x_{r_2,G}$, and $x_{r_3,G}$, are randomly selected, where $r_1$, $r_2$, and $r_3$ are randomly chosen indices from $\{1, 2, \dots, NP\}$ such that $r_1 \neq r_2$, $r_2 \neq r_3$, $r_1 \neq r_3$, and $r_1, r_2, r_3 \neq i$. Thus, for each individual $x_{i,G}$, a mating pool of four individuals is formed in which an individual breeds against three individuals and produces an offspring.

###### 2.1.2. Reproduction

To generate an offspring, DE incorporates two genetic operators, mutation and crossover. They are detailed as follows:

(1) *Mutation*. After selection, mutation is applied to produce a mutant vector $v_{i,G}$ by adding a scaled difference of the two already chosen vectors to the third chosen vector; that is,

$$v_{i,G} = x_{r_1,G} + F \cdot (x_{r_2,G} - x_{r_3,G}), \tag{1}$$

where $F > 0$ is the scaling factor.

(2) *Crossover*. After mutation, the parameters of the parent vector $x_{i,G}$ and the mutant vector $v_{i,G}$ are mixed by a crossover operator and a trial member $u_{i,G} = (u_{1,i,G}, \dots, u_{D,i,G})$ is generated as follows:

$$u_{j,i,G} = \begin{cases} v_{j,i,G} & \text{if } \mathrm{rand}_j(0,1) \leq CR \text{ or } j = j_{\mathrm{rand}}, \\ x_{j,i,G} & \text{otherwise}, \end{cases} \tag{2}$$

where $j = 1, 2, \dots, D$, $CR \in [0,1]$ is the crossover probability, and $j_{\mathrm{rand}}$ is a randomly chosen index that guarantees that the trial vector inherits at least one component from the mutant vector.

###### 2.1.3. Survival Selection

At the end, the trial vector $u_{i,G}$ generated in (2) is compared with its parent vector $x_{i,G}$ on the basis of its objective function value. The better of the two gets a chance to become a member of the new generation; that is,

$$x_{i,G+1} = \begin{cases} u_{i,G} & \text{if } f(u_{i,G}) \leq f(x_{i,G}), \\ x_{i,G} & \text{otherwise}. \end{cases} \tag{3}$$
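A full DE generation, combining parent selection, DE/rand/1 mutation, binomial crossover, and greedy survival selection, can be sketched as follows. This is an illustrative sketch assuming minimization; the function and parameter names are ours, not taken from the paper:

```python
import numpy as np

def de_generation(pop, fitness, f_obj, F=0.5, CR=0.9, rng=None):
    """One generation of basic DE (DE/rand/1/bin) for minimizing f_obj."""
    rng = rng or np.random.default_rng()
    NP, D = pop.shape
    new_pop = pop.copy()
    new_fit = fitness.copy()
    for i in range(NP):
        # parent selection: three distinct indices, all different from i
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i],
                                size=3, replace=False)
        # mutation: v = x_r1 + F * (x_r2 - x_r3)
        v = pop[r1] + F * (pop[r2] - pop[r3])
        # binomial crossover with a guaranteed mutant coordinate j_rand
        j_rand = rng.integers(D)
        mask = rng.random(D) <= CR
        mask[j_rand] = True
        u = np.where(mask, v, pop[i])
        # survival selection: greedy replacement
        fu = f_obj(u)
        if fu <= fitness[i]:
            new_pop[i], new_fit[i] = u, fu
    return new_pop, new_fit
```

Because selection is greedy, the best fitness in the population can never get worse from one generation to the next.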

##### 2.2. JADE

JADE [11] is an adaptive version of DE which modifies it in three aspects.

###### 2.2.1. DE/current-to-pbest Strategy

JADE utilizes two mutation strategies: one without the external archive and the other with it. These strategies can be expressed as follows [11]:

$$v_{i,G} = x_{i,G} + F_i \cdot (x_{\mathrm{pbest},G} - x_{i,G}) + F_i \cdot (x_{r_1,G} - x_{r_2,G}), \tag{4}$$

$$v_{i,G} = x_{i,G} + F_i \cdot (x_{\mathrm{pbest},G} - x_{i,G}) + F_i \cdot (x_{r_1,G} - \tilde{x}_{r_2,G}), \tag{5}$$

where $x_{\mathrm{pbest},G}$ is a vector chosen randomly from the top $100p\%$ individuals and $x_{i,G}$, $x_{r_1,G}$, and $x_{r_2,G}$ are chosen from the current population $P$, while $\tilde{x}_{r_2,G}$ is chosen randomly from $P \cup A$, where $A$ denotes the archive of JADE and $p$ is a constant chosen from $(0, 1]$. In DEELS, we will utilize the strategy given in (4).
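The current-to-pbest mutation can be sketched as below. The helper covers both variants, drawing the second difference vector from the population union the archive; the names and the edge-case handling are our own illustrative choices, not the authors' code:

```python
import numpy as np

def current_to_pbest_mutation(pop, fitness, archive, i, F, p=0.05, rng=None):
    """JADE-style mutant: v = x_i + F*(x_pbest - x_i) + F*(x_r1 - x~_r2),
    with x~_r2 drawn from population ∪ archive (sketch, minimization)."""
    rng = rng or np.random.default_rng()
    NP = len(pop)
    # x_pbest: one of the top 100p% individuals by fitness
    n_top = max(1, int(np.ceil(p * NP)))
    top = np.argsort(fitness)[:n_top]
    pbest = pop[rng.choice(top)]
    # r1 from the population, r2 from population ∪ archive, distinct from i and r1
    r1 = rng.choice([j for j in range(NP) if j != i])
    union = np.vstack([pop] + ([archive] if len(archive) else []))
    r2 = rng.choice([j for j in range(len(union)) if j != i and j != r1])
    return pop[i] + F * (pbest - pop[i]) + F * (pop[r1] - union[r2])
```

With an empty archive this reduces to the archive-free strategy, since the union is then just the current population.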

###### 2.2.2. Control Parameters Adaptation

For each individual $x_{i,G}$, the control parameter $F_i$ and the crossover probability $CR_i$ are generated independently from Cauchy and normal distributions, respectively, as follows [11]:

$$F_i = \mathrm{randc}(\mu_F, 0.1), \qquad CR_i = \mathrm{randn}(\mu_{CR}, 0.1). \tag{6}$$

These are then truncated to $(0, 1]$ and $[0, 1]$, respectively. Initially, both $\mu_F$ and $\mu_{CR}$ are set to 0.5. They are then updated at the end of each generation as follows:

$$\mu_F = (1 - c) \cdot \mu_F + c \cdot \mathrm{mean}_L(S_F), \qquad \mu_{CR} = (1 - c) \cdot \mu_{CR} + c \cdot \mathrm{mean}_A(S_{CR}), \tag{7}$$

where $\mathrm{mean}_L$ denotes the Lehmer mean, $\mathrm{mean}_A$ denotes the arithmetic mean, $c$ is a positive constant, and $S_F$ is the set of successful $F_i$'s while $S_{CR}$ is the set of successful $CR_i$'s at generation $G$.
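The sampling and end-of-generation updates can be sketched as follows. The function name and interface are illustrative; the Lehmer mean is $\sum F^2 / \sum F$ over the successful $F_i$'s:

```python
import numpy as np

def adapt_parameters(mu_F, mu_CR, S_F, S_CR, c=0.1, rng=None):
    """JADE-style sampling of (F_i, CR_i) plus the mean updates (sketch)."""
    rng = rng or np.random.default_rng()
    # F_i ~ Cauchy(mu_F, 0.1): regenerate if non-positive, truncate to 1
    F_i = mu_F + 0.1 * rng.standard_cauchy()
    while F_i <= 0:
        F_i = mu_F + 0.1 * rng.standard_cauchy()
    F_i = min(F_i, 1.0)
    # CR_i ~ N(mu_CR, 0.1), truncated to [0, 1]
    CR_i = float(np.clip(rng.normal(mu_CR, 0.1), 0.0, 1.0))
    # end-of-generation updates: Lehmer mean for F, arithmetic mean for CR
    if len(S_F):
        S_F = np.asarray(S_F, dtype=float)
        mu_F = (1 - c) * mu_F + c * (np.sum(S_F ** 2) / np.sum(S_F))
    if len(S_CR):
        mu_CR = (1 - c) * mu_CR + c * float(np.mean(S_CR))
    return F_i, CR_i, mu_F, mu_CR
```

The Lehmer mean weights larger successful $F$ values more heavily than the arithmetic mean would, which counteracts the downward bias of the Cauchy truncation.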

###### 2.2.3. Optional External Archive

At each generation, the failed parents are sent to the archive $A$. If the archive size exceeds a prescribed limit, some solutions are randomly deleted from it to keep its size within that limit. The inferior solutions stored in the archive play a role in JADE's mutation strategy with archive. The archive not only provides information about promising directions but also improves population diversity.
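The archive maintenance described above can be sketched in a few lines (illustrative names; the paper does not give this code):

```python
import numpy as np

def update_archive(archive, failed_parents, max_size, rng=None):
    """Append failed parents to the archive, then randomly prune it
    back to max_size (sketch of JADE's archive maintenance)."""
    rng = rng or np.random.default_rng()
    archive = list(archive) + list(failed_parents)
    while len(archive) > max_size:
        archive.pop(rng.integers(len(archive)))  # delete a random entry
    return archive
```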

##### 2.3. BFGS

The BFGS method, a quasi-Newton algorithm, employs the gradient and an approximation of the Hessian to compute a suitable search direction. BFGS is considered a good LS method due to its efficiency. The detailed algorithm of BFGS is presented in Algorithm 1.
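As a rough, self-contained illustration of the method (a textbook BFGS iteration with a backtracking Armijo line search, not the exact routine referred to as Algorithm 1):

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=100):
    """Minimal BFGS: maintains an inverse-Hessian approximation H and
    takes quasi-Newton steps d = -H g with backtracking line search."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        # backtracking line search (Armijo sufficient-decrease condition)
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x, f(x)
```

On a convex quadratic this converges in a handful of iterations, which is why BFGS is attractive for refining the approximate solutions a global optimizer produces.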