Abstract

In the differential evolution (DE) algorithm, depending on the characteristics of the problem at hand and the available computational resources, different strategies combined with different sets of parameters may be effective. In addition, a single, well-tuned combination of strategies and parameters may not guarantee optimal performance, because different strategies combined with different parameter settings can be appropriate during different stages of the evolution. Therefore, various adaptive/self-adaptive techniques have been proposed to adapt the DE strategies and parameters during the course of the evolution. In this paper, we propose a new parameter adaptation technique for DE based on an ensemble approach and the harmony search (HS) algorithm. In the proposed method, an ensemble of parameters is randomly sampled to form the initial harmony memory. The parameter ensemble is evolved by the HS algorithm during the course of the optimization process. Each parameter combination in the harmony memory is evaluated by testing it on the DE population. The performance of the proposed adaptation method is evaluated using two recently proposed strategies (DE/current-to-pbest/bin and DE/current-to-gr_best/bin) as basic DE frameworks. Numerical results demonstrate the effectiveness of the proposed adaptation technique compared to state-of-the-art DE-based algorithms on a set of challenging test problems (CEC 2005).

1. Introduction

During the last decade, evolutionary algorithms (EAs) inspired by the Darwinian theory of evolution have become increasingly popular because of their ability to handle nonlinear and complex optimization problems. Unlike conventional numerical optimization methods, EAs are population-based metaheuristic algorithms that require only the objective function values, while properties such as differentiability and continuity are not necessary. However, the performance of EAs depends on the encoding schemes, the evolutionary operators, and parameter settings such as the population size, mutation scale factor, and crossover rate. In addition, appropriate parameter selection is problem dependent and requires a time-consuming trial-and-error tuning process. Trial-and-error based parameter selection is ineffective if the optimization is required in an automated environment or if the user has no experience in the fine art of control parameter tuning. To overcome this, different parameter adaptation schemes have been presented [1–6]. Among them, adaptive and self-adaptive techniques are popular due to their ability to adjust the parameters during the course of the evolution with minimal or no intervention from the user. In other words, in adaptive and self-adaptive techniques, the parameter adaptation is based on feedback from the search process. Self-adaptive techniques rely on the assumption that the most appropriate parameter values produce better offspring, which are more likely to survive and propagate these better parameter values [7]. Therefore, in self-adaptive methods, the parameters are directly encoded into the individuals and are evolved together with the encoded solutions.

Differential evolution (DE) [8] is a fast and simple technique that has been successfully applied in diverse fields [9–12]. Like most EAs, the performance of DE [13] is sensitive to the population size (NP), the mutation and crossover strategies, and their associated control parameters such as the scale factor (F) and the crossover rate (CR). In other words, the best combination of strategies and their control parameter settings can be different for different optimization problems. In addition, for a given optimization problem, the best combination of strategies and parameter values differs based on the available computational resources and the required accuracy. Therefore, to successfully solve a specific optimization problem, it is generally necessary to perform a time-consuming trial-and-error search for the most appropriate strategies and their associated parameter values. However, during the evolution process the DE population traverses different regions of the search space, within which different strategies [14] with different parameter settings may be more effective than a single, well-tuned combination of strategies and parameters. In the DE literature, different partial adaptation schemes have been proposed [7, 14–17] to overcome the time-consuming trial-and-error procedure.

In [18], the authors proposed an adaptive DE algorithm referred to as JADE. In JADE, the authors implemented a new mutation strategy, “DE/current-to-pbest/1”, and the control parameters (F and CR) are self-adapted. “DE/current-to-pbest/1” is a generalized version of “DE/current-to-best/1”. JADE uses the conventional binomial crossover strategy. In JADE, the self-adaptation of the control parameters avoids the requirement of prior knowledge about parameter settings and works well without user interaction. Motivated by JADE, the authors in [19] proposed another adaptive DE algorithm referred to as MDE-pBX (modified DE with p-best crossover). MDE-pBX uses a new mutation strategy, “DE/current-to-gr_best/1”, which is a modified version of “DE/current-to-best/1”. Unlike JADE, MDE-pBX uses a more exploitative “p-best binomial crossover” strategy. In [20], the authors proposed an ensemble approach for parameter adaptation of DE, where each parameter has a pool of values competing to produce future offspring based on their success in past generations.

Harmony search (HS) is also a population-based metaheuristic optimization algorithm, one which mimics the music improvisation process. Recently, HS has been gaining significance as an efficient optimization algorithm and is used in a variety of applications. In HS, the generation of a new vector or solution is based on the consideration of all the existing vectors, rather than only two vectors as in DE (the parent and mutant vectors) [21]. This characteristic makes HS more explorative than the DE algorithm.

During the past decade, hybridization of EAs has gained significance due to the ability of hybrid algorithms to complement each other’s strengths and overcome the drawbacks of the individual algorithms. To enhance the exploitation ability of HS [22], memory consideration is generally employed, where new individuals are generated based on the historical search experience. In addition, HS employs a random selection approach to explore and sample new solutions from the search space. This random selection aids the exploration ability of HS but is not as efficient as the DE mutation strategy, resulting in slow convergence. Conversely, a new solution formed from a few randomly selected individuals may limit the exploration ability of DE when the population diversity is low [23]. In [23], the authors proposed a hybrid algorithm referred to as differential harmony search (DHS) by fusing the HS and DE mechanisms. The hybridized DHS algorithm could reasonably balance the exploration and exploitation abilities.

In this paper, we propose a DE parameter adaptation technique based on the HS algorithm. In the proposed adaptation method, a group of DE control parameter combinations is randomly initialized. These randomly initialized parameter combinations form the initial harmony memory (HM) of the HS algorithm. Each parameter combination present in the HM is evaluated by testing it on the DE population during the evolution. Based on the effectiveness of the DE parameter combinations present in the HM, the HS algorithm evolves the parameter combinations. At any point during the evolution of the DE population, the HM thus contains an ensemble of DE parameters suited to the current stage of the evolution process.

The rest of the paper is organized as follows. Section 2 provides a brief literature review of adaptive DE and HS algorithms. Section 3 presents the proposed algorithm, in which the DE parameters are adapted using an HS algorithm. Section 4 presents the experimental results, and Section 5 concludes with some future directions.

2. Literature Review

2.1. Classical Differential Evolution

Differential evolution (DE) is a simple real-parameter optimization algorithm that belongs to the class of evolutionary algorithms (EAs) and involves the continuous application of operators such as mutation, crossover, and selection. DE starts with NP D-dimensional parameter vectors, referred to as the population, where each individual is a candidate solution to the problem at hand, as shown in (1):

$X_{i,G} = \{x_{i,G}^{1}, x_{i,G}^{2}, \ldots, x_{i,G}^{D}\}, \quad i = 1, 2, \ldots, NP$  (1)

The initial population is uniformly sampled from the search space constrained by the prescribed minimum and maximum parameter bounds $X_{\min} = \{x_{\min}^{1}, \ldots, x_{\min}^{D}\}$ and $X_{\max} = \{x_{\max}^{1}, \ldots, x_{\max}^{D}\}$.

After initialization, corresponding to each target vector $X_{i,G}$ in the population at generation $G$, a mutant vector $V_{i,G}$ can be generated via a mutation strategy. The most commonly used mutation strategy in DE, “DE/rand/1” [24], is given by

$V_{i,G} = X_{r_1,G} + F \cdot (X_{r_2,G} - X_{r_3,G})$  (2)

The indices $r_1$, $r_2$, and $r_3$ are randomly generated, mutually exclusive integers in the range $[1, NP]$. The indices are randomly generated once for each mutant vector and are also different from the index $i$. The scale factor $F$ is a positive control parameter for scaling the difference vector.

After the mutation, a crossover operation is applied to each pair of the target vector $X_{i,G}$ and its corresponding mutant vector $V_{i,G}$ to generate a trial vector $U_{i,G} = \{u_{i,G}^{1}, \ldots, u_{i,G}^{D}\}$. In its basic version, DE employs the binomial (uniform) crossover defined as follows [25]:

$u_{i,G}^{j} = \begin{cases} v_{i,G}^{j}, & \text{if } rand_j[0,1] \le CR \text{ or } j = j_{rand} \\ x_{i,G}^{j}, & \text{otherwise} \end{cases}$  (3)

In (3), the crossover rate $CR$ is a user-specified constant within the range $[0, 1]$, which controls the fraction of parameter values copied from the mutant vector, and $j_{rand}$ is a randomly chosen integer in the range $[1, D]$. In DE, there exists another type of crossover operator called exponential crossover, which is functionally equivalent to the circular two-point crossover operator [14].

After crossover, the generated trial vectors are evaluated using the objective function, and a selection operation is performed as shown in (4):

$X_{i,G+1} = \begin{cases} U_{i,G}, & \text{if } f(U_{i,G}) \le f(X_{i,G}) \\ X_{i,G}, & \text{otherwise} \end{cases}$  (4)

In (4), $f(U_{i,G})$ and $f(X_{i,G})$ correspond to the objective values of the trial and target vectors, respectively.

As mentioned above, the mutation, crossover, and selection steps are repeated generation after generation until a termination criterion (e.g., reaching the maximum number of function evaluations) is satisfied. The algorithmic description of DE is summarized in Algorithm 1.

STEP 1: Randomly initialize a population of NP, D-dimensional
   parameter vectors. Set the generation number G = 0.
STEP 2: WHILE stopping criterion is not satisfied
   DO
   Mutation—Equation (2)
   Crossover—Equation (3)
   Selection—Equation (4)
    Increment the generation count G = G + 1
STEP 3: END WHILE
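
To make Algorithm 1 concrete, the following minimal Python sketch implements DE/rand/1/bin following (1)–(4); the function name, the default NP, F, and CR values, and the evaluation budget are illustrative choices rather than settings prescribed in this paper.

import numpy as np

def de_rand_1_bin(f, bounds, NP=50, F=0.5, CR=0.9, max_evals=10000):
    # Minimal classical DE (DE/rand/1/bin) following Algorithm 1
    D = len(bounds)
    low, high = np.array(bounds, dtype=float).T
    # STEP 1: uniform random initialization within the bounds, as in (1)
    X = low + np.random.rand(NP, D) * (high - low)
    fX = np.array([f(x) for x in X])
    evals = NP
    while evals < max_evals:  # STEP 2
        for i in range(NP):
            # Mutation (2): three mutually exclusive random indices, all != i
            r1, r2, r3 = np.random.choice(
                [r for r in range(NP) if r != i], 3, replace=False)
            V = np.clip(X[r1] + F * (X[r2] - X[r3]), low, high)
            # Binomial crossover (3): inherit at least one component from V
            j_rand = np.random.randint(D)
            mask = np.random.rand(D) <= CR
            mask[j_rand] = True
            U = np.where(mask, V, X[i])
            # Selection (4): greedy one-to-one replacement
            fU = f(U)
            evals += 1
            if fU <= fX[i]:
                X[i], fX[i] = U, fU
    return X[np.argmin(fX)], fX.min()

# Example usage on the 10-dimensional sphere function
best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x ** 2)),
                               [(-100, 100)] * 10)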

2.2. Parameter Adaptation in Differential Evolution

Although DE has attracted much attention recently as a global optimizer over continuous spaces [25], the performance of the conventional DE algorithm depends on the chosen mutation and crossover strategies and the associated control parameters. Depending on the complexity of the problem, the performance of DE becomes more sensitive to the strategies and the associated parameter values [26], and an inappropriate choice may lead to premature convergence, stagnation, or wastage of computational resources [16, 26–29]. In other words, due to the complex interaction of the control parameters with DE’s performance [7], choosing appropriate mutation and crossover strategies and control parameters requires some expertise. Since DE was proposed, various empirical guidelines have been suggested for choosing the population size (NP) [24–26, 29, 30], the mutation and crossover strategies [12, 24–26, 28, 31, 32], and their associated control parameter settings: the scale factor (F) [24–26, 29, 30, 33, 34] and the crossover rate (CR) [24–26, 28–30, 35, 36].

To some extent, these guidelines are useful for selecting the individual parameters of DE. However, the performance of DE is more sensitive to the combination of the mutation strategy and its associated parameters: for a given mutation strategy, a particular value of CR can make the performance sensitive to the choice of F, while other values of CR can make it robust [7]. Hence, manual parameter tuning of DE is not easy and requires good expertise. To overcome the burden of tuning the DE parameters by trial-and-error, various adaptive techniques have been proposed [14–16, 37–39]. The most popular adaptive DE variants are as follows [40].

(1) SaDE [14]: the trial vector generation strategies and the associated control parameter values are self-adapted based on their previous experience of generating promising solutions.

(2) jDE [7]: the control parameters F and CR are encoded into the individuals and are adjusted based on the parameters τ1 and τ2 (a Python sketch of this rule is given after this list). The initial F and CR values of each population individual are set to 0.5 and 0.9, respectively. Then, based on a random number (rand) uniformly generated in the range [0, 1], the values of F and CR are reinitialized if rand < τ1 and rand < τ2, respectively; F and CR are reinitialized to new values randomly generated in the ranges [0.1, 1.0] and [0, 1], respectively.

(3) JADE [18]: JADE employs a new mutation strategy, “DE/current-to-pbest”, and updates the control parameters in an adaptive manner. “DE/current-to-pbest” is a generalized version of “DE/current-to-best” that helps diversify the population and improves the convergence performance. The parameter adaptation is done automatically and needs no prior knowledge about the relationship between the parameter settings and the characteristics of the optimization problem. In JADE, the F and CR values corresponding to each population member are sampled based on the mean values of F and CR, and these means are updated using the individual F and CR values that succeeded in generating trial vectors better than their target vectors.

(4) EPSDE [20]: while solving a specific problem, different mutation strategies with different parameter settings may be better during different stages of the evolution than a single mutation strategy with unique parameter settings as in conventional DE. Motivated by these observations, an ensemble of mutation strategies and parameter values for DE (EPSDE) was proposed, in which a pool of mutation strategies, along with a pool of values for each associated parameter, competes to produce successful offspring. In EPSDE, the candidate pools of mutation strategies and parameters should be restrictive to avoid the unfavorable influence of less effective mutation strategies and parameters [14].

(5) MDE-pBX [19]: motivated by JADE, MDE-pBX employs a new mutation strategy, “DE/current-to-gr_best/1”, and the control parameters are self-adapted. According to the new mutation strategy, the algorithm uses the best individual of a group (whose size is q% of the population size) of randomly selected solutions from the current generation to perturb the parent (target) vector.
In addition, unlike JADE, MDE-pBX uses a modified binomial crossover operation referred to as “p-best crossover”. In this modified crossover operation, a biased parent selection scheme is incorporated by letting each mutant undergo the usual binomial crossover with one of the p top-ranked individuals from the current population, rather than with the target vector of the same index as used in other variants of DE.
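
As an illustration of the self-adaptation rule of jDE described in item (2) above, the per-individual update can be sketched in Python as follows; tau1 = tau2 = 0.1 is the setting reported in [7], and the function name is illustrative.

import numpy as np

def jde_update(F_i, CR_i, tau1=0.1, tau2=0.1):
    # jDE-style self-adaptation for one individual: with probability tau1,
    # F is reinitialized uniformly in [0.1, 1.0]; with probability tau2,
    # CR is reinitialized uniformly in [0, 1]; otherwise both are inherited.
    if np.random.rand() < tau1:
        F_i = 0.1 + 0.9 * np.random.rand()
    if np.random.rand() < tau2:
        CR_i = np.random.rand()
    return F_i, CR_i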

2.3. Harmony Search Algorithm

Unlike most EAs, which simulate natural selection and biological evolution, HS is a population-based metaheuristic optimization algorithm that mimics the music improvisation process, in which musicians improvise the pitches of their instruments in search of a perfect state of harmony. Some of the characteristics of HS that distinguish it from other metaheuristics such as DE are as follows [21]: (1) all the existing solution vectors are considered while generating a new vector, rather than only two vectors as in DE (the target and trial vectors); and (2) each decision variable in a solution vector is considered independently. An overview of the standard HS algorithm is presented in Algorithm 2.

STEP 1: Initialize the HM with HMS randomly generated solutions. Set generation count G = 0.
STEP 2: WHILE stopping criterion is not satisfied
   /*Generate a new solution*/
     FOR each decision variable DO
      IF rand1 < HMCR
       Pick the value from one of the solutions in HM    /*Memory consideration*/
       IF rand2 < PAR
        Perturb the value picked    /*Pitch adjustment*/
       END IF
      ELSE
       Pick a random value from the search space    /*Random consideration*/
      END IF
     END FOR    /*New solution generated*/
    IF new solution better than the worst solution in HM (in terms of fitness)
    Replace the worst solution in HM with new solution
    END IF
    Increment the generation count
STEP 3: END WHILE
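
A Python sketch of one improvisation and memory-update step of Algorithm 2 is given below, assuming minimization; the pitch-adjustment bandwidth bw and the function names are illustrative assumptions.

import numpy as np

def improvise(HM, low, high, HMCR, PAR, bw=0.01):
    # One improvisation step of Algorithm 2: build the new vector variable
    # by variable using memory consideration, pitch adjustment, and random
    # consideration.
    HMS, D = HM.shape
    new = np.empty(D)
    for j in range(D):
        if np.random.rand() < HMCR:
            # Memory consideration: inherit the value from a random harmony
            new[j] = HM[np.random.randint(HMS), j]
            if np.random.rand() < PAR:
                # Pitch adjustment: perturb within a small bandwidth
                new[j] += bw * (high[j] - low[j]) * np.random.uniform(-1, 1)
        else:
            # Random consideration: sample afresh from the search space
            new[j] = np.random.uniform(low[j], high[j])
    return np.clip(new, low, high)

def hm_update(HM, fHM, new, f_new):
    # Replace the worst harmony in HM if the new solution is better
    worst = np.argmax(fHM)
    if f_new < fHM[worst]:
        HM[worst], fHM[worst] = new, f_new
    return HM, fHM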

In HS, the improvisation operators (memory consideration, pitch adjustment, and random consideration) play a major role in achieving the desired balance between exploitation and exploration during the optimization process [41]. Essentially, both pitch adjustment and random consideration are the key components for achieving the desired diversification in HS. In random consideration, the components of the new vector are generated at random, with the same level of efficiency as in other algorithms that handle randomization; this allows the exploration of regions of the search space that may not have been visited before. In HS, pitch adjustment enhances diversification by tuning the components of a new vector within a given bandwidth, adding or subtracting a small random amount to an existing component stored in the HM. Furthermore, the pitch adjustment operator can also be considered a mechanism supporting the intensification of HS through the control of the PAR probability. Intensification in the HS algorithm is represented by the third operator, memory consideration. A high harmony memory considering rate means that good solutions from the history/memory are more likely to be selected or inherited, which is equivalent to a certain degree of elitism. Obviously, if the considering rate is too low, solutions will converge more slowly.

Recently, the HS algorithm has garnered a lot of attention from the research community and has been successfully applied to many optimization problems in engineering and computer science. Consequently, the interest in HS has led to improvements of its performance in line with the requirements of the problems being solved. The improvements proposed by different researchers can be categorized as follows [42]: (1) improvement of HS by appropriate parameter settings; and (2) improvement of HS by hybridizing it with other metaheuristic algorithms.

3. Harmony Search Based Parameter Ensemble Adaptation for DE (HSPEADE)

As highlighted in the previous section, depending on the nature of the problem (unimodal or multimodal) and the available computational resources, different optimization problems require different mutation and crossover strategies combined with different parameter values to obtain optimal performance. In addition, to solve a specific problem, different mutation and crossover strategies with different parameter settings may be better during different stages of the evolution than a single set of strategies with unique parameter settings as in conventional DE. Motivated by these observations, the authors in [20] proposed an ensemble approach (EPSDE) in which a pool of mutation and crossover strategies, along with a pool of values for each associated parameter, competes to produce successful offspring.

In EPSDE, each member of the DE population is randomly assigned mutation and crossover strategies and the associated parameter values taken from the respective pools. The population members (target vectors) produce offspring (trial vectors) using the assigned strategies and parameter values. If the generated trial vector is able to enter the next generation of the evolution, the combination of strategies and parameter values that produced it is stored. If the trial vector fails to enter the next generation, the strategies and parameter values associated with that target vector are reinitialized, with equal probability, either randomly from the respective pools or from the stored successful combinations.

To obtain optimal performance based on the ensemble approach, the candidate pools of strategies and parameters should be restrictive to avoid the unfavorable influence of less effective strategies and parameters [14]. In other words, the strategies and parameters present in the respective pools should have diverse characteristics, so that they can exhibit distinct performance during different stages of the evolution when dealing with a particular problem.
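
The per-individual bookkeeping of EPSDE described above can be sketched as follows; the pool contents are placeholders, since the exact pools of [20] are not reproduced here.

import random

F_POOL = [0.5, 0.7, 0.9]    # placeholder pools, not the pools of [20]
CR_POOL = [0.1, 0.5, 0.9]
successful = []             # (F, CR) pairs that produced surviving trial vectors

def epsde_reassign(trial_survived, current):
    # EPSDE rule: keep the assigned setting on success; on failure,
    # reinitialize either from the pools or from the stored successful
    # combinations, with equal probability.
    if trial_survived:
        successful.append(current)
        return current
    if successful and random.random() < 0.5:
        return random.choice(successful)
    return (random.choice(F_POOL), random.choice(CR_POOL))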

In EPSDE, since the strategy and parameter pools are restrictive and fixed, most of the entries in the pools may become obsolete during the course of the evolution of the DE population. Therefore, it would be apt if the strategy and parameter pools could evolve with the DE population. Based on this motivation, we propose an HS based parameter ensemble adaptation for DE (HSPEADE). The overall view of the proposed HSPEADE is presented in Algorithm 3.

STEP 1: Initialize a population of NP, D-dimensional individuals as the population of DE
STEP 2: Initialize HM with HMS randomly generated parameter combinations.
STEP 3: WHILE stopping criterion is not satisfied
   DO
   Generate a new vector based on HS
   FOR k = 1 : HMS + 1 /*Evaluate each parameter combination of the HM*/
    Mutation
    Crossover
    Selection
    Evaluate the objective value of each HM vector
    Increment the generation count  G = G + 1
   END FOR
   Update HM
STEP 4: END WHILE

As shown in Algorithm 3, after the initialization of the DE population, the HM of the HS algorithm is initialized with HMS randomly generated vectors. The members of the HM are the parameter combinations (F and CR values) corresponding to the mutation and crossover strategies used. Using the members of the HM, a new parameter combination vector is generated by the HS algorithm described in Algorithm 2. Each of the HMS + 1 parameter combinations is evaluated by testing it on the DE population during the evolution. After evaluating all the members of the HM and the newly generated parameter combination, the HM is updated as in the HS algorithm. The generation of new parameter combinations and the updating of the HM are performed throughout the evolution process of the DE algorithm.
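
A self-contained Python sketch of the HSPEADE loop of Algorithm 3 is shown below. For brevity it uses the DE/rand/1/bin operators in place of the pbest variants, encodes only (F, CR) in the HM, and scores a parameter combination by the number of successful trial vectors it produces in one DE generation; this success-count score, the linear HMCR/PAR schedules described at the end of this section, and all constants are illustrative assumptions rather than the authors' exact implementation.

import numpy as np

rng = np.random.default_rng(0)
f = lambda x: float(np.sum(x ** 2))         # example objective (sphere)
D, NP, HMS, G_MAX = 10, 50, 5, 200
LOW, HIGH = -100.0, 100.0

X = rng.uniform(LOW, HIGH, (NP, D))         # STEP 1: DE population
fX = np.array([f(x) for x in X])

# STEP 2: harmony memory, each row encoding one (F, CR) combination
p_low, p_high = np.array([0.2, 0.0]), np.array([1.2, 1.0])
HM = rng.uniform(p_low, p_high, (HMS, 2))

def de_generation(F, CR):
    # One DE generation with the given (F, CR); returns the number of
    # successful trial vectors as the (assumed) score of the combination
    successes = 0
    for i in range(NP):
        r1, r2, r3 = rng.choice([r for r in range(NP) if r != i], 3,
                                replace=False)
        V = np.clip(X[r1] + F * (X[r2] - X[r3]), LOW, HIGH)
        mask = rng.random(D) <= CR
        mask[rng.integers(D)] = True
        U = np.where(mask, V, X[i])
        fU = f(U)
        if fU <= fX[i]:
            X[i], fX[i] = U, fU
            successes += 1
    return successes

for G in range(G_MAX):                      # STEP 3
    HMCR, PAR = G / G_MAX, 1.0 - G / G_MAX  # linear schedules (Section 3)
    # Improvise a new parameter combination from the HM (Algorithm 2)
    new = np.empty(2)
    for j in range(2):
        if rng.random() < HMCR:
            new[j] = HM[rng.integers(HMS), j]
            if rng.random() < PAR:
                new[j] += 0.05 * (p_high[j] - p_low[j]) * rng.uniform(-1, 1)
        else:
            new[j] = rng.uniform(p_low[j], p_high[j])
    new = np.clip(new, p_low, p_high)
    # Evaluate all HMS + 1 combinations on the DE population
    scores = np.array([de_generation(F, CR)
                       for F, CR in np.vstack([HM, new])])
    # Update HM: the new combination replaces the worst-scoring harmony
    worst = np.argmin(scores[:HMS])
    if scores[HMS] > scores[worst]:
        HM[worst] = new

print(fX.min())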

To obtain optimal performance based on the ensemble approach, the parameter combinations in the HM should clearly be diverse during the initial generations of the DE population evolution and should converge to the optimal combination towards the end of the evolution. During the course of the experimentation, we observed that HS is well suited to evolving the parameter ensemble due to the following characteristics: (1) HS generates a single vector every generation and replaces the worst performing vector; (2) it can randomly generate new solution vectors, thus enabling diversity when needed; and (3) it considers all the solution vectors in the memory when generating a new solution.

In HSPEADE, to facilitate diversity in the parameter ensemble during the initial stages and to allow HS to converge to the optimal combination later, we modify the HS algorithm shown in Algorithm 2 so that the parameters HMCR and PAR are deterministically varied: HMCR is linearly increased from 0 to 1, while PAR is linearly decreased from 1 to 0, as the generation count increases.
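
These deterministic schedules can be expressed compactly as follows (the function name is illustrative):

def hmcr_par(G, G_max):
    # Linear HSPEADE schedules: HMCR grows from 0 to 1 (more memory
    # consideration later), while PAR shrinks from 1 to 0 (less
    # perturbation later), as the generation count G approaches G_max
    t = G / float(G_max)
    return t, 1.0 - t     # (HMCR, PAR)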

4. Experimental Setup and Results

In this section, we evaluate the performance of the proposed parameter adaptation technique for DE. The details regarding the test problems, experimental environment, and algorithms used for comparison are given below.

4.1. Problem Set

The performance of the proposed method is evaluated using a selected set of standard test functions from the special session on real-parameter optimization of the IEEE Congress on Evolutionary Computation (CEC 2005) [43]. In this work, we use the first 14 functions of CEC 2005, of which functions 1–5 are unimodal, functions 6–12 are multimodal, and functions 13-14 are hybrid composition functions. Details about the problems, such as parameter ranges, the location of the optimal solution, and the optimal objective values, can be found in [43]. In the present work, to evaluate the scalability of the algorithms, 30-, 50-, and 100-dimensional versions of the test problems are considered. The numbers of function evaluations used for the 30-, 50-, and 100-dimensional problems are 100000, 500000, and 1000000, respectively. On each of the test problems, every algorithm under consideration is run 30 times.

4.2. Setup for Algorithmic Comparison

The proposed HSPEADE, being a general idea, can be applied within any DE framework. In this work, the experiments are designed as follows.

(1) We consider a single crossover strategy, namely binomial crossover. We selected binomial crossover because the two recent adaptive DE algorithms (JADE [18] and MDE-pBX [19]) that show significant performance on the problem set considered employ binomial crossover. It should be noted that MDE-pBX uses a modified “p-best binomial crossover” operator; however, in this work we consider the classical binomial crossover only.

(2) We consider two mutation strategies, “DE/current-to-pbest” and “DE/current-to-gr_best”.

The algorithmic comparison is divided into two sets as follows. SET 1 uses the “DE/current-to-pbest” strategy, while SET 2 uses the “DE/current-to-gr_best” strategy. The EPSDE algorithm mentioned above is referred to as EPDE below because in the current work the strategies are fixed. The MDE-pBX algorithm is referred to as MDE below because in the present work we use the simple binomial crossover instead of the greedy “p-best crossover”.

SET 1:
JADE: “DE/current-to-pbest” strategy, binomial crossover; the control parameters F and CR are adapted [18].
EPDE1: “DE/current-to-pbest” strategy, binomial crossover; fixed pools of F, CR, and p values.
HSPEADE1: “DE/current-to-pbest” strategy, binomial crossover; F, CR, and p are encoded into the HS algorithm for adaptation. F ranges from 0.2 to 1.2, CR ranges from 0 to 1, and p ranges from 0.05 to 2.50.

SET 2:
MDE: “DE/current-to-gr_best” strategy, binomial crossover; the control parameters F and CR are adapted [19].
EPDE2: “DE/current-to-gr_best” strategy, binomial crossover; fixed pools of F, CR, and p values.
HSPEADE2: “DE/current-to-gr_best” strategy, binomial crossover; F, CR, and p are encoded into the HS algorithm for adaptation. F ranges from 0.2 to 1.2, CR ranges from 0 to 1, and p ranges from 0.05 to 2.50.

4.3. Statistical Tests

To compare the performance of different algorithms, we employ two types of statistical tests, namely, the t-test and the Wilcoxon rank sum test. The t-test, being a parametric method, can be used to compare the performance of two algorithms on a single problem. When the performances of two algorithms are to be compared over multiple problems, the t-test is not valid, as the normality assumption fails [44]. Therefore, to compare the performance of two algorithms over a set of different problems, we use a nonparametric test, the Wilcoxon rank sum test [44].
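
For instance, with SciPy both tests can be run as follows; the arrays below are placeholder data, not results from this paper.

import numpy as np
from scipy import stats

# Placeholder data: per-run errors of two algorithms on one problem,
# and per-problem mean errors across a small suite
runs_A = np.random.default_rng(1).normal(1.0, 0.1, 30)
runs_B = np.random.default_rng(2).normal(0.9, 0.1, 30)
mean_A = np.array([1.2e-3, 4.5e-1, 3.3e+2, 7.8e-2])
mean_B = np.array([9.1e-4, 5.0e-1, 1.1e+2, 6.4e-2])

# Single problem, 30 independent runs: two-sample t-test
t_stat, t_p = stats.ttest_ind(runs_A, runs_B)

# Multiple problems: nonparametric Wilcoxon rank sum test
w_stat, w_p = stats.ranksums(mean_A, mean_B)
print(t_p, w_p)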

4.4. Experimental Results

The experimental results (means and standard deviations) for the SET 1 algorithms JADE, EPDE1, and HSPEADE1 on the 30-, 50-, and 100-dimensional problems are presented in Tables 1, 2, and 3, respectively. The corresponding results for the SET 2 algorithms MDE, EPDE2, and HSPEADE2 are presented in Tables 4, 5, and 6, respectively. In Tables 1–6, the mean and standard deviation (std.) values are reported for every algorithm on each test problem.

The t-test and Wilcoxon rank sum test results comparing the performance of the algorithms in SET 1 and SET 2 are presented in Tables 7 and 8, respectively. In Tables 7 and 8, the t-test results comparing two algorithms on each problem are presented, and the last row presents the Wilcoxon rank sum test results. For each of the two tests, +1, 0, and −1 in an A versus B comparison indicate that B is better than, equal to, and worse than A, respectively. For example, in the JADE versus EPDE1 comparison in Table 7 (30D), EPDE1 is better, equal, and worse on test problems 4, 7, and 9, respectively.

4.5. Analysis of Experimental Results

SET 1. In Tables 1, 2, and 3, for each test problem, the better performing algorithm among JADE and EPDE1 is highlighted in italics, while the overall best among JADE, EPDE1, and HSPEADE1 is highlighted in bold.

From the experimental results, it can be observed that JADE performs better than EPDE1 on the 30-dimensional versions of the problems. However, as the dimensionality of the test problems increases, the performance of EPDE1 becomes better than that of JADE. The improved performance of EPDE1 can be attributed to the ensemble approach, as different combinations of strategies and parameters can be effective during different stages of the evolution process [20].

From Tables 1–3, it can be observed that HSPEADE1 outperforms JADE and EPDE1 on most of the test problems. On the 30-dimensional problems, HSPEADE1 is better than or equal to JADE in 11 cases, while JADE is better than HSPEADE1 in 3 cases. As the dimensionality of the test problems increases, the advantage of HSPEADE1 over JADE grows.

From the results, it is clear that the performance of HSPEADE1 is always better than or equal to that of EPDE1. This confirms our assumption that an evolving parameter ensemble is better than a fixed one.

SET 2. In Tables 4, 5, and 6, for each test problem, the better performing algorithm among MDE and EPDE2 is highlighted in italics, while the overall best among MDE, EPDE2, and HSPEADE2 is highlighted in bold.

From the experimental results in Tables 4–6, a similar observation can be made. On the 30-, 50-, and 100-dimensional problems, the wins are distributed between MDE and EPDE2. Unlike EPDE1, which dominates JADE as the dimensionality increases, EPDE2 is unable to dominate MDE. This may be due to the explorative ability of the “DE/current-to-gr_best” strategy employed in MDE. However, as the dimensionality of the test problems increases, the performance of HSPEADE2 becomes better than that of MDE.

From the Wilcoxon rank sum test results (bottom rows of Tables 7 and 8), it can be observed that HSPEADE (HSPEADE1 and HSPEADE2) is always better than the algorithms under comparison. For both experimental setups (SET 1 and SET 2), the statistical t-test results presented in Tables 7 and 8 are summarized in Figures 1 to 4 for a clearer view. For example, in Figure 1, the bar plots indicate the number of test problems (30D, 50D, and 100D) on which HSPEADE1 is better than, similar to, and worse than JADE. From Figures 1, 2, 3, and 4, it is clear that HSPEADE1 performs better than JADE and EPDE1, while HSPEADE2 performs better than MDE and EPDE2.

5. Conclusions and Future Work

To improve the performance of DE, different adaptation techniques have been proposed. In this paper, we propose a new parameter adaptation technique for DE based on an ensemble approach and the HS algorithm, referred to as HSPEADE. In HSPEADE, an ensemble of parameters is randomly sampled to form the initial harmony memory. The parameter ensemble is evolved by the HS algorithm during the course of the optimization process, and each parameter combination in the harmony memory is evaluated by testing it on the DE population. During the initial stages of the evolution, the DE parameter combinations in the harmony memory are diverse, facilitating exploration for better parameter combinations; towards the end of the evolution process, fine-tuning of the parameter combinations occurs, facilitating exploitation.

The performance of HSPEADE is evaluated using two recently proposed DE strategies (DE/current-to-pbest/bin and DE/current-to-gr_best/bin), and the numerical results show that the proposed adaptation technique yields significant improvement compared to the state-of-the-art adaptive DE algorithms. From the experimental results, it can also be observed that the proposed adaptation technique handles scalability issues better than the other adaptive techniques.

In the present work, we only consider the evolution of the parameter ensemble using the HS framework. As future work, we would like to incorporate an ensemble of mutation and crossover strategies into the HS framework.

Acknowledgment

This research was supported by Kyungpook National University Research Fund, 2012.