Abstract

The harmony search (HS) method is an emerging metaheuristic optimization algorithm. In this paper, an improved harmony search method based on a differential mutation operator (IHSDE) is proposed to deal with optimization problems. Since population diversity plays an important role in the behavior of evolutionary algorithms, the aim of this paper is to calculate the expected population mean and variance of IHSDE from a theoretical viewpoint. Numerical results, compared with those of HSDE and NGHS, show that the IHSDE method has good convergence properties over a test suite of well-known benchmark functions.

1. Introduction

Most optimization algorithms are based on numerical linear and nonlinear programming methods that require substantial gradient information and usually seek to improve the solution in the neighborhood of an initial point. These algorithms, however, offer only a limited approach to complicated real-world optimization problems, because gradient information is often difficult to obtain or does not exist. What is more, if there is more than one local optimum in the problem, the result may depend on the selection of the starting point, and the obtained optimal solution may not necessarily be the global optimum.

Recently, a new class of metaheuristics, named harmony search (HS), has been developed. The HS algorithm proposed in [1] was developed in an analogy with the music improvisation process, where musicians in an ensemble continue to polish their pitches in order to obtain better harmony. Jazz improvisation seeks to find a musically pleasing harmony, similar to the optimum design process, which seeks to find the optimum solution. The pitch of each musical instrument determines the aesthetic quality, just as the objective function value is determined by the set of values assigned to each decision variable [2]. In addition, HS uses a stochastic random search instead of a gradient search, so that derivative information is unnecessary.

HS may be viewed as a simple real-coded genetic algorithm (GA), since it incorporates many important features of GA like mutation, recombination, and selection. HS has been successfully applied to a wide variety of practical optimization problems, such as controller design [3], the economic dispatch problem [4, 5], the optimal power flow problem [6], neural networks [7], medical diagnosis [8], broadcast scheduling in packet radio networks [9], university course timetabling [10], and other engineering optimization fields [11].

Similar to GA and particle swarm algorithms, the HS method is a random search technique. It does not require any prior domain knowledge, such as gradient information of the objective functions. Unfortunately, empirical study has shown that the original HS method sometimes suffers from a slow search speed and is not suitable for handling multimodal problems [2].

Recently, Omran and Mahdavi tried to improve the performance of HS by incorporating some techniques from swarm intelligence. The new variant, which they called global best harmony search (GHS) [12], reportedly outperformed three other HS variants over the benchmark problems. Chakraborty et al. proposed an improved harmony search algorithm with a differential mutation operator [13]. Gao et al. proposed modified harmony search methods for unimodal and multimodal optimization [14]. Wang et al. proposed a self-adaptive harmony search algorithm for optimization [15]. Pan et al. proposed a self-adaptive global best harmony search algorithm for continuous optimization problems [16]. Zou et al. proposed a novel global harmony search algorithm (NGHS) [17]. More recent HS variants can be found in [18, 19].

To overcome the shortcomings of premature convergence and stagnation, in this paper we replace the pitch adjustment operation in classical HS (CHS) with a mutation strategy borrowed from the realm of the differential evolution (DE) algorithms, and we use

$$x'_i = x'_i + F\left(x_i^{r_1} - x_i^{r_2}\right),$$

where the scale factor $F$ is chosen to be a uniformly distributed random variable between 0.6 and 1.0, that is, $F \sim U(0.6, 1.0)$, instead of $F \sim U(0, 1)$ as in [13], and $r_1, r_2$ are randomly selected with uniform distribution from the set $\{1, 2, \ldots, \mathrm{HMS}\}$, $r_1 \neq r_2$. The new mutation strategy is inspired by Chakraborty's [13] and S. Das's [19] work, especially Chakraborty's work [13], which theoretically showed that harmony search based on the differential mutation scheme (HSDE) has greater explorative power than the classical HS with the pitch adjustment operation.
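The change is small in code. A minimal NumPy sketch of the operator (the names mutate and hm_column are illustrative, not from the cited papers; HSDE corresponds to low=0.0, IHSDE to low=0.6):

    import numpy as np

    rng = np.random.default_rng()

    def mutate(x_i, hm_column, low=0.6, high=1.0):
        # Differential mutation of one decision variable.
        # hm_column holds this variable's values across the harmony memory.
        r1, r2 = rng.choice(len(hm_column), size=2, replace=False)  # r1 != r2
        F = rng.uniform(low, high)                                  # scale factor
        return x_i + F * (hm_column[r1] - hm_column[r2])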

The new algorithm proposed in this paper, called IHSDE (improved harmony search based on differential mutation operator), has been extensively compared with HSDE and the classical HS. Mathematical analysis will show that IHSDE, under certain conditions, possesses a larger expected population variance (with generation) than HSDE. The numerical experiments show that the proposed algorithm is effective in dealing with a test suite of well-known benchmark functions.

The rest of the paper is organized in the following way. Section 2 briefly outlines the classical HS. Section 3 presents the IHSDE method based on the differential mutation operator and analyzes the expected population mean and variance of IHSDE theoretically. The effectiveness of the IHSDE method is demonstrated in Section 4 by solving well-known benchmarks. Section 5 concludes the paper.

2. Classical Harmony Search Algorithm

2.1. Harmony Search Algorithm Principles

Current metaheuristic algorithms imitate natural phenomena; for instance, evolutionary algorithms imitate evolution. The HS algorithm was conceptualized using the musical process of searching for a perfect state of harmony. In music improvisation, each player sounds any pitch within the possible range, together making one harmony vector. If all the pitches make a good harmony, that experience is stored in each player's memory, and the possibility of making a good harmony is increased next time. Similarly, in engineering optimization, each decision variable initially chooses any value within the possible range, together making one solution vector. If all the values of the decision variables make a good solution, that experience is stored in each variable's memory, and the possibility of making a good solution is also increased next time. Figure 1 shows the details of the analogy between music improvisation and engineering optimization.

The HS algorithm does not require initial values for the decision variables. Furthermore, instead of a gradient search, the HS algorithm uses a stochastic random search that is based on the harmony memory considering rate and the pitch-adjusting rate so that derivative information is unnecessary. Compared to earlier metaheuristic optimization algorithms, the HS algorithm imposes fewer mathematical requirements and can be easily adopted for various types of engineering optimization problems.

2.2. Steps for Classical Harmony Search Algorithm

The steps in the procedure of the standard harmony search algorithm are as follows:

Step 1 (Initialize the problem and algorithm parameters). The optimization problem is specified as follows:

$$\min_{x}\; f(x) \quad \text{subject to } x_i \in X_i, \; i = 1, 2, \ldots, N, \quad (2.1)$$

where $f(x)$ is an objective function; $x$ is the set of decision variables $x_i$; $N$ is the number of decision variables; $X_i = [x_i^L, x_i^U]$ is the set of the possible range of values for each decision variable $x_i$. The HS algorithm parameters are also specified in this step. These are the harmony memory size (HMS), or the number of solution vectors in the harmony memory; the harmony memory (HM); the harmony memory considering rate (HMCR); the pitch-adjusting rate (PAR); and the number of improvisations (NI).

The harmony memory (HM) is a memory location where all the solution vectors (sets of decision variables) are stored. HMCR and PAR are parameters that are used to improve the solution vector.

Step 2 (Initialize the harmony memory). In Step 2, the HM matrix is filled with as many randomly generated solution vectors as the HMS:

$$\mathrm{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_N^1 \\ x_1^2 & x_2^2 & \cdots & x_N^2 \\ \vdots & \vdots & & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_N^{\mathrm{HMS}} \end{bmatrix}. \quad (2.2)$$
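A minimal sketch of this step in NumPy (initialize_hm, lower, and upper are illustrative names; lower and upper are the bound vectors of the ranges $X_i$):

    import numpy as np

    def initialize_hm(lower, upper, hms, rng):
        # Fill the harmony memory with HMS random vectors, one row per harmony.
        lower, upper = np.asarray(lower), np.asarray(upper)
        return lower + rng.random((hms, lower.size)) * (upper - lower)

    rng = np.random.default_rng(0)
    hm = initialize_hm([-5.12] * 3, [5.12] * 3, hms=5, rng=rng)  # HMS x N matrix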

Step 3 (Improvise a new harmony). A new harmony vector, $x' = (x'_1, x'_2, \ldots, x'_N)$, is generated based on three rules: (a) memory consideration, (b) pitch adjustment, and (c) random selection. Generating a new harmony is called “improvisation.” In the memory consideration, the value of the first decision variable $x'_1$ for the new vector is chosen from any of the values $\{x_1^1, \ldots, x_1^{\mathrm{HMS}}\}$ in the specified HM with a probability HMCR. The HMCR is the rate of choosing one value from the historical values stored in the HM, while $(1 - \mathrm{HMCR})$ is the rate of randomly selecting one value from the possible range of values:

$$x'_i = \begin{cases} x'_i \in \{x_i^1, x_i^2, \ldots, x_i^{\mathrm{HMS}}\}, & \text{if rand} < \mathrm{HMCR}, \\ x'_i \in X_i, & \text{otherwise}, \end{cases} \quad (2.3)$$

where rand is a uniformly distributed random variable between 0 and 1.

Every component obtained by the memory consideration is examined to determine whether it should be pitch adjusted. This operation uses the PAR parameter, which is the rate of pitch adjustment, as follows.

Pitch adjusting decision for $x'_i$:

$$x'_i = \begin{cases} x'_i \pm \text{rand} \times bw, & \text{if rand} < \mathrm{PAR}, \\ x'_i, & \text{otherwise}, \end{cases} \quad (2.4)$$

where $bw$ is an arbitrary distance bandwidth.

In Step 3, HM consideration, pitch adjustment, or random selection is applied to each variable of the new harmony vector in turn.
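Putting the three rules together, one improvisation of Step 3 might look as follows (a sketch; pitch adjustment is read here as adding a uniform perturbation in $[-bw, bw]$, and the clipping to the bounds is an implementation detail, not part of the formal description):

    import numpy as np

    def improvise(hm, lower, upper, hmcr, par, bw, rng):
        # Build one new harmony: memory consideration, pitch adjustment,
        # and random selection, applied to each variable in turn.
        hms, n = hm.shape
        new = np.empty(n)
        for i in range(n):
            if rng.random() < hmcr:                    # memory consideration
                new[i] = hm[rng.integers(hms), i]
                if rng.random() < par:                 # pitch adjustment
                    new[i] += (2 * rng.random() - 1) * bw
                    new[i] = np.clip(new[i], lower[i], upper[i])
            else:                                      # random selection
                new[i] = lower[i] + rng.random() * (upper[i] - lower[i])
        return new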

Step 4 (Update harmony memory). If the new harmony vector $x' = (x'_1, x'_2, \ldots, x'_N)$ is better than the worst harmony in the HM, the new harmony is included in the HM and the existing worst harmony is excluded from the HM.

Step 5 (Check stopping criterion). If the stopping criterion (the maximum number of improvisations, NI) is satisfied, computation is terminated. Otherwise, Steps 3 and 4 are repeated.
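Steps 4 and 5 then reduce to a greedy replacement inside the main loop. A sketch that reuses the improvise function above (minimization is assumed):

    def harmony_search(f, hm, lower, upper, hmcr, par, bw, ni, rng):
        # Improvise NI times; the new harmony replaces the worst HM member
        # whenever it improves on it (Step 4), then the loop repeats (Step 5).
        fitness = [f(h) for h in hm]
        for _ in range(ni):
            new = improvise(hm, lower, upper, hmcr, par, bw, rng)
            f_new = f(new)
            worst = max(range(len(fitness)), key=fitness.__getitem__)
            if f_new < fitness[worst]:
                hm[worst], fitness[worst] = new, f_new
        best = min(range(len(fitness)), key=fitness.__getitem__)
        return hm[best], fitness[best]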

In the next section, we employ the differential mutation operator to improve the fitness of all the members in the HS memory so that the overall convergence speed of the original HS method can be accelerated.

3. The Improved HS Based on Differential Mutation Operator

Experiments with the CHS algorithm over the standard numerical benchmarks show that the algorithm suffers from premature and/or false convergence and from slow convergence, especially over multimodal fitness landscapes.

To circumvent these problems of premature convergence, Chakraborty et al. proposed the harmony search algorithm with differential mutation operator (HSDE). They replaced the pitch adjustment operation (2.4) in classical HS with a mutation strategy borrowed from the realm of the DE algorithms [20]. The mutation strategy is

$$x'_i = x'_i + F\left(x_i^{r_1} - x_i^{r_2}\right), \quad (3.1)$$

where $r_1, r_2 \in \{1, 2, \ldots, \mathrm{HMS}\}$, $r_1 \neq r_2$, and the scale factor $F$ is chosen to be a uniformly distributed random variable between 0 and 1, that is, $F \sim U(0, 1)$.

In what follows, we instead choose $F$ to be a uniformly distributed random variable between 0.6 and 1.0, that is, $F \sim U(0.6, 1.0)$. This improved harmony search algorithm with differential mutation operator will be referred to as IHSDE. In the following, we will show that IHSDE gives better performance than the HSDE algorithm in both theory and experiment.

3.1. IHSDE Algorithm

The pseudocode of IHSDE is described in Algorithm 1. Every variable in the HM goes through the above DE mutation refinement procedure. Thus, the resulting HM members are expected to have better fitness than the original ones. This strategy can also overcome the premature-convergence shortcoming of the regular HS method.

Procedure IHSDE algorithm
  Initiate_parameters
  Initialize_HM
  While (not_termination)
    For i = 1 to N do      // N denotes the number of decision variables
      if (rand < HMCR)     // memory consideration
        Select one harmony from HM randomly: x'_i = x_i^j, j in {1, 2, ..., HMS};
        Execute difference variation operation for the selected harmony:
          x'_i = x'_i + F (x_i^{r_1} - x_i^{r_2}), where r_1, r_2 in {1, ..., HMS}, r_1 != r_2, F ~ U(0.6, 1).
      else
        x'_i = x_i^L + rand (x_i^U - x_i^L)      // random selection
      end if
    End for
    Update harmony memory HM      // if applicable
  End while
End procedure
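In Python, the improvisation step of Algorithm 1 might read as follows (a sketch: pitch adjustment is gone, and every memory-considered variable is perturbed by the differential mutation with $F \sim U(0.6, 1)$):

    import numpy as np

    def improvise_ihsde(hm, lower, upper, hmcr, rng):
        # One IHSDE improvisation: memory consideration followed by
        # differential mutation, or plain random selection.
        hms, n = hm.shape
        new = np.empty(n)
        for i in range(n):
            if rng.random() < hmcr:                    # memory consideration
                new[i] = hm[rng.integers(hms), i]
                r1, r2 = rng.choice(hms, size=2, replace=False)  # r1 != r2
                F = rng.uniform(0.6, 1.0)              # IHSDE scale factor
                new[i] += F * (hm[r1, i] - hm[r2, i])  # difference variation
                new[i] = np.clip(new[i], lower[i], upper[i])
            else:                                      # random selection
                new[i] = lower[i] + rng.random() * (upper[i] - lower[i])
        return new

Substituting improvise_ihsde for improvise in the harmony_search loop sketched in Section 2 yields a complete IHSDE optimizer.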

3.2. Theoretical Analyses of the Expected Population Variance

Theoretical analyses of the properties of HS algorithms are very important for understanding their search behavior and for developing more efficient algorithms [13, 19]. Compared to the plethora of works concerning the empirical study of parameter selection and tuning in HS [14–18], not much research has so far been devoted to theoretically analyzing the search mechanism and convergence properties of HS, and this area remains largely open to prospective future research.

The evolution of the expected population variance over generations provides a measure of the explorative power of the algorithm. In the following, we will estimate the expected mean and variance of the population obtained by applying the mutation operator.

Our idea is as follows: first, we find an analytical expression for the expected population variance, and then we compare the expected population variance of IHSDE with that of HSDE to show that the IHSDE algorithm possesses greater explorative power.

In HS-type algorithms, since each dimension is perturbed independently, we can, without loss of generality, carry out the analysis for single-dimensional population members.

Let us consider a population of scalars $x = (x_1, x_2, \ldots, x_m)$ with elements $x_k$, $k = 1, 2, \ldots, m$. The variance of this population is given by

$$\mathrm{Var}(x) = \frac{1}{m}\sum_{k=1}^{m}\left(x_k - \bar{x}\right)^2 = \overline{x^2} - \bar{x}^2, \quad (3.2)$$

where $\bar{x} = \frac{1}{m}\sum_{k=1}^{m} x_k$ is the population mean and $\overline{x^2} = \frac{1}{m}\sum_{k=1}^{m} x_k^2$ is the quadratic population mean.

If the elements of the population are perturbed with some random numbers or variables, $\mathrm{Var}(x)$ will be a random variable, and $E(\mathrm{Var}(x))$ will be a measure of the explorative power. In the following discussion, we always suppose $\mathrm{HMS} = m$.
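Both quantities are straightforward to compute; the following helper (an illustrative sketch) implements (3.2) directly:

    import numpy as np

    def population_variance(x):
        # Var(x) = quadratic population mean minus squared population mean.
        x = np.asarray(x)
        return np.mean(x**2) - np.mean(x)**2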

Lemma 3.1. Let $x = (x_1, x_2, \ldots, x_m)$ be the current population of HS, $y$ be an intermediate vector obtained after random selection with harmony memory consideration, and $z$ the vector obtained from $y$ by replacing the pitch adjustment operation with the mutation operator (3.1) borrowed from DE algorithms in classical HS. Let $w = (w_1, w_2, \ldots, w_m)$ be the final population after selection. If we let the allowable range for the new values of $x$ be $[x_{\min}, x_{\max}] = [-a, a]$, where $a > 0$, and the required random numbers are continuously uniformly distributed random variables between 0 and 1 except for the random variable $F$, then

$$E(\bar{w}) = \mathrm{HMCR}\cdot\bar{x}, \quad (3.3)$$

$$E(\mathrm{Var}(w)) = \frac{m-1}{m}\left[\mathrm{HMCR}\cdot\overline{x^2} + (1-\mathrm{HMCR})\,\frac{a^2}{3} + \frac{2m}{m-1}\,E(F^2)\,\mathrm{Var}(x) - \mathrm{HMCR}^2\,\bar{x}^2\right], \quad (3.4)$$

where $m = \mathrm{HMS}$.

Proof. Since $x = \mathrm{HM}$ is the current population of HS and $y$ is an intermediate vector obtained after random selection with harmony memory consideration, we have

$$y_k = \begin{cases} x_{r_k}, & \text{with probability } \mathrm{HMCR}, \\ x_{\mathrm{new}}, & \text{with probability } 1 - \mathrm{HMCR}, \end{cases} \quad (3.5)$$

where $r_k$ is an index chosen from $\{1, 2, \ldots, m\}$ with uniform distribution and $x_{\mathrm{new}}$ is a new random variable in the allowable range $[-a, a]$.
So, $E(x_{\mathrm{new}}) = 0$ and $E(x_{\mathrm{new}}^2) = a^2/3$.
According to [13, 19], we have

$$E(x_{r_k}) = \bar{x}, \qquad E(x_{r_k}^2) = \overline{x^2}. \quad (3.6)$$

Using (3.6), we can obtain the following relations:

$$E(y_k) = \mathrm{HMCR}\cdot\bar{x}, \qquad E(y_k^2) = \mathrm{HMCR}\cdot\overline{x^2} + (1-\mathrm{HMCR})\,\frac{a^2}{3}. \quad (3.7)$$

Now, $z$ is the vector obtained after mutating $y$. It has the following structure:

$$z_k = y_k + F\left(x_{r_1} - x_{r_2}\right). \quad (3.8)$$

Let $F \sim U(0.6, 1)$, and let $x_{r_1}$ and $x_{r_2}$ be two randomly chosen members from $x$ such that $r_1 \neq r_2$.
Thus, since $E(x_{r_1} - x_{r_2}) = 0$,

$$E(z_k) = E(y_k) + E(F)\,E(x_{r_1} - x_{r_2}) = E(y_k). \quad (3.9)$$

According to [13], we have

$$E(x_{r_1} x_{r_2}) = \frac{m\bar{x}^2 - \overline{x^2}}{m-1}, \qquad E\left[(x_{r_1} - x_{r_2})^2\right] = \frac{2m}{m-1}\,\mathrm{Var}(x). \quad (3.10)$$

Therefore, since $r_1$ and $r_2$ are exchangeable, the cross term $E\left(y_k\left(x_{r_1} - x_{r_2}\right)\right)$ vanishes, and

$$E(z_k^2) = E(y_k^2) + \frac{2m}{m-1}\,E(F^2)\,\mathrm{Var}(x). \quad (3.11)$$

Now, $w$ is the final population. Since every harmony of the HM goes through the above procedure, each element of the final population may be represented by the following:

$$w_k = z_k, \quad k = 1, 2, \ldots, m. \quad (3.12)$$

Thus,

$$E(w_k) = \mathrm{HMCR}\cdot\bar{x}, \qquad E(w_k^2) = \mathrm{HMCR}\cdot\overline{x^2} + (1-\mathrm{HMCR})\,\frac{a^2}{3} + \frac{2m}{m-1}\,E(F^2)\,\mathrm{Var}(x). \quad (3.13)$$

Let $\bar{w}$ and $\overline{w^2}$ represent the population mean and quadratic population mean of the final target population, respectively. Then,

$$E(\bar{w}) = \mathrm{HMCR}\cdot\bar{x}, \qquad E(\bar{w}^2) = \frac{1}{m}\,E(w_k^2) + \frac{m-1}{m}\left(E(w_k)\right)^2. \quad (3.14)$$

Therefore, the variance of the final population is given as $\mathrm{Var}(w) = \overline{w^2} - \bar{w}^2$, and its expected population variance is

$$E(\mathrm{Var}(w)) = E(\overline{w^2}) - E(\bar{w}^2) = \frac{m-1}{m}\left[\mathrm{HMCR}\cdot\overline{x^2} + (1-\mathrm{HMCR})\,\frac{a^2}{3} + \frac{2m}{m-1}\,E(F^2)\,\mathrm{Var}(x) - \mathrm{HMCR}^2\,\bar{x}^2\right]. \quad (3.15)$$
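The expression (3.15) can be checked by simulation: fix a population $x$, apply the perturbation model of the proof to every member many times, and average the resulting population variances. A sketch under the assumptions of the lemma (every member refined, allowable range $[-a, a]$; np.var computes exactly (3.2)):

    import numpy as np

    rng = np.random.default_rng(1)
    m, hmcr, a = 10, 0.9, 5.0
    x = rng.uniform(-a, a, m)                  # current 1-D population

    def one_step():
        # Memory consideration (3.5) plus differential mutation (3.8),
        # applied to every member as in (3.12).
        w = np.empty(m)
        for k in range(m):
            y = x[rng.integers(m)] if rng.random() < hmcr else rng.uniform(-a, a)
            r1, r2 = rng.choice(m, size=2, replace=False)
            w[k] = y + rng.uniform(0.6, 1.0) * (x[r1] - x[r2])
        return np.var(w)

    est = np.mean([one_step() for _ in range(100000)])
    EF2 = (0.6**2 + 0.6 * 1.0 + 1.0**2) / 3    # E(F^2) = 49/75 for U(0.6, 1)
    var_x, xbar, x2bar = np.var(x), x.mean(), np.mean(x**2)
    pred = (m - 1) / m * (hmcr * x2bar + (1 - hmcr) * a**2 / 3
                          + 2 * m / (m - 1) * EF2 * var_x - hmcr**2 * xbar**2)
    print(est, pred)   # the two values agree up to Monte Carlo error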

Remark 3.2. The conclusion in [13] (Theorem 1) is incorrect. In this paper, we give the correct expression for the expected population variance.

The main analytical result is expressed in the form of the following theorem.

Theorem 3.3. Let $x = (x_1, x_2, \ldots, x_m)$ be the current population of HS. Given the same values of $m$ and HMCR, where $m$ and HMCR represent the harmony memory size (HMS) and the harmony memory considering rate, respectively, the intergeneration population expected variance of the IHSDE algorithm is greater than that of HSDE.

Proof. In HSDE, $F$ is chosen to be a uniformly distributed random variable between 0 and 1, and in IHSDE, $F$ is chosen to be a uniformly distributed random variable between 0.6 and 1. For convenience, let $F_1 \sim U(0, 1)$, with intergeneration expected variance $E(\mathrm{Var}(w_{\mathrm{HSDE}}))$ given by (3.4), and let $F_2 \sim U(0.6, 1)$, with intergeneration expected variance $E(\mathrm{Var}(w_{\mathrm{IHSDE}}))$.
Since

$$E(F_1^2) = \int_0^1 f^2 \, df = \frac{1}{3}, \qquad E(F_2^2) = \frac{1}{0.4}\int_{0.6}^{1} f^2 \, df = \frac{49}{75} > \frac{1}{3}. \quad (3.16)$$

Thus, by (3.4),

$$E(\mathrm{Var}(w_{\mathrm{IHSDE}})) - E(\mathrm{Var}(w_{\mathrm{HSDE}})) = 2\left(E(F_2^2) - E(F_1^2)\right)\mathrm{Var}(x) = \frac{48}{75}\,\mathrm{Var}(x) > 0 \quad (3.17)$$

whenever $\mathrm{Var}(x) > 0$, that is, the intergeneration population expected variance of IHSDE is greater than that of HSDE.
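The two second moments, and hence the guaranteed variance growth per generation, are one-liners to evaluate (for $F \sim U(a, b)$, $E(F^2) = (a^2 + ab + b^2)/3$):

    EF2_hsde = (0.0**2 + 0.0 * 1.0 + 1.0**2) / 3    # 1/3 for U(0, 1)
    EF2_ihsde = (0.6**2 + 0.6 * 1.0 + 1.0**2) / 3   # 49/75 for U(0.6, 1)
    growth = 2 * (EF2_ihsde - EF2_hsde)             # 48/75 = 0.64 times Var(x)
    print(EF2_hsde, EF2_ihsde, growth)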

Remark 3.4. Improper parameters can lead to the problem of premature and/or false convergence. Usually, $F \in [0, 2]$. If $F$ is chosen too small, it gets more difficult to escape local optima. A larger $F$ (or $E(F^2)$) increases the probability of escaping a local optimum (correspondingly, the expected population variance of IHSDE). However, for $F > 1$, the convergence speed decreases: it is more difficult for a population to converge when the perturbation is larger than the distance between two members. Thus, though the standard Gaussian distribution has a much larger $E(F^2)$ (equal to 1) than that of $U(0.6, 1)$, it cannot get better convergence. In [21], Gämperle et al. suggest that a good initial choice for the amplification factor should be $F = 0.6$; if one suspects that with this setting only a local optimum is found, then $F$ should be increased. This fully shows that our choice $F \sim U(0.6, 1)$ is appropriate.

The above mathematical analysis shows that IHSDE possesses an increasing population variance as compared to HSDE. This ensures that the explorative power of IHSDE is on average greater than that of HSDE, which in turn results in better accuracy of the IHSDE algorithm.

In the following section, we give some numerical experiments over standard test functions.

4. Computational Results

The effectiveness of the IHSDE algorithm has been evaluated on a test suite of well-known benchmark functions (Table 1) [14–16, 22, 23]. In Table 1, $n$ represents the number of dimensions; the table summarizes, for each function, the dimension used, the initialization and search range, and the location of the global minimum.
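Table 1 itself is not reproduced here; purely as an illustration (the exact membership of the suite follows [14–16, 22, 23]), two benchmarks that are standard in such suites can be coded as:

    import numpy as np

    def sphere(x):
        # Unimodal: f(x) = sum(x_i^2), global minimum 0 at the origin.
        return float(np.sum(np.square(x)))

    def rastrigin(x):
        # Multimodal: many local minima, global minimum 0 at the origin.
        x = np.asarray(x)
        return float(10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))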

All the experiments were performed on a Windows XP 64-bit system running on an HP desktop with an Intel(R) Xeon(R) processor and 6 GB RAM, and the codes were written in MATLAB R2008a. Simulations were carried out to compare the optimization (minimization) capabilities of the proposed method (IHSDE) with respect to (a) HSDE [13], (b) classical HS (CHS) [1, 2], and (c) NGHS [17]. To make the comparison fair, the populations for all the competitor algorithms (for all problems tested) were initialized using the same random seeds. All HS variants were run with the same parameter settings: the harmony memory size (HMS), the harmony memory considering rate (HMCR), and the number of improvisations (NI) were identical across algorithms. In HSDE, $F$ is chosen to be a uniformly distributed random variable between 0 and 1, and in IHSDE, $F$ is chosen to be a uniformly distributed random variable between 0.6 and 1. In classical HS, the pitch-adjusting rate (PAR) was fixed for all runs.

To judge the accuracy of the different algorithms, 50 independent runs of each of the four algorithms were carried out, and the best, mean, and worst fitness values and the standard deviation (Std) were recorded. Table 2 compares the algorithms on the quality of the optimum solution.
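The statistics in Table 2 correspond to the following collection scheme (a sketch; run_algorithm is a hypothetical wrapper returning the best fitness of one run of any of the four optimizers):

    import numpy as np

    def summarize(run_algorithm, f, runs=50, seed0=0):
        # Best / mean / worst / Std of the final best fitness over independent
        # runs; seeds are shared across algorithms for a fair comparison.
        finals = np.array([run_algorithm(f, seed=seed0 + r) for r in range(runs)])
        return finals.min(), finals.mean(), finals.max(), finals.std()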

Figure 2 shows the convergence curves and boxplots of the best fitness in the population for the different algorithms (CHS, HSDE, IHSDE, NGHS) on all benchmark functions. The values plotted for every generation are averaged over 50 independent runs. Each boxplot summarizes the best fitness in the final population over the 50 independent runs. We can see that the behavior of the two former algorithms (CHS and HSDE) is similar on all benchmark functions. From the graphical point of view, the classical HS algorithm is clearly the worst algorithm on most benchmark problems, while IHSDE outperforms HSDE in most cases, with only two exceptions. Compared with the two former algorithms (CHS, HSDE), for most test functions IHSDE achieves much better optimization results within the same number of iterations, and IHSDE reaches or exceeds the advanced level of NGHS in most cases.

In order to accurately depict the search process of each algorithm, we set the same parameters and ran the CHS, HSDE, and IHSDE methods, respectively, on the multimodal function F16. Figure 3 shows the iteration histories of the three methods.

Remark 4.1. We omitted plots for all the other functions to save space and also in consideration of the fact that they display more or less the same trend.
Experimental results on the benchmark functions show that the IHSDE method can outperform the other methods. From Figure 3, we confirm that the IHSDE method has a better ability of global search and does not easily fall into local optima.

5. Conclusion

This paper has presented an improved harmony search algorithm obtained by blending into HS a difference-vector-based mutation operator borrowed from the DE algorithms. The HM members are fine-tuned by DE's mutation operator to improve their fitness so that enhanced optimization performance can be achieved. Mathematical analysis indicates that IHSDE possesses an increasing population variance as compared to HSDE. This ensures that the explorative power of IHSDE is on average greater than that of HSDE, which in turn results in better accuracy of the IHSDE algorithm. Several simulation examples with unimodal and multimodal functions have been used to verify the effectiveness of the proposed method. Compared with HSDE and CHS, better optimization results are obtained using the IHSDE approach in most cases. Checking the effect of varying the scale factor of the differential mutation operator may be a worthy issue for preventing premature convergence in future investigations. Future work will also focus on applications of IHSDE to engineering optimization problems.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant no. 60974082 and by the Scientific Research Program funded by the Shaanxi Provincial Education Department under Grant no. 12JK0863.