Abstract

The harmony search algorithm is a music-inspired optimization technique that has been successfully applied to diverse scientific and engineering problems. However, like other metaheuristic algorithms, it still faces two difficulties: parameter setting and finding the optimal balance between diversity and intensity during the search. This paper proposes a novel, self-adaptive search mechanism for optimization problems with continuous variables. The new variant automatically configures the evolutionary parameters in accordance with problem characteristics, such as the scale and the boundaries, and dynamically selects evolutionary strategies in accordance with its search performance. It simplifies parameter setting and efficiently solves all types of optimization problems with continuous variables. Statistical test results show that this variant is considerably robust and outperforms the original harmony search (HS), the improved harmony search (IHS), and other self-adaptive variants on large-scale optimization problems and constrained problems.

1. Introduction

Harmony search (HS) is a relatively new metaheuristic optimization algorithm [1] that imitates the improvisation process of jazz musicians, as shown in Figure 1. Each musician is analogous to a decision variable; the audience's aesthetics are analogous to the objective function; the pitch range of a musical instrument corresponds to the value range of a decision variable; the musicians' improvisations correspond to local and global search schemes; and the musical harmony at a certain time corresponds to the solution vector at a certain iteration. In general, a musician's improvisation follows three rules [2]: playing any one pitch from his/her memory, playing a pitch adjacent to one in his/her memory, and playing a random pitch from the possible sound range. The HS algorithm mimics these rules and converts them into corresponding evolutionary strategies: choosing any one value from the HS memory (defined as memory consideration), choosing a value adjacent to one in the HS memory (defined as pitch adjustment), and choosing a random value from the possible value range (defined as randomisation). The evolutionary rules of the HS algorithm are directed by the following parameters: the harmony memory size (HMS, i.e., the number of solutions in the harmony memory), the harmony memory considering rate (HMCR, i.e., the probability of choosing a value from the harmony memory), the pitch adjusting rate (PAR, i.e., the probability of adjusting a chosen value), the fine-tuning bandwidth (Bw, i.e., the adjustment range of a variable), and the number of improvisations (NI, i.e., the maximum number of iterations).
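
To make the three improvisation rules concrete, the following minimal Python sketch generates one decision variable of a new harmony from the memory consideration, pitch adjustment, and randomisation rules described above. The function name, the symmetric adjustment, and the clipping to the bounds are illustrative assumptions, not part of the original algorithm description.

import random

def improvise_value(hm_column, lower, upper, hmcr, par, bw):
    # hm_column: the values stored in the harmony memory for this variable.
    if random.random() < hmcr:
        value = random.choice(hm_column)             # memory consideration
        if random.random() < par:
            value += random.uniform(-1.0, 1.0) * bw  # pitch adjustment
    else:
        value = random.uniform(lower, upper)         # randomisation
    return min(max(value, lower), upper)             # keep the value feasible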

The values of these parameters, particularly the three key parameters HMCR, PAR, and Bw, affect the performance of HS. However, parameter setting is problem dependent and difficult, as it is for other evolutionary computation methods, such as genetic algorithms (GA), simulated annealing (SA), particle swarm optimization (PSO), and ant colony optimization (ACO). To alleviate the burden of manually finding the best parameter values, this work proposes a natural and novel approach to parameter setting based on the improvisation practice of musicians.

For any algorithm that involves random search, the most difficult task is to find an appropriate balance between diversity and intensity during the search. When diversity is emphasised, convergence slows and the search may even diverge; when intensity is emphasised, the search converges prematurely and becomes trapped in a local optimum. The HS algorithm is no exception. Thus, this paper also proposes a novel self-adaptive search mechanism.

The rest of this paper is organized as follows. Section 2 presents a concise review of related previous work. Section 3 describes the novel self-adaptive harmony search (NSHS) algorithm. Section 4 introduces a test suite composed of eighteen benchmark functions. Section 5 analyzes the robustness of the proposed variant. Section 6 confirms the superiority of the novel variant through a comparison test. The last section concludes the paper.

2. Related Previous Work

2.1. Original Harmony Search Algorithm

The original HS algorithm, developed by Geem et al. [1], is carried out as follows.

Step 1 (initialise the problem and algorithm parameters). First, the optimization problem is specified as follows:
$$\min f(\mathbf{x}), \quad \mathbf{x} = (x_1, x_2, \ldots, x_n), \quad L_i \le x_i \le U_i, \quad i = 1, 2, \ldots, n, \tag{1}$$
where $f(\mathbf{x})$ is the objective function, $\mathbf{x}$ is a vector of $n$ decision variables, and $L_i$ and $U_i$ are the lower and upper bounds of variable $x_i$, respectively. The original HS algorithm basically considers the objective function only. If the problem has equality and/or inequality constraints and a solution vector violates any of them, either the resulting infeasible solution is abandoned or a penalty score is added to its objective function value.
Second, the parameters HMS, HMCR, PAR, Bw, and NI must be assigned values, and the initial harmony memory (HM) must be generated:
$$\mathrm{HM} = \begin{bmatrix} x_1^1 & x_2^1 & \cdots & x_n^1 \\ x_1^2 & x_2^2 & \cdots & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_n^{\mathrm{HMS}} \end{bmatrix}, \tag{2}$$
where each row is a randomly generated solution vector.

Step 2 (improvise a new harmony vector). A new solution $\mathbf{x}' = (x_1', x_2', \ldots, x_n')$, termed the harmony vector, is generated based on memory consideration, pitch adjustment, and randomisation:
$$x_j' \leftarrow \begin{cases} x_j' \in \{x_j^1, x_j^2, \ldots, x_j^{\mathrm{HMS}}\} & \text{with probability HMCR},\\ x_j' \in [L_j, U_j] & \text{with probability } 1 - \mathrm{HMCR}. \end{cases} \tag{3}$$
A value taken from the memory is further adjusted with probability PAR as $x_j' \leftarrow x_j' \pm r \times \mathrm{Bw}$, where $r$ is a uniform random number in $[0, 1]$.

Step 3 (assess the new harmony vector and update the harmony memory). In this step, the merit of the new harmony vector is assessed based on its objective function value. If the new vector is better than the worst harmony vector in the current HM, the new vector is included in the HM and the existing worst vector is excluded from the HM.

Step 4 (check the stopping criterion). Repeat Steps 2~3 until the termination criterion is satisfied; that is, the number of iterations reaches NI.
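
As a minimal sketch of the memory update in Step 3 (assuming a minimisation problem and that hm and fitness_values are parallel Python lists; the function name is illustrative), the acceptance rule can be written as:

def update_memory(hm, fitness_values, new_vector, new_fitness):
    # Replace the worst harmony only if the new vector improves on it.
    worst = max(range(len(fitness_values)), key=fitness_values.__getitem__)
    if new_fitness < fitness_values[worst]:
        hm[worst] = new_vector
        fitness_values[worst] = new_fitness
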
The structure of the HS algorithm is quite simple. Its core evolutionary process includes only three operations: choosing, adjusting, and randomising. Like every metaheuristic, however, the algorithm must balance intensification and diversification; in HS these are governed mainly by HMCR and PAR, respectively. Its effectiveness is confirmed by a large number of successful applications to diverse scientific and engineering problems, including timetabling [4], the 0-1 knapsack problem [5], power system design [6], transmission network planning [7], flow-shop scheduling [8], structural design [9], water network design [10], soil stability analysis [11], ecological conservation [12], and vehicle routing [13].
However, the major disadvantage of the HS algorithm is that its performance depends on the parameter setting, and static parameter values are not conducive to balancing intensification and diversification.

2.2. Variants Based on Parameter Setting

To improve the performance of the original HS algorithm or alleviate the burden of manually finding the best parameter values, some variants based on parameter setting have been proposed.

2.2.1. Improved Harmony Search (IHS) Proposed by Mahdavi et al. [14]

In this variant, PAR and Bw are dynamically adjusted by the following formulae:
$$\mathrm{PAR}(t) = \mathrm{PAR}_{\min} + \frac{\mathrm{PAR}_{\max} - \mathrm{PAR}_{\min}}{\mathrm{NI}} \times t, \tag{4}$$
$$\mathrm{Bw}(t) = \mathrm{Bw}_{\max} \exp\!\left(\frac{\ln(\mathrm{Bw}_{\min}/\mathrm{Bw}_{\max})}{\mathrm{NI}}\, t\right), \tag{5}$$
where $\mathrm{PAR}_{\min}$ and $\mathrm{PAR}_{\max}$ indicate the minimum and maximum pitch adjusting rates, respectively; $t$ and NI denote the current iteration and the maximum number of iterations, respectively; and $\mathrm{Bw}_{\min}$ and $\mathrm{Bw}_{\max}$ indicate the minimum and maximum bandwidths.
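
For illustration, the two IHS update rules can be coded directly from (4) and (5); the function names are illustrative only.

import math

def ihs_par(t, ni, par_min, par_max):
    # (4): PAR grows linearly from PAR_min to PAR_max over the run.
    return par_min + (par_max - par_min) * t / ni

def ihs_bw(t, ni, bw_min, bw_max):
    # (5): Bw decays exponentially from Bw_max to Bw_min over the run.
    return bw_max * math.exp(math.log(bw_min / bw_max) * t / ni)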

In fact, this variant does not alleviate the difficulty of parameter setting, because users still have to provide suitable values of $\mathrm{PAR}_{\min}$, $\mathrm{PAR}_{\max}$, $\mathrm{Bw}_{\min}$, and $\mathrm{Bw}_{\max}$. Furthermore, under (4), the pitch adjusting rate gradually increases from $\mathrm{PAR}_{\min}$ to $\mathrm{PAR}_{\max}$ over the iterations, which is rather inconsistent with the observation that a larger iteration number usually yields a more nearly perfect harmony and therefore requires a smaller tuning probability.

Later, Omran and Mahdavi [15] abandoned the Bw parameter, replacing the direct adjustment with the use of a randomly chosen component of the best harmony vector, as shown in (6), and called this new variant the global-best harmony search (GHS):
$$x_j' = x_k^{\mathrm{best}}, \quad k \in \{1, 2, \ldots, n\} \text{ chosen at random}. \tag{6}$$

As Wang and Huang [16] noted, the name of this variant is a misnomer, and the method can easily cause premature convergence. In fact, the most serious defect of the GHS variant is that, if the domains of the variables differ, that is, $[L_j, U_j] \neq [L_k, U_k]$, infeasible solutions are frequently generated. Thus, Pan et al. [17] corrected this serious drawback using the following equation:
$$x_j' = x_j^{\mathrm{best}}. \tag{7}$$
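
The difference between the GHS adjustment (6) and the correction (7) can be sketched as follows; the function names are illustrative.

import random

def ghs_pitch_adjust(best_vector):
    # GHS [15]: copy a randomly chosen dimension of the best harmony, which can
    # leave variable j outside its own domain when the variable domains differ.
    return best_vector[random.randrange(len(best_vector))]

def corrected_pitch_adjust(best_vector, j):
    # Correction by Pan et al. [17], cf. (7): copy the same dimension j of the best harmony.
    return best_vector[j]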

2.2.2. The Self-Adaptive Harmony Search Proposed by Wang and Huang [16]

Wang and Huang recognised the shortcomings of IHS and GHS. They used a similar modification to dynamically adapt the PAR value during the search, except that the value of PAR decreases linearly with time; thus, their variant still requires the values of $\mathrm{PAR}_{\min}$ and $\mathrm{PAR}_{\max}$. The innovation of the variant is to adjust the value of each variable according to the maximal and minimal values of that variable in the harmony memory (HM), as shown in (8), thereby dispensing with the Bw parameter.

Although this adjustment prevents each variable from violating its boundary constraints, the resulting bandwidth is rather large; therefore, convergence is very slow and can even stall.

2.2.3. The Self-Adaptive Harmony Search Variants Proposed by Pan et al.

Pan et al. [18] proposed a local-best harmony search algorithm with dynamic subpopulations (DLHS). In this variant, the whole harmony memory is equally divided into m small-sized sub-HMs, which are periodically regrouped in order to maintain population diversity. HMCR and PAR are drawn from a parameter set list (PSL) of length Len; the list is initially generated at random from uniform distributions over the admissible ranges of HMCR and PAR and is regenerated once it becomes empty. In this variant, the value of the parameter Bw is also changed dynamically:
$$\mathrm{Bw}(t) = \begin{cases} \mathrm{Bw}_{\max} - \dfrac{\mathrm{Bw}_{\max} - \mathrm{Bw}_{\min}}{\mathrm{NI}} \times 2t, & t < \mathrm{NI}/2,\\[4pt] \mathrm{Bw}_{\min}, & t \ge \mathrm{NI}/2. \end{cases} \tag{9}$$

Although this variant takes advantage of each local optimum, it consumes too much time. Thus, Pan et al. [17] proposed a self-adaptive global-best harmony search algorithm (SGHS) based on the GHS algorithm. In this new variant, the value of HMCR (PAR) is normally distributed with mean HMCRm (PARm) and standard deviation 0.01 (0.05). The initial values of HMCRm and PARm are given by the user. Within a fixed learning period (LP), the values of HMCRm and PARm do not change; after each LP, they are recalculated by averaging all of the HMCR and PAR values recorded during the previous LP. In this variant, Bw is still determined by (9).
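
A minimal sketch of the SGHS parameter self-adaptation described above follows; the clipping of the sampled values to (0, 1) and the function names are assumptions of this sketch, not details from the SGHS paper.

import random

def sample_sghs_params(hmcr_mean, par_mean):
    # HMCR ~ N(HMCRm, 0.01) and PAR ~ N(PARm, 0.05), clipped to (0, 1) here.
    hmcr = min(max(random.gauss(hmcr_mean, 0.01), 0.0), 1.0)
    par = min(max(random.gauss(par_mean, 0.05), 0.0), 1.0)
    return hmcr, par

def update_means(recorded_hmcr, recorded_par):
    # After each learning period LP, the means are recomputed by averaging the
    # parameter values recorded during the previous period.
    return (sum(recorded_hmcr) / len(recorded_hmcr),
            sum(recorded_par) / len(recorded_par))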

Obviously, these two variants require extra work to set the values of more parameters than the basic HS algorithm and the other variants. Moreover, it is irrational that the value of PAR still varies over a fixed interval regardless of the number of iterations, because a successful search should gradually converge and the value of PAR should therefore decrease with time to prevent oscillation.

These defects inspired me to investigate how the key parameters of the HS algorithm can be set without problem-specific manual tuning.

2.3. Variants Based on Hybridisation with Other Metaheuristics

To improve the performance of the original HS, many researchers have hybridised the algorithm with other metaheuristics, such as GA, PSO, and ACO [19]. This hybridisation can be categorised into two approaches [20]: integrating components of other metaheuristic algorithms into the HS structure and, conversely, integrating HS components into the structures of other metaheuristic algorithms. Hybrid methods may well outperform single metaheuristics.

However, in my opinion, hybridisation departs from the essential nature of the original heuristic, and the resulting mimicry becomes nondescript. The more promising direction is to refine the essence of the imitated objects. This idea inspired me to improve the search mechanism of the original HS algorithm by capturing the deeper essence of musical improvisation.

3. A Novel Self-Adaptive Harmony Search Algorithm

3.1. Design of the Control Parameters

The HS algorithm includes five parameters: harmony memory size (HMS), harmony memory considering rate (HMCR), pitch adjusting rate (PAR), fine-tuning bandwidth (Bw), and the number of improvisations (NI). Among these parameters, HMCR, PAR, and Bw are crucial for the convergence of HS.

It should be noted that a musician's improvisation is always accompanied by fine-tuning until the improvisation is finished. That is, it is necessary to adjust each variable value in every iteration until the HS algorithm terminates. Therefore, the value of PAR ought to equal one, and it is senseless to retain this parameter. For this reason, the proposed self-adaptive variant does not use the PAR parameter.

The value of HMCR in the basic HS algorithm is fixed at initialisation and does not change until the search ends. As the value of HMCR decreases, the harmony memory is used less, the algorithm converges much more slowly, and more time is consumed. As the value of HMCR increases, the harmony memory is used more, the method converges more rapidly and consumes less time, but the search is easily trapped in a local optimum. To enhance the power of the HS algorithm, HMCR is usually assigned a relatively large fixed value, although the reason for this choice is not explained [21]. In fact, musicians depend on their memory more for intricate music than for simple music. This fact inspired me to set the HMCR parameter according to the dimension of the problem to be solved; that is, the dimension of the problem is analogous to the complexity of the music,

where $n$ denotes the dimension of the problem to be solved. Clearly, $0.5 < \mathrm{HMCR} < 1$; as the dimension of the problem increases, the value of HMCR increases and the harmony memory is used more frequently, and vice versa.

It should also be noted that musicians fine-tune their playing throughout an improvisation, and the tuning range is wider at the beginning and narrower as the piece is completed. This phenomenon inspired me to construct a dynamic fine-tuning bandwidth $\mathrm{Bw}_i$ for each variable $x_i$, with lower and upper bounds $L_i$ and $U_i$, at iteration $t$:

where fstd denotes the standard deviation of the objective function values in the current HM. The value of $\mathrm{Bw}_i$ decreases gradually as the iteration number increases and grows with the range of the corresponding decision variable. The domain of each decision variable corresponds to the pitch range of a musical instrument; the gradual decrease in the adjustment range as the iteration number increases corresponds to the gradual narrowing of the fine-tuning bandwidth as the improvisation proceeds. The range-dependent term is analogous to the unit of fine-tuning bandwidth, and 0.0001 represents the minimal adjustment. A small fstd indicates only a slight difference between the merits of the harmony vectors in the HM. In fact, at the late stages of a musical improvisation, the music gradually mellows and the fine-tuning range used is very narrow.
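
Because the exact expression (11) is not reproduced above, the following sketch only illustrates the qualitative behaviour described for it: the bandwidth scales with the variable's range, shrinks as the iteration number grows, and collapses to the minimal adjustment 0.0001 once the HM has nearly converged. The 1/100 scaling factor and the function name are purely assumptions of this sketch, not the paper's formula.

def bandwidth(lower, upper, t, ni, fstd, eps=1e-4):
    # Hypothetical stand-in for (11): range-proportional, iteration-decreasing,
    # and floored at the minimal adjustment eps when the HM has nearly converged.
    if fstd > eps:
        return (upper - lower) / 100.0 * (1.0 - t / ni)
    return eps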

3.2. Design of the Self-Adaptive Mechanism

For every metaheuristic algorithm, accelerating convergence and preventing premature convergence appear contradictory. To resolve this contradiction, the evolutionary mechanism of the HS algorithm has to be properly revised. It should again be noted that fine-tuning occurs throughout a musician's improvisation and that the fine-tuning range is wider at the early stages and narrower at later stages. This phenomenon can therefore be mimicked by the following evolutionary operations:

where $r$ denotes a uniform random number in $[0, 1]$. The standard deviation of the objective function values, fstd, represents the overall merit of the harmony vectors in the current HM. If the standard deviation is less than a given threshold of 0.0001, that is, if fstd < 0.0001, then the difference between the function values is very small. If, in addition, the iteration number is sufficiently large, then the distribution of the harmony vectors is centralised; thus, a new random value should be generated within a correspondingly narrow range for rapid convergence.

3.3. Evolutionary Process of the NSHS Algorithm

On the basis of the novel parameter-setting method and the self-adaptive evolutionary mechanism, the computational procedure of the proposed variant is given in Algorithm 1.

NSHS Algorithm
Initialise the problem and algorithm parameters:
 Define a standard fitness function f(x) by (1)
 Define the parameters HMCR by (10), Bw by (11), HMS, and NI
 Randomly generate an initial HM and calculate fstd
Search for the best solution:
While (the iteration counter t ≤ NI) do
  Generate a new harmony vector x' by the following loop
  While (the variable index i ≤ n) do
    First, generate a new value for variable x'_i by the following rules:
    If (rand ≤ HMCR), then choose uniformly a value from the current memory for variable x'_i
    Else, generate x'_i randomly
       If (fstd > 0.0001), then generate x'_i randomly in the range [L_i, U_i]
       Else, generate x'_i randomly in a narrow range (see Section 3.2)
       End if
    End if
    Second, adjust the new value of variable x'_i by the following rules:
    If (fstd > 0.0001), then
       fine-tune x'_i within the bandwidth Bw_i given by (11)
    Else, generate x'_i randomly in a narrow range (see Section 3.2)
    End if
  End while
  Assess the merit of the new harmony vector and update the current harmony memory:
  If (the fitness of x' is better than the worst fitness in the HM), then
    Accept the new harmony vector and replace the worst vector in the HM with it.
    Recalculate the standard deviation of the fitness function values fstd.
  End if
End while
Determine the optimal solution in the HM by the minimum fitness.
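
To complement Algorithm 1, the following Python sketch implements an NSHS-style loop with the same structure. Because (10) and (11) are not reproduced here, hmcr_value and bandwidth are simplified stand-ins that only preserve the qualitative behaviour described in Section 3.1, and the narrow range around the best harmony is likewise an assumption; this is an illustrative sketch, not the reference implementation.

import random
import statistics

def hmcr_value(n):
    # Assumption: a monotone mapping from dimension n into (0.5, 1), as required by (10).
    return 1.0 - 1.0 / (n + 1.0)

def bandwidth(lower, upper, t, ni, fstd, eps=1e-4):
    # Assumption: range-proportional, iteration-decreasing stand-in for (11).
    return (upper - lower) / 100.0 * (1.0 - t / ni) if fstd > eps else eps

def nshs(fitness, bounds, hms=10, ni=50000, eps=1e-4):
    n = len(bounds)
    hmcr = hmcr_value(n)
    hm = [[random.uniform(lo, up) for lo, up in bounds] for _ in range(hms)]
    fit = [fitness(v) for v in hm]
    for t in range(1, ni + 1):
        fstd = statistics.pstdev(fit)
        best = hm[min(range(hms), key=fit.__getitem__)]
        new = []
        for i, (lo, up) in enumerate(bounds):
            if random.random() <= hmcr:
                x = random.choice(hm)[i]                 # memory consideration
            elif fstd > eps:
                x = random.uniform(lo, up)               # global randomisation
            else:
                x = best[i] + random.uniform(-eps, eps)  # assumed narrow range
            # Fine-tuning is always applied (PAR is discarded in the NSHS).
            if fstd > eps:
                x += random.uniform(-1.0, 1.0) * bandwidth(lo, up, t, ni, fstd)
            else:
                x = best[i] + random.uniform(-eps, eps)  # assumed narrow range
            new.append(min(max(x, lo), up))
        worst = max(range(hms), key=fit.__getitem__)
        f_new = fitness(new)
        if f_new < fit[worst]:                           # replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    best_idx = min(range(hms), key=fit.__getitem__)
    return hm[best_idx], fit[best_idx]

For instance, nshs(lambda x: sum(v * v for v in x), [(-100.0, 100.0)] * 10, ni=20000) searches a 10-dimensional sphere function under these assumptions.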

4. Benchmark Functions

This section lists the global minimization benchmark functions used to evaluate the performance of the proposed algorithm. The test suite comprises six well-known unimodal benchmark functions, six well-known multimodal benchmark functions, and six constrained optimization functions. To evaluate the robustness of a stochastic search algorithm, the twelve unconstrained benchmark functions are randomly shifted and orthogonally rotated, and four of the benchmark functions have normally distributed noise added to their fitness.

4.1. Unimodal Benchmark Functions

Table 1 summarizes the shifted rotated unimodal benchmark functions. Figure 2 depicts their 3D shapes when the dimension equals two.

4.2. Multimodal Benchmark Functions

Table 2 summarizes the shifted rotated multimodal benchmark functions. Figure 3 depicts their 3D shapes when the dimension equals two.

4.3. Constrained Benchmark Functions

The six constrained optimization functions have also been described in the literature [2]; thus, we give only the corresponding penalty functions defined in this study. It should be noted that these penalty functions are composed of many penalty terms in order to control every violation precisely.

For each of the six constrained functions f13–f18, the penalty function adds a separate penalty term for every violated inequality constraint g_j(x) ≤ 0 and for every violated equality constraint h_j(x) = 0; the known best solution and its objective value for each function are those reported in [2].
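
As an illustration of how such a penalty-augmented fitness can be assembled (the penalty weight, the equality tolerance, and the function name are assumptions of this sketch, not values taken from the paper):

def penalised_fitness(objective, ineq_constraints, eq_constraints, x,
                      weight=1e3, tol=1e-4):
    # One penalty term per violated constraint, as Section 4.3 describes.
    penalty = sum(max(0.0, g(x)) for g in ineq_constraints)            # g_j(x) <= 0
    penalty += sum(max(0.0, abs(h(x)) - tol) for h in eq_constraints)  # h_j(x) = 0
    return objective(x) + weight * penalty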

5. Robustness of the Proposed NSHS Algorithm

The performance of a stochastic search algorithm is often affected by the initial population. For the HS algorithm, the initial population distribution is determined by the value of HMS and the initial search ranges. Thus, this section investigates the robustness of the proposed variant from these two aspects.

5.1. Effect of HMS

Yang [21] stated that the performance of the original HS algorithm is insensitive to the value of HMS. To verify that the value of HMS likewise has no significant impact on the performance of the NSHS, a full-factorial experiment was conducted. In the experiment, HMS was treated as a fixed factor with four levels (5, 10, 20, and 50); the twelve unconstrained benchmark functions described above (with dimension 10) and the six constrained functions were employed; the simulation was repeated 30 times, and each run used 50,000 evaluations of the objective function. The relative error between the obtained result and the global or known optimum was used as the dependent variable. The descriptive statistics of these errors are shown in Table 3.

Clearly, only tiny differences exist. However, the univariate analysis of variance reported by SPSS in Table 4 shows that these differences are attributable to the function tested rather than to HMS. Therefore, the performance of the proposed NSHS algorithm is not affected by the value of HMS. To save storage space, a small HMS is recommended; as Pan et al. [17] explained, "since HM is emulating musician's short-term memory and the short-term memory of the human is known to be small, it is logical."

5.2. Effect of Initial Search Ranges

To identify whether the performance of the proposed variant depends on the initial HM, a full-factorial experiment was conducted with HMS = 10 and NI = 50,000. In the experiment, the initial HM was randomly generated over three types of initial search ranges: the whole domain, its left part, and its right part. The experiment was repeated 30 times, and 1620 observations were collected. The descriptive statistics of the relative errors of the proposed variant are shown in Table 5.

Obviously, small differences still exist. However, the SPSS test report shown in Table 6 again indicates that these differences are due to the function tested rather than to the initial search ranges.

According to the analysis-of-variance results in Tables 4 and 6, the performance of the proposed NSHS algorithm does not depend on the initial HM. Moreover, the descriptive statistics in Tables 3 and 5 show that, regardless of how the initial HM is generated, the NSHS algorithm can stably locate the global optima of these benchmark functions with a tiny error, except for function f2. Therefore, the novel variant is considerably robust.

6. Superiority of the Proposed NSHS Algorithm

To confirm the significant superiority of the proposed variant, a comparison test was conducted. The test compared the performance of the NSHS with the original HS algorithm [1] and the other variants mentioned in Section 2, including Mahdavi's IHS [14], Wang's self-adaptive variant (denoted HSw for convenience) [16], and Pan's DLHS [18] and SGHS [17]. For a fair comparison, the parameter settings are those proposed by their developers, as shown in Table 7.

For sufficiently persuasive comparative results, every algorithm was run independently 30 times on each of the aforementioned 18 benchmark functions on the same computer (CPU: AMD 4 × 3.5 GHz; RAM: 4 GB), and, at the start of each problem, all algorithms used the same initial HM. The conclusions are based on an analysis of variance rather than a pairwise t-test, because García et al.'s study indicated that the latter method is suitable only for single-problem analysis and that multiple-problem analysis should employ a nonparametric test [22]. Next, the experimental results are elaborated from two aspects.
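
García et al. [22] recommend nonparametric tests for multiple-problem comparisons. Purely as an illustration of one such test (not necessarily the exact procedure used in this study), a Friedman test over per-function mean errors can be run as follows; the error values below are made-up placeholders used only to show the call.

from scipy.stats import friedmanchisquare

# Hypothetical per-function mean relative errors for three of the compared algorithms.
errors = {
    "NSHS": [0.001, 0.002, 0.000, 0.004],
    "HS":   [0.120, 0.090, 0.030, 0.200],
    "IHS":  [0.080, 0.070, 0.020, 0.150],
}
stat, p_value = friedmanchisquare(errors["NSHS"], errors["HS"], errors["IHS"])
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")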

6.1. Unconstrained Benchmark Functions

The performances of these algorithms on the twelve unconstrained benchmark functions were investigated at different scales, namely, 10, 100, and 1000 dimensions. Tables 8, 10, and 12 summarize the descriptive statistics at the respective scales. For the 10-dimension benchmark functions, the superiority of the NSHS is not significant except for functions f3 and f4, as shown in Table 8. The test result in Table 9 indicates that the performance of the NSHS is significantly better than that of the IHS at the 95% confidence level and that the average error of the NSHS is smaller than those of the other algorithms except the DLHS. For the 100-dimension benchmark functions, the average error of the NSHS is much smaller than those of the other algorithms except the SGHS, as shown in Table 11. Based on the data in Tables 12 and 13, the superiority of the proposed variant is extremely significant for the 1000-dimension benchmark functions.

6.2. Constrained Benchmark Functions

The performances of these algorithms on the aforementioned constrained optimization problems were also investigated. The descriptive statistics and the report on multiple comparisons are shown in Tables 14 and 15, respectively. Clearly, the performance of the proposed variant is vastly superior to those of the HS, the IHS, the HSw, and the SGHS. Although the NSHS is only slightly better than the DLHS overall, its evolutionary mechanism is simpler and it does not require setting as many key parameters. Moreover, the performance of the NSHS is significantly better than that of the DLHS for functions f15 and f18, as shown in Table 14.

7. Conclusion

This paper presented a novel self-adaptive harmony search algorithm (NSHS) for optimization problems with continuous variables, with the aim of improving the performance of harmony search. To investigate the performance of the novel variant, many comparison experiments were conducted on a test suite comprising six unimodal benchmark functions, six multimodal benchmark functions, and six constrained benchmark functions. Using nonparametric statistical tests, the robustness and superiority of the NSHS were confirmed. Experiments on the effect of HMS and of the initial search ranges indicated that the performance of the NSHS does not depend on the initial HM. The comparison with related previous work showed that the NSHS performs dramatically better than the other algorithms for large-scale problems and constrained problems.

The originality of this work is mainly reflected in two aspects.
(i) A simpler approach to setting the key parameters, based on musical practice, is proposed. The proposed variant has only two key parameters, HMCR and Bw, and neither requires user input; their settings follow the features of the optimization problem to be solved, such as its scale and boundaries. Moreover, this approach discards the PAR parameter and thus further simplifies parameter setting: musical improvisation is always accompanied by fine-tuning, so it is senseless to consider this parameter and the fine-tuning bandwidth at the same time.
(ii) A novel self-adaptive evolutionary scheme, based on the standard deviation of the fitness values in the memory, is developed to accelerate convergence and help reach a global optimum. This scheme automatically narrows the search space according to a suitable threshold and thereby resolves the balance between search diversity and intensity.

Acknowledgments

Many thanks to the anonymous reviewer for valuable comments that helped to improve this paper. This work was partly supported by the National Natural Science Foundation of China under Grants 71101003, 91224007, and 71332003 and by the Fundamental Research Funds for the Central Universities under Grant YWF-13-D2-JC-12.