Abstract

Water wave optimization (WWO) is a novel metaheuristic inspired by shallow water wave theory; it has a simple structure, is easy to implement, and performs well even with a small population. To further improve its convergence speed and calculation precision, this paper proposes an elite opposition-based water wave optimization (EOBWWO) algorithm and applies it to function optimization and structural engineering design problems. The improvement comprises three major optimization strategies: an elite opposition-based (EOB) learning strategy enhances population diversity, a local neighborhood search strategy strengthens the local search in the breaking operation, and an improved propagation operator gives the algorithm a better balance between exploration and exploitation. EOBWWO is verified on 20 benchmark functions and two structural engineering design problems, and its performance is compared against state-of-the-art algorithms. Experimental results show that the proposed algorithm has faster convergence, higher calculation precision (even obtaining the exact solution on some benchmark functions), and greater stability than the comparative algorithms.

1. Introduction

Optimization problems are wide and varied; in many areas of scientific and engineering computation, a majority of the problems people encounter can be formulated as objective optimization problems, so the study of optimization has long been a very active field. Optimization methods can be divided into two types, deterministic and stochastic. Although deterministic methods are relatively mature, their applicability conditions are restrictive and they have difficulty with large-scale problems, which has prompted the development of stochastic methods, especially heuristic optimization methods.

In recent years, heuristic optimization algorithms, especially metaheuristic algorithms, have attracted the attention of many researchers. Metaheuristic algorithms originate from the simulation of various physical, biological, social, and other natural phenomena to solve optimization problems. As stochastic optimization methods, metaheuristics are simple and general, robust, suitable for parallel processing, and widely applicable. Owing to these advantages, numerous algorithms have been proposed recently, such as particle swarm optimization (PSO) [1], the genetic algorithm (GA) [2], ant colony optimization (ACO) [3], the artificial bee colony (ABC) [4], cuckoo search (CS) [5], the bat algorithm (BA) [6], the firefly algorithm (FA) [7], the flower pollination algorithm (FPA) [8], and water wave optimization (WWO) [9].

Water wave optimization (WWO) is a relatively new metaheuristic for global optimization, proposed by Zheng in 2015 [9] and inspired by shallow water wave theory [10]. WWO has a simple framework and is therefore easy to implement, and it performs well even with a small population size [9]. As a new metaheuristic optimization method, WWO has already been applied successfully to optimization problems such as high-speed train scheduling [9] and the traveling salesman problem (TSP) [11].

Several modified versions of WWO have been introduced to strengthen its performance. Zhang et al. [12] proposed a variant with variable population size (VC-WWO) and developed a comprehensive learning mechanism in the refraction operator to increase solution diversity. Zheng and Zhang [13] developed a simplified version of WWO (Sim-WWO) that omits the refraction operator and introduces a population-size reduction strategy to better balance exploration and exploitation and to partially compensate for the removed operator. To apply WWO to a combinatorial optimization problem, the traveling salesman problem (TSP), Wu et al. [11] redefined the propagation, breaking, and refraction operators of the original WWO. In this paper, an improved water wave optimization algorithm based on elite opposition-based learning (EOBWWO) is applied to function optimization and structural engineering design problems. The improvements comprise three parts: an elite opposition-based learning (EOBL) strategy enhances population diversity, a local neighborhood search strategy strengthens the local search in the breaking operation, and an improved propagation operator gives the algorithm a better balance between exploration and exploitation. We tested the performance of EOBWWO on 20 benchmark functions and two structural engineering design problems. The experimental results show that the proposed algorithm has significant performance advantages, including fast convergence and high calculation precision; moreover, it obtains the exact solution on some test functions.

This paper is organized as follows. Section 2 briefly introduces the original WWO algorithm, Section 3 presents a detailed description of the EOBWWO algorithm, Section 4 describes the simulation experiments and discusses the results, and Section 5 concludes.

2. Water Wave Optimization (WWO) Algorithm

The water wave optimization (WWO) algorithm, developed by Zheng [9], is inspired by shallow water wave theory; each individual in the population is analogous to a "water wave" object with a wave height $h$ and a wavelength $\lambda$. Without loss of generality, suppose there is a maximization problem with objective function $f$. The practical problem can then be compared with the shallow water wave model through the following correspondence: the search space is analogous to the seabed area, a point in the search space to a water wave, and the fitness of the point to the (inverse of the) seabed depth at that point.

When the population is initialized, the wave height of each wave is set to a constant $h_{\max}$ and the wavelength $\lambda$ is generally set to 0.5. The fitness of each water wave is inversely proportional to its vertical distance to the seabed: the closer a wave is to the seabed (i.e., the shallower the water), the higher its fitness, the larger its wave height, and the smaller its wavelength, as illustrated in Figure 1. During the optimization process, the algorithm searches the solution space globally by simulating the propagation, breaking, and refraction operations of water waves.

2.1. Propagation

In WWO, every water wave is propagated once at each generation. Assume the original water wave is $x$, $x'$ is the new wave created by the propagation operator, and $D$ is the dimension of the maximization problem. The propagation operation shifts each dimension $d$ ($1 \le d \le D$) of the original water wave as

$$x'(d) = x(d) + \mathrm{rand}(-1, 1) \cdot \lambda\, L(d), \tag{2}$$

where $\mathrm{rand}(-1,1)$, used to control the propagation step, is a uniformly distributed random number in $[-1, 1]$, $\lambda$ is the wavelength, and $L(d)$ is the length of the $d$th dimension of the search space. If the new position $x'(d)$ falls outside the $d$th dimension of the search space, it is reset to a random position as

$$x'(d) = lb(d) + \mathrm{rand}(0, 1) \cdot \big(ub(d) - lb(d)\big), \tag{3}$$

where $lb(d)$ and $ub(d)$ are the lower and upper bounds of the $d$th dimension of the search space and $\mathrm{rand}(0,1)$ is a random number in $[0, 1]$.

After propagation, we evaluate the fitness of $x'$. If $f(x') > f(x)$, then $x'$ replaces $x$ in the population and the wave height of $x'$ is reset to $h_{\max}$; otherwise $x$ is retained and, to simulate the energy dissipation of a wave during propagation, its height is decreased by one.
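To make the operator concrete, the following is a minimal Python/NumPy sketch of equations (2) and (3); the function and parameter names are illustrative, not the authors' implementation:

import numpy as np

def propagate(x, wavelength, lb, ub, rng):
    # Equation (2): shift every dimension by a random fraction of the
    # wavelength, scaled by the length L(d) of that dimension.
    L = ub - lb
    x_new = x + rng.uniform(-1.0, 1.0, x.size) * wavelength * L
    # Equation (3): reset out-of-range coordinates to a random position.
    out = (x_new < lb) | (x_new > ub)
    x_new[out] = lb[out] + rng.uniform(0.0, 1.0, out.sum()) * L[out]
    return x_new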

It is a natural phenomenon that when a wave travels from deep water to shallow water, its wave height increases and its wavelength decreases, as illustrated in Figure 1. To simulate this phenomenon, WWO updates the wavelength of each wave after each generation as follows:

$$\lambda = \lambda \cdot \alpha^{-\left(f(x) - f_{\min} + \epsilon\right)/\left(f_{\max} - f_{\min} + \epsilon\right)}, \tag{4}$$

where $\alpha$ is a control parameter called the wavelength reduction coefficient, $f_{\max}$ and $f_{\min}$ are the maximum and minimum fitness values in the current population, respectively, and $\epsilon$ is a very small positive constant to avoid division by zero.
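A corresponding sketch of the wavelength update of equation (4), operating on NumPy arrays; alpha = 1.0026 follows the setting reported for WWO in [9], and eps is any tiny positive constant:

def update_wavelengths(wavelengths, fitness, alpha=1.0026, eps=1e-31):
    # Equation (4): higher-fitness waves get exponents closer to -1 and
    # therefore shorter wavelengths (since alpha > 1).
    f_min, f_max = fitness.min(), fitness.max()
    exponent = -(fitness - f_min + eps) / (f_max - f_min + eps)
    return wavelengths * alpha ** exponent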

2.2. Breaking

In water wave theory, as the energy of a wave increases its crest becomes steeper and steeper, and the wave breaks into a series of solitary waves when the velocity of its crest exceeds the wave celerity. After propagation, WWO performs breaking only on a wave $x'$ that is a new best solution $x^*$, which improves the diversity of the population. The detailed process is as follows: first, $k$ dimensions are selected at random (where $k$ is a random number between 1 and a predefined number $k_{\max}$), and then a solitary wave is generated for each selected dimension $d$ of the original wave as

$$x'(d) = x(d) + N(0, 1) \cdot \beta\, L(d), \tag{5}$$

where $N(0,1)$ is a Gaussian random number with mean 0 and standard deviation 1 and $\beta$ is the breaking coefficient. If the solitary wave with the best fitness is better than $x^*$, it replaces $x^*$; otherwise $x^*$ is retained.
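A sketch of the breaking operator of equation (5) under the same assumptions (maximization, illustrative names); each selected dimension yields one solitary wave:

def breaking(x_star, beta, k_max, lb, ub, f, rng):
    # Choose k dimensions at random, 1 <= k <= k_max.
    k = rng.integers(1, min(k_max, x_star.size) + 1)
    dims = rng.choice(x_star.size, size=k, replace=False)
    best, best_f = x_star, f(x_star)
    for d in dims:
        wave = x_star.copy()
        # Equation (5): Gaussian step scaled by the breaking coefficient beta.
        wave[d] += rng.standard_normal() * beta * (ub[d] - lb[d])
        wave[d] = np.clip(wave[d], lb[d], ub[d])
        if f(wave) > best_f:          # keep the best solitary wave
            best, best_f = wave, f(wave)
    return best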

2.3. Refraction

In WWO, the refraction operation is performed only on a wave whose height has decreased to zero, to avoid search stagnation; it simulates the deflection that occurs when a wave ray is not perpendicular to the isobath. Refraction computes each dimension of the new wave $x'$ as a Gaussian random number centered halfway between the original position and the best-known solution $x^*$:

$$x'(d) = N\!\left(\frac{x^*(d) + x(d)}{2},\; \frac{\left|x^*(d) - x(d)\right|}{2}\right). \tag{6}$$

After refraction, the wave height of $x'$ is also reset to $h_{\max}$, and its wavelength is updated as follows:

$$\lambda' = \lambda\, \frac{f(x)}{f(x')}. \tag{7}$$
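A sketch of the refraction operator of equations (6) and (7), again with illustrative names:

def refract(x, x_star, wavelength, f, rng):
    # Equation (6): Gaussian point centred halfway between x and x*.
    x_new = rng.normal((x_star + x) / 2.0, np.abs(x_star - x) / 2.0)
    # Equation (7): rescale the wavelength by the fitness ratio.
    return x_new, wavelength * f(x) / f(x_new)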

To sum up, the role of the propagation operator is to make high-fitness waves exploit small areas and low-fitness waves explore large areas, the breaking operator enhances the local search around the promising best wave, and the refraction operation helps avoid search stagnation and thus reduces premature convergence. The basic framework of WWO is given in Algorithm 1 [9].

Input: Define objective function $f(x)$;
Output: The best solution $x^*$;
(1) Initialization: Initialize parameters including $h_{\max}$, $\alpha$, $\beta$, and $k_{\max}$; randomly initialize a population $P$ of $n$ waves;
(2) while stop criterion is not satisfied do
(3)   for each $x \in P$ do
(4)     Propagate $x$ to a new $x'$ based on equation (2);
(5)    if $f(x') > f(x)$ then
(6)      if $f(x') > f(x^*)$ then
(7)      Break $x'$ based on equation (5);
(8)      Update $x^*$ with $x'$;
(9)     endif
(10)      Replace $x$ with $x'$;
(11)    else
(12)      Decrease $x.h$ by one;
(13)     if $x.h = 0$ then
(14)      Refract $x$ to a new $x'$ based on equation (6) and equation (7);
(15)     endif
(16)    endif
(17)      Update the wavelength $\lambda$ based on equation (4);
(18)   endfor
(19) endwhile
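Putting the sketches above together gives a compact WWO driver in the spirit of Algorithm 1; the population size, h_max, beta, and k_max values below are illustrative defaults, not the tuned settings of [9]:

def wwo(f, lb, ub, n_waves=5, h_max=6, max_iter=1000, seed=0):
    rng = np.random.default_rng(seed)
    pop = lb + rng.uniform(0.0, 1.0, (n_waves, lb.size)) * (ub - lb)
    heights = np.full(n_waves, h_max)
    lam = np.full(n_waves, 0.5)                      # initial wavelength 0.5
    fit = np.array([f(x) for x in pop])
    best = pop[fit.argmax()].copy()
    for _ in range(max_iter):
        for i in range(n_waves):
            x_new = propagate(pop[i], lam[i], lb, ub, rng)
            if f(x_new) > fit[i]:
                if f(x_new) > f(best):               # new best: breaking
                    best = breaking(x_new, 0.01, 12, lb, ub, f, rng)
                pop[i], fit[i], heights[i] = x_new, f(x_new), h_max
            else:
                heights[i] -= 1
                if heights[i] == 0:                  # stagnant wave: refract
                    pop[i], lam[i] = refract(pop[i], best, lam[i], f, rng)
                    fit[i], heights[i] = f(pop[i]), h_max
        lam = update_wavelengths(lam, fit)
    return best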

3. The EOBWWO Algorithm

To improve the global and local search abilities of WWO and to achieve an even better balance between exploration and exploitation, three optimization strategies are applied to the original WWO: the elite opposition-based learning (EOBL) strategy [42], the local neighborhood search (LNS) strategy [43], and an improved propagation operator.

3.1. Elite Opposition-Based Learning (EOBL) Strategy

The optimization process of the WWO algorithm can be regarded as a continual transformation of its search space. When the algorithm falls into a local optimum, the region being searched may no longer contain the global optimal solution, so it is very important to guide the current solution space toward the region of the global optimum. To enhance the global search (i.e., exploration) ability of WWO, the elite opposition-based learning (EOBL) strategy is introduced.

Before introducing EOBL, we first explain opposition-based learning (OBL) [44]. The main idea of OBL is to generate the opposition solution of the current solution, evaluate the current solution and the opposition solution at the same time, and choose the better one to enter the next iteration. Assume $X = (x_1, x_2, \ldots, x_D)$ is a point in the current population, where $D$ is the dimension of the search space and $x_j \in [lb_j, ub_j]$; its opposition point $\bar{X} = (\bar{x}_1, \bar{x}_2, \ldots, \bar{x}_D)$ is defined as

$$\bar{x}_j = lb_j + ub_j - x_j. \tag{8}$$

Since the opposition solution generated by OBL may not be more favorable than the current search space for finding the global optimum, in this paper we use the elite opposition-based learning (EOBL) strategy instead. EOBL is a new technique in the field of intelligence computation, and its model can be described as follows. Suppose the elite individual (the optimal individual) in the current population is $X_e = (x_{e,1}, x_{e,2}, \ldots, x_{e,D})$; for an individual $X_i = (x_{i,1}, x_{i,2}, \ldots, x_{i,D})$, the elite opposition solution $\breve{X}_i = (\breve{x}_{i,1}, \breve{x}_{i,2}, \ldots, \breve{x}_{i,D})$ of $X_i$ is defined as

$$\breve{x}_{i,j} = k \cdot (da_j + db_j) - x_{i,j}, \quad i = 1, 2, \ldots, n,\; j = 1, 2, \ldots, D, \tag{9}$$

where $n$ is the population size, $k \in (0, 1)$ is a generalized coefficient, and $[da_j, db_j]$ is the dynamic boundary of the $j$th dimension of the search space, obtained by

$$da_j = \min_i\,(x_{i,j}), \qquad db_j = \max_i\,(x_{i,j}). \tag{10}$$

A fixed boundary is not conducive to preserving the search experience; we therefore replace it with the dynamic boundary of the search space, so that the opposition solutions are located in the gradually narrowing region actually being searched. Moreover, if the dynamic-boundary operator makes $\breve{x}_{i,j}$ jump out of $[lb_j, ub_j]$, it is reset by

$$\breve{x}_{i,j} = \mathrm{rand}(da_j, db_j). \tag{11}$$

EOBL generates the opposition population according to the elite individual and evaluates the current population and the opposition population at the same time. It makes full use of the fact that elite individuals carry more useful search information than ordinary individuals, which improves the diversity of the population to a certain extent. EOBL can thus enhance the global exploration ability of WWO.
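The following Python sketch implements equations (9)–(11) under the assumptions above; the per-dimension dynamic bounds are taken over the whole current population, and the names are illustrative:

def elite_opposition(pop, lb, ub, rng):
    # Equation (10): dynamic boundary of each dimension.
    da, db = pop.min(axis=0), pop.max(axis=0)
    # Equation (9): elite opposition with one generalized coefficient k.
    k = rng.uniform(0.0, 1.0)
    opp = k * (da + db) - pop
    # Equation (11): reset coordinates that jump out of [lb, ub].
    out = (opp < lb) | (opp > ub)
    rand = da + rng.uniform(0.0, 1.0, opp.shape) * (db - da)
    opp[out] = rand[out]
    return opp

The opposition population returned here is then evaluated together with the current population, and the fitter individuals enter the next generation.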

3.2. Local Neighborhood Search (LNS)

WWO performs the breaking operator only on the new best solution, to enhance the local search around it. To further enhance the local search ability and improve the convergence speed, a local neighborhood search (LNS) model [43] is added before the breaking operation.

The main idea of LNS is to update the current solution using the best solution found so far in a small neighborhood of the current solution, rather than in the entire population. The experience of an individual's neighborhood is considered when updating the individual's location, and the graph of these interconnections is called the neighborhood structure. Suppose the WWO population is $P = \{X_1, X_2, \ldots, X_n\}$, where each $X_i$ is a $D$-dimensional vector. The indices of the vectors are assigned randomly, in order to maintain the diversity of each neighborhood. For each vector $X_i$ we define a neighborhood of radius $k$ ($k$ is a nonzero integer with $2k + 1 \le n$) consisting of $X_{i-k}, \ldots, X_i, \ldots, X_{i+k}$. For this purpose we suppose the vectors are arranged on a ring topology according to their indices, so that $X_n$ and $X_1$ are neighbors; Figure 2 illustrates the concept of the local neighborhood model. Note that the neighborhood topology is static and is defined on the set of vector indices once and for all. The LNS model is described by

$$L_i = X_i + \alpha \cdot (X_{n\_best_i} - X_i) + \beta \cdot (X_p - X_q), \tag{12}$$

where $X_{n\_best_i}$ is the best vector in the neighborhood of $X_i$, $p, q \in [i-k, i+k]$ with $p \neq q \neq i$, and $\alpha$ and $\beta$ are the scaling factors. In the improved version of WWO, the new best solution is updated according to (12), and the breaking operation is then performed around the updated solution as

$$x'(d) = L^*(d) + N(0, 1) \cdot \beta_b\, L(d), \tag{13}$$

where $L^*$ is the best solution updated by LNS and $\beta_b$ is the breaking coefficient of equation (5).
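A sketch of the LNS update of equation (12), assuming a ring topology over the population indices; the neighborhood radius k and the scaling factors alpha and beta are illustrative parameters:

def local_neighborhood_search(pop, fitness, i, k, alpha, beta, rng):
    n = len(pop)
    # Ring neighborhood of radius k around index i.
    idx = [(i + j) % n for j in range(-k, k + 1)]
    n_best = pop[max(idx, key=lambda j: fitness[j])]   # best neighbor
    p, q = rng.choice([j for j in idx if j != i], size=2, replace=False)
    # Equation (12): attraction to the neighborhood best plus a
    # difference vector of two random neighbors.
    return pop[i] + alpha * (n_best - pop[i]) + beta * (pop[p] - pop[q])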

3.3. Improvement of WWO

In the original WWO, every water wave is propagated once at each generation, and the search behavior of each wave is affected by the other waves in the group. Similar to PSO [1, 45, 46], an inertial weight $w$ is embedded into (2) in order to exploit past experience. Moreover, as shown in (2) and (4), the propagation operator makes high-fitness waves search small regions and low-fitness waves explore large regions during the global search. In (2), the search step size is a random number fixed in the range $[-1, 1]$, which is not very reasonable: the search step should be fairly large at the beginning, to increase the probability of reaching the optimal regions, and should then decrease gradually as the iterations proceed, to enhance the local exploitation ability. Referring to the method of [47], the random step size of the propagation process is improved, and the improved propagation operator is

$$x'(d) = w \cdot x(d) + \mathrm{rand}(-s, s) \cdot \lambda\, L(d), \quad
w = w_{\max} - (w_{\max} - w_{\min})\,\frac{t}{T_{\max}}, \quad
s = c_1 - (c_1 - c_2)\,\frac{t}{T_{\max}}, \tag{14}$$

where $w_{\max}$ and $w_{\min}$ are the maximum and minimum inertial weights with $w_{\max} > w_{\min}$, $t$ is the current iteration number, $T_{\max}$ is the maximum iteration number, and $c_1$ and $c_2$ are two selected constants with $c_1 > c_2$. The improved propagation operator not only makes use of past experience but also lets the search step size decrease gradually as the iterations proceed. The whole pseudocode of EOBWWO is summarized in Algorithm 2.
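A sketch of the improved propagation operator as reconstructed in equation (14); since the exact decay schedule could not be recovered from the source, the linear schedules for w and s below, and all default values, are assumptions:

def improved_propagate(x, wavelength, lb, ub, t, t_max, rng,
                       w_max=0.9, w_min=0.4, c1=1.0, c2=0.1):
    # Inertial weight and step bound both shrink linearly with iteration t.
    w = w_max - (w_max - w_min) * t / t_max
    s = c1 - (c1 - c2) * t / t_max
    L = ub - lb
    x_new = w * x + rng.uniform(-s, s, x.size) * wavelength * L
    # Reset out-of-range coordinates, as in equation (3).
    out = (x_new < lb) | (x_new > ub)
    x_new[out] = lb[out] + rng.uniform(0.0, 1.0, out.sum()) * L[out]
    return x_new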

Input: Define objective function $f(x)$;
Output: The best solution $x^*$;
(1) Initialization: Initialize related parameters including $h_{\max}$, $\alpha$, $\beta$, $k_{\max}$, $k$, $w_{\max}$, $w_{\min}$, $c_1$, and $c_2$; initialize the
  dynamic boundary of the search space; randomly initialize a population $P$ of $n$ waves
(2) while stop criterion is not satisfied do
(3)   Update the current population with EOBL according to equation (9), equation (10) and equation (11);
(4)   for each $x \in P$ do
(5)    Propagate $x$ to a new $x'$ based on equation (14);
(6)    if $f(x') > f(x)$ then
(7)     if $f(x') > f(x^*)$ then
(8)       Break $x'$ based on equation (12) and equation (13);
(9)       Update $x^*$ with $x'$;
(10)     endif
(11)     Replace $x$ with $x'$;
(12)    else
(13)     Decrease $x.h$ by one;
(14)     if $x.h = 0$ then
(15)      Refract $x$ to a new $x'$ based on equation (6) and equation (7);
(16)     endif
(17)    endif
(18)      Update the wavelength $\lambda$ based on equation (4);
(19)   endfor
(20) endwhile
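One generation of EOBWWO, combining the sketches above in the order of Algorithm 2; a simplified per-individual form of the EOBL selection is used here for brevity, and all numeric settings are illustrative:

def eobwwo_step(pop, fit, heights, lam, best, f, lb, ub, t, t_max, h_max, rng):
    # Line (3): EOBL — keep an individual's elite opposition if it is fitter.
    opp = elite_opposition(pop, lb, ub, rng)
    for i in range(len(pop)):
        if f(opp[i]) > fit[i]:
            pop[i], fit[i] = opp[i], f(opp[i])
    for i in range(len(pop)):
        # Line (5): improved propagation, equation (14).
        x_new = improved_propagate(pop[i], lam[i], lb, ub, t, t_max, rng)
        if f(x_new) > fit[i]:
            if f(x_new) > f(best):
                # Lines (7)-(9): LNS refinement, then breaking, eqs. (12)-(13).
                lns = np.clip(local_neighborhood_search(pop, fit, i, 2,
                                                        0.5, 0.5, rng), lb, ub)
                cand = lns if f(lns) > f(x_new) else x_new
                best = breaking(cand, 0.01, 12, lb, ub, f, rng)
            pop[i], fit[i], heights[i] = x_new, f(x_new), h_max
        else:
            heights[i] -= 1
            if heights[i] == 0:
                pop[i], lam[i] = refract(pop[i], best, lam[i], f, rng)
                fit[i], heights[i] = f(pop[i]), h_max
    lam = update_wavelengths(lam, fit)                 # line (18), eq. (4)
    return pop, fit, heights, lam, best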

4. Simulation Experiments and Results Analysis

To assess the effectiveness and efficiency of EOBWWO, 20 standard test functions are used in this section. The details of the 20 benchmark functions [48, 49], including functional form, scope, optimal solution, and number of iterations, are given in Table 1. The 20 benchmark functions fall into three groups: unimodal functions as group 1, multimodal functions as group 2, and low-dimension functions as group 3. Among the unimodal functions, the global optimum of the Rosenbrock function is located in a smooth, long, and narrow parabolic valley; when a traditional gradient-based method reaches the edge of the valley, it struggles to make further global progress, since the function value changes very slowly along the long, narrow region, which makes the function a good test of algorithm performance. Among the multimodal functions, one has very many local minima; it is a typical nonlinear multimodal function with a wide search space and is generally considered a complex multimodal problem that is difficult to handle. In general, unimodal functions are suitable for evaluating exploitation, whereas multimodal functions tend to be a good choice for evaluating exploration [50].

The rest of this section is organized as follows: the experimental settings are given in Section 4.1, the 30-dimension experimental results and discussion are presented in Section 4.2, high-dimension test results (100, 1000, and 10000 dimensions) for some unimodal and multimodal functions are described in Section 4.3, and the two design problems are treated in Section 4.4.

4.1. Experimental Setting and Comparative Methods

The empirical analysis was conducted on a computer with an Intel(R) Xeon CPU at 3.5 GHz and 8 GB of memory, running Windows 7; the programs were written in Matlab 2012a.

The scope and dimension of the variables have a significant influence on the complexity of optimization. The scope of each benchmark function and the dimensions of the low-dimension functions are given in Table 1. The unimodal and multimodal functions are tested in 30, 100, 1000, and 10000 dimensions.

The performance of the proposed EOBWWO algorithm is evaluated by comparing it with five state-of-the-art metaheuristic algorithms: ABC [4], CS [5], BA [6], FPA [8], and WWO [9]; the parameter settings of these algorithms are given in Table 2.

4.2. Experiment Results and Discussion

In Tables 3 and 4 the dimension is 30; the standard benchmark functions are those listed in Table 1. All function optimization experiments are repeated 30 times to ensure statistical credibility. Four evaluation indicators are reported: max, min, median, and Std, which denote the worst fitness value, the optimal fitness value, the median of the test results, and the standard deviation, respectively. The last column of Tables 3–5 gives the rank of each algorithm among the six in terms of median value. For each benchmark function, the minimum value, the best median value, and the minimum standard deviation among the six algorithms are shown in bold.

Moreover, nonparametric Wilcoxon rank tests were conducted on the results of EOBWWO and each comparative algorithm on the 20 benchmark functions; the test results are shown in Table 6, where a value of 1 indicates that EOBWWO and the comparative method are statistically different with 95% confidence, and 0 implies that there is no statistical difference [9].
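Such a test can be reproduced with SciPy's rank-sum implementation; a minimal sketch, with the 95% threshold mirroring Table 6:

from scipy.stats import ranksums

def significantly_different(results_a, results_b, alpha=0.05):
    # Wilcoxon rank-sum test on two sets of run results (e.g., 30 runs each).
    _, p = ranksums(results_a, results_b)
    return int(p < alpha)     # 1: statistically different at 95% confidence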

As Table 3 shows, on the unimodal group EOBWWO obtained the exact solution on all functions but one and achieved the minimum standard deviation. Although EOBWWO ranks fifth on that remaining function, its standard deviation is still smaller than those of the other algorithms. This means that EOBWWO has higher calculation precision and better stability in the optimization of the unimodal functions.

On group 2, the 8 multimodal functions, Table 4 shows that EOBWWO finds the exact solution for six of the functions, and on five of these the standard deviation is zero. In addition, EOBWWO obtains the best median value on all functions. For the remaining two functions, the worst fitness value, best fitness value, median value, and standard deviation of EOBWWO are all smaller than those of the other five algorithms. The multimodal functions are more complex than the unimodal ones because of their local minima, so this analysis indicates that EOBWWO has a strong global search ability and high calculation precision.

The results on group 3, the 6 low-dimension functions, are given in Table 5. On one function EOBWWO obtains the minimum values among the comparative algorithms for the worst fitness, the best fitness, the median, and the standard deviation. On another, ABC, CS, WWO, and EOBWWO all find the exact solution, and the standard deviations of ABC and EOBWWO are zero. On a third, ABC, CS, WWO, and EOBWWO share the same median and optimal fitness value, while CS has the smallest standard deviation. On a fourth, FPA obtains the better fitness value and median, but WWO has the best standard deviation. On a fifth, the optimal fitness value, median value, and standard deviation of FPA are better than those of EOBWWO. And on the last, ABC, CS, WWO, and EOBWWO clearly obtain the exact solution, with CS having the better standard deviation. From this analysis of Table 5 we can conclude that EOBWWO also has certain advantages in dealing with low-dimension functions.

In summary, thanks to the three major optimization strategies introduced in the improvement, the calculation precision of EOBWWO is better than that of the comparative algorithms on most benchmark functions; on three functions its calculation precision is inferior to that of ABC, FPA, and FPA, respectively. Moreover, EOBWWO finds the exact solution on seven functions with standard deviations of zero, which shows its high calculation precision and strong stability, and the exact solutions it obtains on five multimodal functions demonstrate its good global search performance.

To show the performance of EOBWWO more clearly, Figures 3–22 present the convergence curves and Figures 23–42 the ANOVA tests of the global minimum for the benchmark functions in Table 1. From Figures 3–22 it is obvious that the convergence rate of EOBWWO is faster than those of the other comparison algorithms, including WWO, on several functions, and that EOBWWO obtains the exact solution on some of them. All of this indicates that EOBWWO has faster convergence and higher calculation precision than the other comparative algorithms. Figures 23–42 show the ANOVA tests of the global minimum; it is easy to see that the standard deviation of EOBWWO is much smaller on most functions and is even zero on some of them. Figures 23–42 thus imply that EOBWWO has strong stability.

4.3. High-Dimension Function Test Results

To validate the performance of EOBWWO more comprehensively, in this subsection we choose four functions, two unimodal and two multimodal, and test all six algorithms on them in 100, 1000, and 10000 dimensions. The results for the four functions are summarized in Table 7. The maximum number of iterations of each algorithm on each function is consistent with Table 1.

For the first unimodal function, Table 7 shows clearly that EOBWWO outperforms the other comparative algorithms in dimensions 100, 1000, and 10000. As the dimension increases, EOBWWO still obtains the exact solution with a standard deviation of zero. Among the five comparison algorithms, FPA is the most stable as far as the change of the median across dimensions is concerned, but BA obtains the minimum value in each dimension. From the results for the second unimodal function in Table 7, it is easy to see that although the performance of EOBWWO is not very good in dimension 30, as the dimension increases EOBWWO not only obtains the minimum value in each dimension but also has the smallest standard deviation, and its range is not very large. In addition, the change in the order of magnitude of the median of EOBWWO across dimensions is the smallest. This provides strong evidence that EOBWWO performs well on complex functions. Among the other comparative algorithms, BA performs best, followed by CS and then WWO, with ABC and FPA fourth and fifth, respectively.

For the multimodal functions, EOBWWO obtains the exact solution with a standard deviation of zero on both functions in dimensions 100, 1000, and 10000. In terms of the median value, BA performs best among the comparative algorithms; however, considering the order of magnitude of the median, the differences between the five algorithms are not obvious.

Furthermore, EOBWWO is also tested in high dimensions on five additional functions; the detailed experimental results are shown in Table 8. As Tables 7 and 8 and the above analysis show, EOBWWO can handle high-dimension functions efficiently and stably.

4.4. Structural Engineering Design Examples

Many structural design problems in the real world are constrained optimization problems that are nonlinear with complex constraints, and in some cases an exact optimal solution may not even exist. To evaluate the performance of EOBWWO further, in this subsection EOBWWO is used to solve two structural design problems: the design of a tension/compression spring and the design of a welded beam.

4.4.1. Test Problem 1: Design of a Tension/Compression Spring

The tension/compression spring design problem was first introduced by Belegundu [14] and concerns the optimal design of a tension/compression spring for minimum weight. As shown in Figure 43, the problem has three design variables: the wire diameter $x_1$, the mean coil diameter $x_2$, and the number of active coils $x_3$. The minimization of the weight is subject to constraints on minimum deflection, shear stress, surge frequency, and limits on the outside diameter [15]. A detailed description of the problem is as follows:

$$\min\; f(x) = (x_3 + 2)\, x_2\, x_1^2 \tag{15}$$

subject to

$$g_1(x) = 1 - \frac{x_2^3 x_3}{71785\, x_1^4} \le 0, \tag{16}$$

$$g_2(x) = \frac{4x_2^2 - x_1 x_2}{12566\,(x_2 x_1^3 - x_1^4)} + \frac{1}{5108\, x_1^2} - 1 \le 0, \tag{17}$$

$$g_3(x) = 1 - \frac{140.45\, x_1}{x_2^2 x_3} \le 0, \tag{18}$$

$$g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0, \tag{19}$$

where the experimental parameters are set as $0.05 \le x_1 \le 2.0$, $0.25 \le x_2 \le 1.3$, and $2.0 \le x_3 \le 15.0$ [29]. Table 9 lists the optimal solution for the compression spring design obtained by EOBWWO. The results are based on 20 independent runs, and the number of iterations of EOBWWO is 5000. When dealing with the tension/compression spring constrained optimization problem, we first determine whether the four constraints are satisfied. If all constraint conditions are satisfied, the objective is calculated according to formula (15) and compared with the original fitness value, and the better result is taken as the fitness value of the constrained optimization problem; otherwise, the original fitness value is retained and the iteration continues.

When EOBWWO changes the position of an individual in the current population (i.e., creates a new solution) and the new solution must be evaluated, the steps for handling the tension/compression spring constrained optimization problem are as follows.

Step 1. Calculate the values of the four constraint conditions (see (16)–(19)) and check whether all of them are satisfied. If all constraint conditions are satisfied, go to Step 2; otherwise, go to Step 3.

Step 2. Calculate the fitness value of the new solution by formula (15) and compare it with the original fitness value. According to the comparison result, determine whether to update the current individual. Go to Step 4.

Step 3. Keep the original individual, since the new one violates a constraint, and go to Step 4.

Step 4. Continue with the remaining operations of the algorithm.

The reason for keeping an individual that violates any of the constraint conditions is that the population undergoes the elite opposition-based learning (EOBL) strategy and the propagation operation at every iteration, so an individual that violates a constraint may come to satisfy the constraints in the next iteration. A sketch of this feasibility-based update is given below.
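A minimal sketch of Steps 1–4 for the spring problem, assuming the standard formulation of equations (15)–(19) and minimization of the weight; the names are illustrative:

def spring_constraints(x):
    d, D, N = x                   # wire diameter, coil diameter, active coils
    g1 = 1.0 - D**3 * N / (71785.0 * d**4)                        # eq. (16)
    g2 = ((4.0 * D**2 - d * D) / (12566.0 * (D * d**3 - d**4))
          + 1.0 / (5108.0 * d**2) - 1.0)                          # eq. (17)
    g3 = 1.0 - 140.45 * d / (D**2 * N)                            # eq. (18)
    g4 = (d + D) / 1.5 - 1.0                                      # eq. (19)
    return np.array([g1, g2, g3, g4])

def constrained_update(x_new, x_old, f_old):
    # Step 1: feasibility check; Step 2: accept only improving weights;
    # Step 3: otherwise keep the original individual.
    if np.all(spring_constraints(x_new) <= 0.0):
        weight = (x_new[2] + 2.0) * x_new[1] * x_new[0]**2        # eq. (15)
        if weight < f_old:
            return x_new, weight
    return x_old, f_old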

As one of the best-known design benchmark problems, this problem has been studied by many researchers. Belegundu [14] introduced it and applied eight different mathematical optimization techniques to it. Arora [15] solved it using a numerical optimization technique called constraint correction at constant cost. Table 10 summarizes the optimal results for the tension/compression spring design obtained by EOBWWO and by other researchers.

As seen from Table 10, the proposed method obtained the best overall design, with a weight of 0.012665234 at the design variables reported in Table 10, and the results obtained by EOBWWO are better than those of the comparative methods.

4.4.2. Test Problem 2: Design of a Welded Beam

As another well-known design benchmark problem, the welded beam problem aims to minimize the overall fabrication cost subject to constraints on the shear stress $\tau$, the bending stress in the beam $\sigma$, the buckling load on the bar $P_c$, the end deflection of the beam $\delta$, and side constraints. As depicted in Figure 44 [28], the problem has four design variables: the thickness of the weld $x_1$, the length of the welded joint $x_2$, the width of the beam $x_3$, and the thickness of the beam $x_4$.

The problem can be formulated as follows:

$$\min\; f(x) = 1.10471\, x_1^2 x_2 + 0.04811\, x_3 x_4\,(14.0 + x_2) \tag{20}$$

subject to

$$g_1(x) = \tau(x) - \tau_{\max} \le 0, \quad
g_2(x) = \sigma(x) - \sigma_{\max} \le 0, \quad
g_3(x) = x_1 - x_4 \le 0,$$

$$g_4(x) = 0.10471\, x_1^2 + 0.04811\, x_3 x_4\,(14.0 + x_2) - 5.0 \le 0, \quad
g_5(x) = 0.125 - x_1 \le 0,$$

$$g_6(x) = \delta(x) - \delta_{\max} \le 0, \quad
g_7(x) = P - P_c(x) \le 0,$$

where

$$\tau(x) = \sqrt{(\tau')^2 + 2\tau'\tau''\,\frac{x_2}{2R} + (\tau'')^2}, \quad
\tau' = \frac{P}{\sqrt{2}\, x_1 x_2}, \quad
\tau'' = \frac{MR}{J}, \quad
M = P\left(L + \frac{x_2}{2}\right),$$

$$R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}, \quad
J = 2\left\{\sqrt{2}\, x_1 x_2\left[\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\},$$

$$\sigma(x) = \frac{6PL}{x_4 x_3^2}, \quad
\delta(x) = \frac{4PL^3}{E x_3^3 x_4}, \quad
P_c(x) = \frac{4.013\,E\sqrt{x_3^2 x_4^6/36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right),$$

with $P = 6000$ lb, $L = 14$ in., $E = 30 \times 10^6$ psi, $G = 12 \times 10^6$ psi, $\tau_{\max} = 13{,}600$ psi, $\sigma_{\max} = 30{,}000$ psi, and $\delta_{\max} = 0.25$ in.
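The formulation translates directly into code; a sketch of the cost and constraint evaluation under the standard parameter values above (not the authors' program):

P, L_B, E, G = 6000.0, 14.0, 30.0e6, 12.0e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam(x):
    h, l, t, b = x
    cost = 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)      # eq. (20)
    tau_p = P / (np.sqrt(2.0) * h * l)
    M = P * (L_B + l / 2.0)
    R = np.sqrt(l**2 / 4.0 + ((h + t) / 2.0)**2)
    J = 2.0 * np.sqrt(2.0) * h * l * (l**2 / 12.0 + ((h + t) / 2.0)**2)
    tau_pp = M * R / J
    tau = np.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * l / (2.0 * R) + tau_pp**2)
    sigma = 6.0 * P * L_B / (b * t**2)
    delta = 4.0 * P * L_B**3 / (E * t**3 * b)
    p_c = (4.013 * E * np.sqrt(t**2 * b**6 / 36.0) / L_B**2
           * (1.0 - t / (2.0 * L_B) * np.sqrt(E / (4.0 * G))))
    g = np.array([tau - TAU_MAX, sigma - SIGMA_MAX, h - b,
                  0.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0,
                  0.125 - h, delta - DELTA_MAX, P - p_c])
    return cost, g               # the design is feasible iff all g <= 0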

The EOBWWO algorithm was run to find the minimum fabrication cost of this design problem, where the ranges of the four design variables are set as $0.1 \le x_1 \le 2.0$, $0.1 \le x_2 \le 10.0$, $0.1 \le x_3 \le 10.0$, and $0.1 \le x_4 \le 2.0$ [30]. As before, in searching for the minimum overall fabrication cost, we first determine whether the seven constraint conditions are satisfied when calculating the fitness value; if they are all satisfied, formula (20) is used to calculate the fitness value. In each run, EOBWWO uses 5000 function evaluations to locate the optimal solution. The experimental results for the welded beam design obtained by EOBWWO are listed in Table 11. When the position of an individual in the current population changes and the new solution must be evaluated, the steps for handling the welded beam problem with EOBWWO are the same as those used for the tension/compression spring problem.

The results are compared with those of other optimization algorithms reported in the literature in Table 12. It can be seen clearly from Table 12 that the proposed EOBWWO algorithm is much better than the other algorithms on the welded beam design, the optimal solution obtained by EOBWWO being 1.69634711 at the design variables listed in Table 12.

5. Conclusions

In this paper, three strategies are added to the original WWO algorithm to further improve its convergence speed and calculation precision for function optimization and structural engineering design problems. The elite opposition-based (EOB) learning strategy enhances the global exploration capability by increasing population diversity. The local neighborhood search strategy enhances the local exploitation capability by strengthening the local search around the promising optimal solution. In addition, the improved propagation operator gives the algorithm a better balance between exploration and exploitation. With these three strategies, EOBWWO can handle function optimization, including multimodal functions, as well as structural design problems. The results on 20 benchmark functions and two structural design problems in Section 4 demonstrate that EOBWWO outperforms the comparative algorithms on most benchmark functions and outperforms other solution methods on the two structural design problems, significantly improving the convergence speed and calculation precision of the original WWO algorithm. Several important issues remain for future research on EOBWWO. On the one hand, structural design problems are not only widespread in the real world but also generally nonlinear and constrained, so other design problems could be solved with EOBWWO in future research, such as the multidimensional knapsack problem, the permutation flow shop scheduling problem [51], and the graph coloring problem. On the other hand, further improvements could be introduced into EOBWWO and WWO to strengthen their ability to deal with related problems: a more elaborate setting of parameters such as the breaking coefficient $\beta$ and the wavelength reduction coefficient $\alpha$, multiple-population strategies, and combinations with other optimization algorithms are all good choices. In addition, multiobjective optimization problems will also be a focus of future research.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

This work was supported by the National Science Foundation of China under Grants nos. 61463007 and 61563008 and the Guangxi Natural Science Foundation under Grant no. 2016GXNSFAA380264.