Abstract

To improve the seeker optimization algorithm (SOA), an elastic collision seeker optimization algorithm (ECSOA) is proposed. The ECSOA evolves some individuals in three situations: the completely elastic collision, the completely inelastic collision, and the non-completely elastic collision. These strategies enhance the individuals’ diversity and avert falling into the local optimum. The ECSOA is compared with the particle swarm optimization (PSO), the simulated annealing and genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the seeker optimization algorithm (SOA) on fifteen benchmark functions, four PID control parameter models, and six constrained engineering optimization problems. The experimental results show that the ECSOA is effective in the benchmark functions, the PID control parameter optimization, and the constrained engineering optimization problems, and that its optimization ability and robustness are better than those of the compared algorithms.

1. Introduction

In recent years, heuristic algorithms have received a great deal of attention. Such algorithms apply randomized search methods to many optimization problems. According to the “no free lunch” (NFL) theorem, no single optimization algorithm can solve all optimization problems [1]. Therefore, researchers propose new algorithms or enhance existing ones to deal with optimization problems. Existing algorithms include the genetic algorithm (GA) [2], the particle swarm optimization (PSO) [3], the simulated annealing (SA) [4], the harmony search (HS) [5], the gravitational search algorithm (GSA) [6], the moth-flame optimization (MFO) [7], the sine cosine algorithm (SCA) [8], the multiverse optimizer (MVO) [9], the seeker optimization algorithm (SOA) [10], the artificial bee colony (ABC) algorithm [11], the krill herd (KH) [12], the monarch butterfly optimization (MBO) [13], the elephant herding optimization (EHO) [14], the moth search (MS) algorithm [15], the slime mould algorithm (SMA) [16], and the Harris hawks optimization (HHO) [17].

However, some optimization algorithms are still not very successful on optimization problems: typical issues are low optimization precision, premature convergence, entrapment in local optima, slow convergence speed, and insufficient robustness. To overcome these issues, a number of improved algorithms have proven to be feasible optimizers and have been used in practical engineering. For instance, the Harris hawks optimization algorithm, salp swarm algorithm, grasshopper optimization algorithm, and dragonfly algorithm are used for the structural design optimization of vehicle components [18]. An adaptive inertia weight factor in the traditional PSO optimizes path planning [19]. A PSO based on Gaussian and quantum behavior optimizes constrained engineering problems [20]. Least squares support vector machines based on Gaussian kernels have been proposed [21]. A Levy flights discrete bat algorithm is adopted to solve the Euclidean traveling salesman problem [22]. The cuckoo optimization algorithm in reverse logistics is used to design a network for COVID-19 waste management [23]. A chaotic cuckoo optimization algorithm based on Levy flight, backward learning, and an interference operator is used to classify the optimal feature subspace [24]. An elite symbiotic organisms search algorithm with a mutually beneficial factor is adopted to optimize functions [25]. An artificial bee colony with dynamic Cauchy mutation is adopted for feature selection [26]. A new elastic collision optimization algorithm is applied in sensor cloud resource scheduling [27].

Dai et al. proposed the SOA in 2006 [28]; its goal is to mimic the behavior of human seekers and the way they exchange information in order to solve practical optimization problems. In the recent decade, the SOA has been used in many fields, such as unconstrained optimization problems [29], optimal reactive power dispatch [30], challenging benchmark problems [31], digital filter design [32], optimizing parameters of artificial neural networks [33], optimizing models and structures of fuel cells [34], a novel human group optimizer algorithm [35], and several practical applications [36]. The SOA converges quickly in the initial stage of optimization; however, once all individuals approach the best individual, the population loses diversity and falls into premature convergence.

In this article, we propose an elastic collision seeker optimization algorithm (ECSOA), which evolves some individuals in three situations: the completely elastic collision, the completely inelastic collision, and the non-completely elastic collision. These strategies enhance the individuals’ diversity and avert premature convergence. The ECSOA is first compared with six other improved SOAs, based on changed algorithm parameters, adaptive transformation of empirical value parameters, Levy motion of some individuals, refraction reverse learning, a mutually beneficial factor, and the Cauchy mutation, on fifteen benchmark functions. According to the experimental results, the convergence speed and accuracy of the ECSOA are higher. The improved strategy enables the SOA to maintain the individuals’ diversity, avert falling into the local optimum, and make up for the SOA’s tendency toward prematurity. Finally, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA on a well-known set of fifteen benchmark functions, four PID control parameter optimization models, and six constrained engineering optimization problems taken from the literature. According to the experimental results, the ECSOA is feasible in the benchmark functions, the PID parameter optimization problems, and the constrained engineering optimization problems; it finds better values, successfully overcomes the SOA’s tendency to converge prematurely to local optima, and shows better optimization performance and robustness than the original SOA. The advantages of the ECSOA are summed up as follows:
(1) An ECSOA is proposed to enhance the precision and robustness of the optimization process.
(2) The elastic collision strategies, namely, the completely elastic collision, the completely inelastic collision, and the non-completely elastic collision, improve the diversity of individuals, enhance local search, and avert premature convergence.

The rest of the article structure is as follows. Section 2 presents the SOA and the algorithm improvement strategies. Section 3 describes the ECSOA. Section 4 shows the algorithm optimization experiments, the results, and the analyses. Lastly, Section 5 gives some conclusions.

2. Basic SOA and Algorithm Improvement Strategies

The SOA carries out an in-depth search by mimicking human search behavior. It treats optimization as a search for an optimal solution by a search team in the search space, taking the search team as the population and the site of each searcher as a candidate solution. An “experience gradient” determines the search direction, and uncertainty reasoning determines the search step size; together, the search direction and step size update the searchers’ positions in the search space and drive the optimization.

2.1. Key Update Points for SOA

The SOA has three main updating steps.

2.1.1. Search Direction

The forward direction of the search is defined by the experience gradient obtained from the individual’s own movement and from evaluating the historical positions of other individuals. From these, the egoistic direction, the altruistic direction, and the preemptive direction of the ith individual in any dimension can be obtained.

The searcher uses a random weighted average of these directions to obtain the search direction, where t1, t2 ∈ {t, t − 1, t − 2}; $\vec{x}_i(t_1)$ and $\vec{x}_i(t_2)$ are the better of the corresponding historical positions, respectively; $\vec{g}_{i,\text{best}}$ is the historical optimal position in the neighborhood where the ith search factor is located; $\vec{p}_{i,\text{best}}$ is the optimal position found by the ith search factor up to the current moment; ψ1 and ψ2 are random numbers in [0, 1]; and ω is the inertia weight.
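For reference, the egoistic, altruistic, and preemptive directions and the combined search direction of the standard SOA take the following form (a reconstruction from the SOA literature [10, 28], not a verbatim copy of this paper’s equations (1) and (2)):

$$\vec{d}_{i,\text{ego}}(t) = \operatorname{sign}\bigl(\vec{p}_{i,\text{best}}(t) - \vec{x}_i(t)\bigr),\qquad \vec{d}_{i,\text{alt}}(t) = \operatorname{sign}\bigl(\vec{g}_{i,\text{best}}(t) - \vec{x}_i(t)\bigr),\qquad \vec{d}_{i,\text{pro}}(t) = \operatorname{sign}\bigl(\vec{x}_i(t_1) - \vec{x}_i(t_2)\bigr),$$

$$\vec{d}_i(t) = \operatorname{sign}\bigl(\omega\,\vec{d}_{i,\text{pro}}(t) + \psi_1\,\vec{d}_{i,\text{ego}}(t) + \psi_2\,\vec{d}_{i,\text{alt}}(t)\bigr).$$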

2.1.2. Search Step Size

The SOA uses fuzzy approximate reasoning: through computer language, it describes part of human natural language and thereby simulates human intelligent reasoning in the search behavior. A simple fuzzy rule is expressed to approximate the objective optimization problem: a larger fitness corresponds to a larger search step length, and a smaller fitness corresponds to a smaller search step length. The Gaussian distribution function is adopted to describe the search step measurement, where α and δ are parameters of the membership function.

According to (3), the probability of the output variable exceeding [−3δ, 3δ] is less than 0.0111; therefore, µmin = 0.0111. Under normal circumstances, the optimal position of an individual has µmax = 1.0, and the worst position has µ = 0.0111. However, to accelerate the convergence speed and give the optimal individual an uncertain step size, µmax is set to 0.9 in this paper. The following function is selected as the fuzzy variable with a “small” objective function value, where µij is determined by (4) and (5), Ii is the rank of the current individual xi(t) when the population is sorted from high to low function value, and the function rand(µi, 1) returns a uniformly distributed real number in the interval [µi, 1].

It can be seen that (4) simulates the random search behavior of human beings. The step measurement in the j-dimensional search space is determined by the following equation, where δij is a parameter of the Gaussian distribution function defined by (7), and ω is the inertia weight, which decreases linearly from 0.9 to 0.1 as the number of generations increases. $\vec{x}_{\min}$ and $\vec{x}_{\max}$ are, respectively, the positions with the minimum and maximum function values in the same population.
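A reconstruction of the step-size relations (4)–(7) from the SOA literature [10, 28], where s is the population size and the Gaussian membership function of (3) is $\mu(\alpha) = e^{-\alpha^2/(2\delta^2)}$:

$$\mu_i = \mu_{\max} - \frac{s - I_i}{s - 1}\,(\mu_{\max} - \mu_{\min}), \qquad \mu_{ij} = \operatorname{rand}(\mu_i, 1),$$

$$\alpha_{ij} = \delta_{ij}\,\sqrt{-\ln(\mu_{ij})}, \qquad \delta_{ij} = \omega\,\bigl|\vec{x}_{\min} - \vec{x}_{\max}\bigr|.$$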

2.1.3. Individual Location Updates

After obtaining the search direction and search step measurement of the individual, the location update is represented by (8), where i is the ith searcher individual; j represents the individual dimension; fij(t) and αij(t), respectively, represent the searcher’s search direction and search step size at time t; and xij(t) and xij(t + 1), respectively, represent the searcher’s position at times t and (t + 1).
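The location update itself is usually written as follows (a reconstruction of (8) from the SOA literature [10, 28]):

$$x_{ij}(t+1) = x_{ij}(t) + \alpha_{ij}(t)\,f_{ij}(t).$$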

2.2. Algorithm Improvement Strategies

Six strategies for improving the algorithm are listed in this paper.

2.2.1. Dynamic Adaptive Gaussian Variation of Empirical Parameters

In this strategy, (8) in the SOA is changed to (10), and the empirical value C1 is changed to an adaptive empirical value that varies between 0.1 and 0.5 with the number of generations according to (11). The individual position update remains the same as (9), where i represents the ith individual, j represents the individual dimension, δij is a parameter of the Gaussian membership function [20, 21], t is the current generation, itermax represents the maximum number of generations, and d represents the dimension of the optimized object.

2.2.2. The Levy Movement

A Levy movement [22, 24] is a random search path alternating between short walks and occasionally long walks following the Levy distribution. The position update equation of the Levy motion is as follows, where i represents the ith individual and j represents the individual dimension; Γ is the gamma function, with Γ(β) = (β − 1)! for integer β; t is the current generation; d is the dimension of the optimized object; r1, r2 ∈ rand(0, 1); and β is a real constant, set to 1.5 in this paper. After evaluating the fitness of the newly generated position vector in (14), the better individual is retained.
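Levy-distributed steps with β = 1.5 are commonly drawn with Mantegna’s algorithm; the sketch below is illustrative only, since the paper’s Levy update equation is not reproduced above, so the 0.01 scaling and the multiplicative perturbation of the position are assumptions, not the authors’ formula.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(beta=1.5, size=1):
    """Draw Levy-distributed steps via Mantegna's algorithm."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, size)  # heavy-tailed numerator
    v = np.random.normal(0.0, 1.0, size)    # Gaussian denominator
    return u / np.abs(v) ** (1 / beta)

# Illustrative perturbation of one individual (the 0.01 scale is an assumption):
x = np.random.uniform(-100, 100, 100)       # current position, d = 100
x_new = x + 0.01 * levy_step(size=x.size) * x
```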

2.2.3. The Refraction Reverse Learning

Let the projection of the refraction point on the x-axis be x*; then x* represents the reverse solution of individual x based on the refraction principle [23, 24]. The boundary point value in the refraction reverse learning is (a + b)/2 of the search interval [a, b]. As shown in Figure 1, the calculations of sin α and sin β are given in (13) and (14).

According to (13) and (14), we can obtain (15).

Assuming k = h/h*, we can write (15) as (16); then (17), with n = 1 and k = 1, can be simplified to (18).

When this is applied to the SOA, the probability of mutation is 0.8. The individual positions undergo refraction reverse learning according to (19) to obtain the new individual positions; in the formula, i is the ith individual and j is the individual dimension. After evaluating the fitness of the newly generated position vector in (19), the better individual is retained.
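A reconstruction of relations (13)–(18) from the refraction-based opposition learning literature [23, 24], for a solution x in the interval [a, b] with refraction heights h and h* and refraction index n = sin α / sin β:

$$\sin\alpha = \frac{(a+b)/2 - x}{h}, \qquad \sin\beta = \frac{x^{*} - (a+b)/2}{h^{*}},$$

$$n = \frac{\sin\alpha}{\sin\beta} = \frac{(a+b)/2 - x}{k\,\bigl(x^{*} - (a+b)/2\bigr)} \;\Longrightarrow\; x^{*} = \frac{a+b}{2} + \frac{a+b}{2kn} - \frac{x}{kn},$$

with $k = h/h^{*}$; for $n = 1$ and $k = 1$, this reduces to the classical opposite point $x^{*} = a + b - x$.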

2.2.4. The Mutually Beneficial Factor

An individual xh is randomly selected, and xm is determined by (20) in order to determine the mutually beneficial factor C [25], where i represents the ith individual, j represents the individual dimension, ψ represents a random number in (0, 1), xgbest represents the j-dimensional component of the current optimal position of the entire population, C is the mutually beneficial factor, and R is the benefit parameter, randomly chosen as 1 or 2. After evaluating the fitness of the newly generated position vector in (21), the better individual is retained.

2.2.5. The Cauchy Variation

In this paper, the Cauchy inverse function can mutate the population with a certain probability. The Cauchy inverse function [26] is shown in (22). Referring to (22), the new position of the individual can be written as (23); that is, the new position of the individual is obtained by the Cauchy mutation, where F−1 is the Cauchy inverse function and r1 and r2 are random values within [0, 1]. After evaluating the fitness of the newly generated position vector in (23), the better individual is retained.
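The Cauchy inverse CDF is $F^{-1}(p;\,x_0,\gamma) = x_0 + \gamma\tan\bigl(\pi(p - \tfrac12)\bigr)$; a minimal sketch of a Cauchy mutation built on it follows. The additive form and the scale are assumptions, since the paper’s equation (23) is not reproduced above.

```python
import numpy as np

def cauchy_mutation(x, scale=1.0):
    """Perturb a position vector with standard Cauchy noise drawn through
    the inverse CDF F^{-1}(p) = x0 + gamma * tan(pi * (p - 1/2)).
    The additive update and the scale are assumptions for illustration."""
    r = np.random.uniform(0.0, 1.0, np.shape(x))
    return x + scale * np.tan(np.pi * (r - 0.5))

x = np.random.uniform(-100, 100, 100)  # current position, d = 100
x_new = cauchy_mutation(x)             # keep the better of x and x_new by fitness
```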

2.2.6. Elastic Collision Variation

Consider an individual xij (xij is a solution distributed in the solution space of the optimization problem and can be abstractly represented as a unit-mass object at a certain position in the space), and let δ = {x′} (x′ ∈ P(t) ∧ x′ ≠ xij). Suppose xij and x′ move toward each other at the velocities f(xij) and f(x′), respectively, and collide after ∆t; after the collision, xij reaches the new position xij,new. The derivation is as follows. The completely elastic (CE) collision obeys the laws of conservation of momentum and energy [27].
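These conservation laws take the standard form (a physics identity; mapping them onto the paper’s equations is a reconstruction), with unit masses and pre-collision velocities $v_1 = f(x_{ij})$ and $v_2 = f(x')$:

$$m_1 v_1 + m_2 v_2 = m_1 v_1' + m_2 v_2', \qquad \tfrac12 m_1 v_1^2 + \tfrac12 m_2 v_2^2 = \tfrac12 m_1 v_1'^2 + \tfrac12 m_2 v_2'^2.$$

For equal masses, the completely elastic collision exchanges the velocities: $v_1' = v_2$ and $v_2' = v_1$.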

Similarly, analogous relations hold for the completely inelastic (CI) collision and the non-completely elastic (NCE) collision.
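Under the same unit-mass assumption, the post-collision velocities are (a reconstruction; the paper’s exact equations are not reproduced above):

$$v' = \frac{v_1 + v_2}{2}\ \ (\text{CI}), \qquad v_1' = \frac{(1-e)\,v_1 + (1+e)\,v_2}{2}\ \ (\text{NCE}),$$

where $e \in (0, 1)$ is the coefficient of restitution.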

The individual updating mechanism is as follows, where i represents the ith individual; j represents the individual dimension; ε ∈ (0, 1) and ξ ∈ (0, 0.5) are update coefficients; Gij(t) is the historical optimal solution of xij(t); Bij(t) is the optimal solution of the population; and r1 ∈ (0, 1), r2 ∈ (0, 1), and α ∈ (0, 0.5) are random numbers. After evaluating the fitness of the newly generated position vector in (27), the better individual is retained.

3. ECSOA

The ECSOA evolves some individuals through the CE collision, the CI collision, and the NCE collision to improve the diversity of individuals and enhance local search. Algorithm 1 shows the primary process of the ECSOA.

(1)t = 0
(2)Parameter initialization.
(3)Population initialization. Generate an initial population.
(4)Evaluate each seeker. Compute the fitness. Determine the optimal solution Pbest,G.
(5)While the stopping condition is not satisfied.
 (5.1)Running process of the ECSOA
  (1)The search direction of the searcher is generated according to (2)
  (2)The search step size is generated according to (6)
  (3)Generate a new position xECSOA,G according to (9); the range of xECSOA,G is checked and clipped to (xmin, xmax).
  (4)Calculate the fitness and judge the optimal solution.
   if f (xECSOA,G) ≤ Pbest,G
    Pbest,G = f (xECSOA,G)
   end if
 (5.2)The elastic collision variation
  (1)if rand < Pm, the elastic collision variation is carried out on some new positions according to (26) to obtain a new xECSOA,G, and the range of xECSOA,G is checked and clipped to (xmin, xmax).
  (Other improvement strategies, such as the empirical value parameter adaptive transformation formula (10), the refraction reverse learning formula (14), the Levy variation formula (19), the introduction of mutually beneficial factor formula (21), and the Cauchy variation formula (23), were updated according to the corresponding formula.)
  (2)Calculate the fitness and judge the optimal solution.
   if f (xECSOA,G) ≤ Pbest,G
    Pbest,G = f (xECSOA,G)
   end if
  end if
(6)t = t+1
(7)if t < Tmax, then go to step 5; else stop.
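To make the flow of Algorithm 1 concrete, here is a minimal, runnable Python sketch. The three operators (search direction, step size, and elastic collision) are simplified stand-ins for the paper’s equations (2), (6), and (26), whose exact forms are not reproduced above, so this illustrates the loop structure rather than the authors’ implementation.

```python
import numpy as np

def ecsoa(fitness, dim, n=30, t_max=1000, lo=-100.0, hi=100.0, p_m=0.8):
    """Sketch of the ECSOA main loop (Algorithm 1); the operators below are
    stand-ins for the paper's equations (2), (6), and (26)."""
    x = np.random.uniform(lo, hi, (n, dim))            # population initialization
    fit = np.array([fitness(v) for v in x])
    best, best_fit = x[fit.argmin()].copy(), fit.min()

    for t in range(t_max):
        w = 0.9 - 0.8 * t / t_max                      # inertia weight, 0.9 -> 0.1
        # Search direction (stand-in for eq. (2)): weighted sign direction.
        d = np.sign(w * np.random.uniform(-1, 1, x.shape)
                    + np.random.rand() * (best - x))
        # Step size (stand-in for eq. (6)): Gaussian-membership-style step.
        mu = np.random.uniform(0.0111, 0.9, x.shape)
        alpha = w * (hi - lo) * 0.01 * np.sqrt(-np.log(mu))
        x = np.clip(x + alpha * d, lo, hi)             # position update + bound check
        # Elastic collision variation (stand-in for eq. (26)).
        for i in range(n):
            if np.random.rand() < p_m:
                j = np.random.randint(n)               # random collision partner
                r = np.random.rand(dim)
                trial = np.clip(r * x[i] + (1 - r) * x[j]
                                + np.random.uniform(0, 0.5) * (best - x[i]), lo, hi)
                if fitness(trial) < fitness(x[i]):     # greedy replacement
                    x[i] = trial
        fit = np.array([fitness(v) for v in x])
        if fit.min() < best_fit:                       # update global best
            best_fit, best = fit.min(), x[fit.argmin()].copy()
    return best, best_fit

# Usage: minimize the sphere function in 100 dimensions.
sol, val = ecsoa(lambda v: float(np.sum(v * v)), dim=100)
```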

4. Experimental Results

4.1. Experimental Setup

The algorithms in this paper were run in MATLAB R2016a. The computer was configured with an Intel® Core™ i7-7500U CPU @ 2.70 GHz (2.90 GHz) processor and 8 GB of memory, running the Windows 10 operating system.

4.2. Algorithm Performance Comparison in Benchmark Functions

To ensure a fair comparison of these algorithms, the population size of each algorithm is 30, and the number of generations is 1000. At the same time, to further ensure fairness and reduce the effect of randomness, the results of the seven algorithms over 30 independent runs were selected for comparison.

4.2.1. Benchmark Functions

In this field, it is common to evaluate the capability of algorithms on mathematical functions whose global optima are known. Fifteen benchmark functions from the literature are used as the comparative test platform [7, 10, 37–39]. Table 1 shows the functions used in the experiment. The dimension of the variables is set to one hundred.

4.2.2. Performance Comparison of SOA with Different Improvement Methods

In this paper, the SOA is improved by seven different methods: the parameter changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on Levy variation (LVSOA), the SOA based on refraction reverse learning mechanism (RRLSOA), the SOA based on mutually beneficial factor strategy (MBFSOA), the SOA based on Cauchy variation (CVSOA), and the elastic collision seeker optimization algorithm (ECSOA).

(1) Parameter Setting of SOA with Different Improvement Methods. This section introduces the parameter settings of the improved SOAs used in the experiments in this paper. Dai et al. have done a lot of research on the parameter settings of the SOA [32], and we carried out extensive practical tests and comparative studies of the parameters. The specific parameters of the improved SOAs are shown in Table 2. In the next section, we will use these improved algorithms for experimental comparison and choose a relatively optimal improved algorithm to compare with other advanced intelligent algorithms.

(2) Improved Algorithms’ Performance Comparison in Benchmark Functions. The SOA is improved in seven different ways: the SOA based on parameter change (PCSOA), the SOA based on parameter adaptive Gaussian transform (PAGTSOA), the SOA based on Levy variation (LVSOA), the SOA based on the refraction reverse learning mechanism (RRLSOA), the SOA based on the mutually beneficial factor strategy (MBFSOA), the SOA based on Cauchy variation (CVSOA), and the elastic collision seeker optimization algorithm (ECSOA). To test their performance, each improved algorithm was applied to the fifteen functions in Table 1, and each algorithm was run on each function independently 30 times. The performance of the SOA and the seven improved SOAs on the fifteen functions was compared in terms of the mean (Mean), standard deviation (Std.), best fitness (Best), program running time (Time), and best fitness rank (Rank) of the 30 runs. The best fitness reflects the optimization accuracy of an algorithm, the mean and standard deviation reflect its robustness, and the running time reflects the cost of the program. The results for functions f1–f15 are displayed in Table 3. Boldface indicates the best result.

Based on Table 3, for the benchmark functions f1–f15, the comparison between the seven improved SOAs in this paper and the original SOA shows that the ECSOA achieves the best optimization results. The mean (Mean), standard deviation (Std.), best fitness (Best), and best fitness rank (Rank) of the ECSOA were the best over 30 independent runs. In the total program running time (Time) for f1–f15, the ECSOA ranks fourth among the eight algorithms compared in this paper: its running time is longer than that of the SOA, PCSOA, and PAGTSOA and shorter than that of the LVSOA, RRLSOA, MBFSOA, and CVSOA. From the perspective of optimization accuracy and robustness, the ECSOA has the best optimization performance among the improved SOAs in this paper. Section 4.2.3 compares the ECSOA with other intelligent optimization algorithms that are widely used at present.

4.2.3. Performance Comparison of Different Algorithms in Benchmark Functions

To test the performance of the ECSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, using the fifteen benchmark functions [7, 10, 37–39] in Table 1, which have been widely used in testing.

(1) The Parameter Setting of Different Algorithms. In this section, the parameter settings of the PSO [40], SA_GA [41], GSA [6], SCA [8], MVO [9], SOA [28], and ECSOA are presented. Following [6, 8, 9, 28, 40, 41], we carried out extensive practical tests and comparative studies for the parameter settings. Table 4 shows the parameter settings of the different algorithms.

(2) The Results Comparison of Different Algorithms in Benchmark Functions. The mean values, standard deviations, best fitness values, and best fitness value ranks of the algorithms over 30 independent runs on functions f1–f15 are shown in Table 5. Boldface indicates the best result.

Based on Table 5, for the best value, the standard deviation, and the mean of the benchmark functions, the ECSOA is better than the others. According to the optimal fitness value mean rank and the overall rank results in Table 5, the ECSOA has strong optimization ability and strong robustness for the benchmark functions.

Figure 2 shows the fitness curves of the best values for the benchmark functions f1-f15 (D = 100). As seen from Figure 2, the convergence of the ECSOA is faster, and the precision of the ECSOA is better.

Figure 3 shows the ANOVA tests for the benchmark functions f1–f15 (D = 100). As seen from Figure 3, the ECSOA shows better robustness and clearly improves on the SOA. Therefore, the ECSOA is a feasible solution for the optimization of benchmark functions.

4.2.4. Complexity Analysis

The computational complexity of the SOA is O(NDM), where N represents the total number of individuals, D represents the number of dimensions, and M represents the maximum number of generations. The computational complexity of the first phase of the ECSOA (the SOA stage) is O(NDM). The elastic collision strategy introduces an additional O(NDM) cost. Therefore, the overall complexity of the ECSOA is O(NDM + NDM). Based on the Big-O notation [42], if the number of generations is high (M ≫ N, D), the computational complexity is O(NDM). Therefore, the overall computational complexity of the ECSOA is almost the same as that of the basic SOA.

4.2.5. Statistical Testing of Algorithms in Benchmark Functions

Using Wilcoxon’s rank-sum test [43], we can detect significant differences between two algorithms. This test gives the p value.
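A minimal example of Wilcoxon’s rank-sum test with SciPy, comparing two algorithms’ best-fitness samples over 30 independent runs; the data below are synthetic placeholders, whereas the paper’s actual samples come from its experiments.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
ecsoa_runs = rng.normal(1e-8, 1e-9, 30)  # hypothetical ECSOA results (30 runs)
soa_runs = rng.normal(1e-3, 1e-4, 30)    # hypothetical SOA results (30 runs)

stat, p_value = ranksums(ecsoa_runs, soa_runs)
print(f"p = {p_value:.3e}")              # p < 0.05 -> significant difference
```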

Table 6 shows the results of the statistical testing; N/A denotes the best algorithm (which cannot be compared with itself). From Table 6, the ECSOA is suitable for all fifteen functions. Therefore, the ECSOA is better than the other algorithms.

4.2.6. Run Time Comparison of Algorithms in Benchmark Functions

In this subsection, the running time of the algorithms on each function is recorded under the same conditions: a population size of 30, 1000 generations, and 30 independent runs of the fifteen benchmark functions f1–f15 (D = 100). The running times of the fifteen functions are then summed to obtain, for each algorithm, the total of the 30 independent running times over the fifteen functions, together with the ranking of the total time, as shown in Table 7. As seen from Table 7, the SCA has the shortest program running time, followed by the PSO algorithm. The ECSOA ranks fifth, with a relatively long program running time. At the bottom of the list is the SA_GA, which takes the most running time.

To learn more about the program running times of the seven algorithms on the fifteen functions, Figure 4 shows a bar chart of the total time of each algorithm after 30 independent runs. From Figure 4, regarding the running time, the ECSOA takes less time than the SA_GA and GSA; the SCA takes the least and the SA_GA the most; the ECSOA takes less than one-sixth of the SA_GA’s time but nearly four times the SCA’s, which is relatively large.

4.2.7. Performance Profiles of Algorithms in Benchmark Functions

The average fitness was selected as the capability index. The algorithmic capability is expressed in performance profiles, which are calculated by the following formulas, where g represents an algorithm; G is the set of algorithms; f denotes a function; F represents the set of functions; ng represents the number of algorithms in the experiment; nf is the number of functions in the experiment; µf,g is the average fitness obtained by algorithm g on function f; rf,g is the capability ratio; ρg is the algorithmic capability; and τ is a factor of the best probability [44].
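A reconstruction of (28) and (29) following the performance profiles of Dolan and Moré [44]: the capability ratio compares each algorithm’s average fitness with the best average fitness on that function, and ρg(τ) is the fraction of functions on which algorithm g is within a factor τ of the best.

$$r_{f,g} = \frac{\mu_{f,g}}{\min\{\mu_{f,g} : g \in G\}}, \qquad \rho_g(\tau) = \frac{1}{n_f}\,\Bigl|\bigl\{f \in F : r_{f,g} \le \tau\bigr\}\Bigr|.$$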

Figure 5 shows the capability ratios of the average value for the seven algorithms on the benchmark functions f1–f15 (D = 100). The results are shown on a log2 scale. As shown in Figure 5, the ECSOA has the highest probability. When τ = 1, the ECSOA is about 0.8, which is better than the others. When τ = 8, the ECSOA is the winner on the given test functions: the ECSOA is 1, the PSO is 0.67, the SA_GA is 0.067, the SCA is 0.3, the GSA is 0.3, the MVO is 0.2, and the SOA is 0.33. Regarding the performance curve, the ECSOA is the best; the ECSOA reaches 100% when τ ≥ 8. Thus, the performance of the ECSOA is better than that of the other algorithms.

4.3. Algorithm Performance Comparison in PID Controller Parameter Optimization Problems

In this subsection, we use four control system PID parameter optimization models to test the capability of the ECSOA. For g1–g3, the population size of all algorithms is 20 and the maximum number of generations is 20; the step response time is set to 10 s for g1 and g2 and to 30 s for g3. For g4, the population size of all algorithms is 50, the maximum number of generations is 50, and the step response time is set to 50 s.

4.3.1. Control System Models

Equations (30)–(33) show the test control system models for PID parameter optimization used in our experiment. Figure 6 shows the process diagram for optimizing the test control system PID parameters by the ECSOA. Figure 7 shows the PID parameter optimization model structure of the control system.
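As a sketch of the fitness evaluation in Figure 6, the snippet below simulates the unit step response of a PID-controlled plant and scores it with the ITAE criterion. Both the plant and the ITAE choice are assumptions, since models (30)–(33) and the paper’s exact fitness function are not reproduced above.

```python
import numpy as np
import control  # python-control package

def pid_itae(params, plant, t_end=10.0):
    """ITAE fitness for PID parameters (kp, ki, kd) on a given plant."""
    kp, ki, kd = params
    pid = control.tf([kd, kp, ki], [1, 0])          # kd*s + kp + ki/s
    closed = control.feedback(pid * plant, 1)       # unity-feedback loop
    t = np.linspace(0.0, t_end, 1000)
    t, y = control.step_response(closed, t)
    return float(np.trapz(t * np.abs(1.0 - y), t))  # integral of t*|e(t)|

# Hypothetical second-order plant without time delay (an assumption):
g = control.tf([1], [1, 3, 2])
print(pid_itae((2.0, 1.0, 0.5), g))
```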

4.3.2. Results Comparison of Algorithms in the PID Controller Parameter Optimization

To test the capability of the ECSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA on the PID controller parameter optimization. The mean values, standard deviation values, best fitness values, and best fitness value ranks of the algorithms over 30 independent runs for g1–g4 are displayed in Table 8. Boldface indicates the best result.

For the PID controller parameter optimization problems, according to Table 8, with regard to the best fitness, the ECSOA is better than the others except on g3 and g4: the optimal fitness value of the ECSOA for the g3 model is worse only than that of the SA_GA, and the optimal fitness value of the ECSOA for the g4 model is worse only than that of the PSO algorithm. With regard to the standard deviation, for the g1 model, the ECSOA is worse only than the SA_GA, SCA, and MVO; for the g2 and g3 models, the ECSOA is worse only than the SA_GA; and for the g4 model, the ECSOA is worse only than the MVO. With regard to the mean results, the ECSOA is better than the others except on g1 and g4: for the g1 model, the ECSOA is worse only than the SCA, and for the g4 model, the ECSOA is worse only than the MVO. According to the optimal fitness value mean rank and the overall rank results in Table 8, the ECSOA can find good solutions and has very strong robustness for the PID controller parameter optimization problems.

4.3.3. Convergence Curves Comparison of Algorithms in PID Controller Parameter Optimization

Figure 8 shows the fitness curves of PID controller parameter optimization for g1–g4. The comparison between the seven algorithms in Figure 8 shows that the convergence of the ECSOA is fast and the precision of the ECSOA is the best. The ECSOA can find the optimal value.

4.3.4. ANOVA Tests Comparison of Algorithms in PID Controller Parameter Optimization

Figure 9 shows the ANOVA of the global best values of the PID controller parameter optimization for g1–g4. As seen from Figure 9, the ECSOA is the most robust algorithm.

4.3.5. Unit Step Function PID Controller Parameter Optimization

Figure 10 shows the unit step function PID controller parameter optimization for g1–g4. As seen from Figure 10, the ECSOA is used to optimize the unit step function PID controller parameters of g1–g4, and the unit step functions tend to stabilize very quickly and accurately.

Therefore, the ECSOA is an effective and feasible solution in the control system models optimizing PID parameters.

4.4. Algorithm Performance Comparison in Constrained Engineering Optimization Problems

We use six constrained engineering problems to further test the capability of the ECSOA. These constrained engineering problems are very popular in the literature. A penalty function is used to handle the constraints. The parameter settings of all of the heuristic algorithms still adopt those in Table 4 of Section 4.2.3. The formulations of these problems are available in the appendix.
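The paper states that a penalty function is used but does not give its form; the quadratic static penalty below is a common choice and is assumed here for illustration.

```python
import numpy as np

def penalized(f, constraints, x, rho=1e6):
    """Static penalty: add rho * sum(max(0, g_i(x))^2) to the objective,
    where each constraint is expressed as g_i(x) <= 0. (The penalty form
    and the weight rho are assumptions, not the paper's exact method.)"""
    violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return f(x) + rho * violation

# Toy usage: minimize x0 + x1 subject to g(x) = 1 - x0*x1 <= 0.
f = lambda x: x[0] + x[1]
g = lambda x: 1.0 - x[0] * x[1]
print(penalized(f, [g], np.array([2.0, 0.4])))  # infeasible point -> large value
```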

4.4.1. Welded Beam Design Problem

This is a minimum fabrication cost problem with four parameters and seven constraints. The parameters of the structural system are shown in Figure 11 [7]. Some of the compared algorithms are taken from other literature: GSA [6], MFO [7], MVO [9], CPSO [45], and HS [46]. For this problem, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 9.

In Table 9, the ECSOA is better than the GSA, MFO, MVO, GA, CPSO, and HS algorithms from other literature. The ECSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the ECSOA can solve this problem.

4.4.2. Pressure Vessel Design Problem

This is also a minimum fabrication cost problem, with four parameters and four constraints. The parameters of the structural system are shown in Figure 12 [7]. Some of the compared algorithms are taken from other literature: MFO [7], ES [47], DE [48], ACO [49], and GA [50]. For this problem, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 10.

For this problem, the ECSOA is better than the MFO, ES, DE, ACO, and GA algorithms from other literature. The ECSOA is also better than the PSO, SA_GA, GSA, SCA, and MVO. There is not much difference between the optimal value of the ECSOA and that of the SOA. Therefore, the ECSOA can solve this problem.

4.4.3. Cantilever Beam Design Problem

This problem has five parameters and is constrained only by the variable bounds. The parameters of the structural system are shown in Figure 13 [7]. Some of the compared algorithms are taken from other literature: MFO [7], CS [51], GCA [52], MMA [52], and SOS [53]. For this problem, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 11.

In Table 11, the ECSOA proves to be better than the MFO, CS, GCA, MMA, and SOS algorithms from other literature. The ECSOA is also better than the PSO, SA_GA, GSA, SCA, and MVO. There is not much difference between the optimal value of the ECSOA and that of the SOA. Therefore, the ECSOA can solve this problem.

4.4.4. Gear Train Design Problem

This is a minimum gear ratio problem, which has four variables and is constrained only by the variable bounds. Figure 14 shows the schematic diagram [7]. Some of the compared algorithms are taken from other literature: MFO [7], MVO [9], CS [51], ABC [54], and MBA [54]. For this problem, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 12.

In Table 12, the ECSOA proves to be better than the MFO, MVO, CS, ABC, and MBA algorithms from other literature. The ECSOA is also better than the SCA, the MVO, and the SOA, although its optimum is worse than those of the SA_GA, GSA, and PSO. Nevertheless, the result of the ECSOA reaches the theoretical best solution, and the ECSOA finds a new solution vector. Therefore, the ECSOA can solve this problem.

4.4.5. Three-Bar Truss Design Problem

This is a weight minimization problem under stress constraints, which has two variables. Figure 15 shows the schematic diagram of the components [7]. Some of the compared algorithms are taken from other literature: MFO [7], MVO [9], CS [51], MBA [54], and DEDS [55]. For this problem, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best values are given in Table 13.

In Table 13, the ECSOA is better than all algorithms except the MVO and the PSO. Although the optimum of the ECSOA is worse than those of the MVO and the PSO, the best value of the ECSOA reaches the theoretical best solution. Therefore, the ECSOA can solve this problem.

4.4.6. I-Beam Design Problem

This is a vertical deflection minimization problem that has four variables and one constraint. Figure 16 shows the design diagram [7]. Some of the compared algorithms are taken from other literature: MFO [7], CS [51], SOS [53], IARSM [56], and ARSM [56]. For this problem, the ECSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 14.

In Table 14, the ECSOA is better than all algorithms except the MFO, GSA, SOA, and SA_GA. The fitness of the MFO is the best. Although the minimum vertical deflection of the ECSOA is not as good as those of the GSA, the SOA, and the SA_GA, it is very close to the other relative optimal values. Therefore, the ECSOA is an effective and feasible solution to the I-beam design optimization problem.

In brief, the ECSOA proves to be better than the other algorithms on most of the problems studied. The ECSOA can solve these practical problems.

5. Conclusion

An ECSOA is presented, with completely elastic collision, completely inelastic collision, and non-completely elastic collision strategies. The ECSOA was tested in four phases from different perspectives: the comparison with other improved SOAs, benchmark function optimization, PID control parameter optimization problems, and constrained engineering problems.

In the first phase, the SOA is improved in seven different ways: the SOA based on parameter change (PCSOA), the SOA based on parameter adaptive Gaussian transform (PAGTSOA), the SOA based on Levy variation (LVSOA), the SOA based on the refraction reverse learning mechanism (RRLSOA), the SOA based on the mutual benefit factor strategy (MBFSOA), the SOA based on Cauchy variation (CVSOA), and the SOA based on elastic collision (ECSOA). Each improved algorithm was tested on the fifteen functions. The result is that the ECSOA is feasible on the benchmark functions. In this phase, we considered the mean values, standard deviation values, best fitness values, and best fitness value ranks over 30 independent runs, as well as the convergence curves and the variance tests for the global minimum values.

In the second phase, fifteen benchmark function optimization problems were used to test the ECSOA further. The ECSOA was compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA for verification. It was observed that the ECSOA is feasible and competitive on benchmark functions. The second test phase also considered the mean values, standard deviation values, best fitness values, and best fitness value ranks over 30 independent runs, as well as the convergence curves and the variance tests for the global minimum values. For the benchmark function optimization problems, the complexity of the ECSOA was analyzed, and the overall computational complexity of the ECSOA is almost the same as that of the basic SOA. Wilcoxon’s rank-sum test was studied, and the ECSOA proved to be better than the other six algorithms. Based on the run time comparison of the seven algorithms on the benchmark functions, the ECSOA requires relatively more program running time and is not optimal in this respect. From the results of the performance ratios of the average solution for the seven algorithms, the optimization probability of the ECSOA is the highest.

In the third phase, the four PID control parameter optimization models were used to test the ECSOA further in practice. The problems were a parameter optimization model of a second-order PID controller without time delay, a parameter optimization model of a PID controller with a first-order small time delay, a parameter optimization model of a first-order PID controller with a significant time delay, and a parameter optimization model of a high-order PID controller without time delay. The third test phase also considered the mean values, standard deviation values, best fitness values, and best fitness value ranks over 30 independent runs, the convergence curves, and the ANOVA tests. From the results of the PID parameter optimization problems, in which the ECSOA was compared with various algorithms, the ECSOA is effective and feasible in practical problems.

Eventually, in the last phase, six engineering problems further tested the ECSOA. The ECSOA was compared with various algorithms. The results prove that the ECSOA is a highly competitive algorithm for practical optimization problems.

According to the comparative analysis of the experiments, the conclusions are as follows:
(1) The elastic collision strategy includes the completely elastic collision, the completely inelastic collision, and the non-completely elastic collision. The three different collision situations tend to generate random seekers, increase the diversity of the seekers, enlarge the search space, and avoid premature convergence.
(2) Among the eight algorithms (SOA, PCSOA, PAGTSOA, LVSOA, RRLSOA, MBFSOA, CVSOA, and ECSOA), the ECSOA performed best in the benchmark function tests.
(3) Among the seven algorithms (PSO, SA_GA, GSA, SCA, MVO, SOA, and ECSOA), the ECSOA has the highest optimization capability on the benchmark functions.
(4) On the benchmark functions, the ECSOA has almost the same computational complexity as the SOA.
(5) The running time of the ECSOA on the benchmark functions is relatively high; among the seven compared algorithms, its running time is better only than those of the GSA and the SA_GA.
(6) The ECSOA can solve real challenging problems, such as the PID control parameter optimization problems and the classical constrained engineering optimization problems.
(7) Further improvements and applications can be incorporated into future studies. The improved SOA and the heuristic algorithms based on these improvement strategies can be applied not only to engineering optimization problems but also to path planning, pattern recognition, intelligent control, and other fields, as well as to many practical optimization problems that cannot be solved by traditional methods. Besides the methods used in this paper, some representative computational intelligence algorithms can be used to solve these problems, such as the MBO, EHO, MS, SMA, and HHO.

Appendix

A. Welded Beam Design Problem

Consider $\vec{x} = [x_1\ x_2\ x_3\ x_4] = [h\ l\ t\ b]$, and minimize the fabrication cost

$$f(\vec{x}) = 1.10471\,x_1^2 x_2 + 0.04811\,x_3 x_4\,(14.0 + x_2),$$

subject to the seven constraints on shear stress τ, bending stress σ, buckling load Pc, end deflection δ, and side limits given in [7].

Variable ranges are $0.1 \le x_1 \le 2$, $0.1 \le x_2 \le 10$, $0.1 \le x_3 \le 10$, and $0.1 \le x_4 \le 2$, where P = 6000 lb, L = 14 in, E = 30 × 10⁶ psi, G = 12 × 10⁶ psi, τmax = 13,600 psi, σmax = 30,000 psi, and δmax = 0.25 in.

B. Pressure Vessel Design Problem

Consider $\vec{x} = [x_1\ x_2\ x_3\ x_4] = [T_s\ T_h\ R\ L]$, and minimize

$$f(\vec{x}) = 0.6224\,x_1 x_3 x_4 + 1.7781\,x_2 x_3^2 + 3.1661\,x_1^2 x_4 + 19.84\,x_1^2 x_3,$$

subject to

$$g_1 = -x_1 + 0.0193\,x_3 \le 0, \qquad g_2 = -x_2 + 0.00954\,x_3 \le 0,$$
$$g_3 = -\pi x_3^2 x_4 - \tfrac{4}{3}\pi x_3^3 + 1296000 \le 0, \qquad g_4 = x_4 - 240 \le 0.$$

Variable ranges are $0 \le x_1 \le 99$, $0 \le x_2 \le 99$, $10 \le x_3 \le 200$, and $10 \le x_4 \le 200$.

C. Cantilever Design Problem

Consider $\vec{x} = [x_1\ x_2\ x_3\ x_4\ x_5]$, and minimize

$$f(\vec{x}) = 0.0624\,(x_1 + x_2 + x_3 + x_4 + x_5),$$

subject to

$$g(\vec{x}) = \frac{61}{x_1^3} + \frac{37}{x_2^3} + \frac{19}{x_3^3} + \frac{7}{x_4^3} + \frac{1}{x_5^3} - 1 \le 0.$$

Variable ranges are $0.01 \le x_i \le 100$ for $i = 1, \dots, 5$.

D. Gear Train Design Problem

Consider $\vec{x} = [x_1\ x_2\ x_3\ x_4] = [n_A\ n_B\ n_C\ n_D]$, and minimize

$$f(\vec{x}) = \left(\frac{1}{6.931} - \frac{x_3 x_2}{x_1 x_4}\right)^{2}.$$

Variable ranges are integers $12 \le x_1, x_2, x_3, x_4 \le 60$.

E. Three-Bar Truss Design Problem

Consider $\vec{x} = [x_1\ x_2]$, and minimize $f(\vec{x}) = (2\sqrt{2}\,x_1 + x_2)\cdot l$, subject to

$$g_1 = \frac{\sqrt{2}\,x_1 + x_2}{\sqrt{2}\,x_1^2 + 2 x_1 x_2}\,P - \sigma \le 0, \qquad g_2 = \frac{x_2}{\sqrt{2}\,x_1^2 + 2 x_1 x_2}\,P - \sigma \le 0, \qquad g_3 = \frac{1}{\sqrt{2}\,x_2 + x_1}\,P - \sigma \le 0.$$

Variable ranges are $0 \le x_1 \le 1$ and $0 \le x_2 \le 1$, where $l = 100\ \text{cm}$, $P = 2\ \text{kN/cm}^2$, and $\sigma = 2\ \text{kN/cm}^2$.

F. I-Beam Design Problem

Consider $\vec{x} = [x_1\ x_2\ x_3\ x_4] = [h\ b\ t_w\ t_f]$, and minimize the vertical deflection

$$f(\vec{x}) = \frac{5000}{\dfrac{x_3 (x_1 - 2x_4)^3}{12} + \dfrac{x_2 x_4^3}{6} + 2 x_2 x_4 \left(\dfrac{x_1 - x_4}{2}\right)^{2}},$$

subject to the cross-sectional area constraint

$$g(\vec{x}) = 2 x_2 x_4 + x_3 (x_1 - 2 x_4) \le 300.$$

Variable ranges are $10 \le x_1 \le 80$, $10 \le x_2 \le 50$, $0.9 \le x_3 \le 5$, and $0.9 \le x_4 \le 5$.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant nos. 51766005 and 52166001) and the Science and Technology Project of Yunnan Tobacco Company of China (Grant no. 2019530000241019).