Abstract

This article proposes a complex-valued encoding multichain seeker optimization algorithm (CMSOA) for engineering optimization problems. The complex-valued encoding strategy and the multichain strategy are introduced into the seeker optimization algorithm (SOA). These strategies enhance the diversity of individuals, strengthen the local search, help avoid falling into local optima, and are effective global optimization strategies. Fifteen benchmark functions, four proportional integral derivative (PID) controller parameter models, and six constrained engineering problems are chosen as test cases. The experimental results show that the CMSOA is applicable to the benchmark functions, to PID controller parameter optimization, and to constrained engineering optimization problems. Compared with particle swarm optimization (PSO), simulated annealing based on genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the seeker optimization algorithm (SOA), the CMSOA shows better optimization ability and robustness.

1. Introduction

Recently, heuristic algorithms have received a great deal of attention. Such algorithms apply stochastic search methods to many optimization problems. According to the no free lunch (NFL) theorem, no single optimization algorithm can solve all problems best [1]. Therefore, researchers propose new algorithms or enhance existing ones to deal with optimization problems. Existing algorithms include the genetic algorithm (GA) [2], particle swarm optimization (PSO) [3], simulated annealing (SA) [4], harmony search (HS) [5], gravitational search algorithm (GSA) [6], moth-flame optimization (MFO) [7], sine cosine algorithm (SCA) [8], multiverse optimizer (MVO) [9], seeker optimization algorithm (SOA) [10], monarch butterfly optimization (MBO) [11], slime mould algorithm (SMA) [12], moth search algorithm (MSA) [13], hunger games search (HGS) [14], Runge–Kutta method (RUN) [15], and Harris hawks optimization (HHO) [16].

However, some optimization algorithms still perform poorly on many optimization problems, suffering from premature convergence, low optimization precision, entrapment in local optima, slow convergence speed, and insufficient robustness. To overcome these issues, several improved algorithms have proven feasible and have been applied in practical engineering. For instance, evolutionary algorithms have been improved by adaptive parameter control methods [17]. A simulated annealing algorithm based on the particle swarm algorithm was adopted to optimize extracting multiple tests [18]. A whale optimization algorithm based on a hybrid framework with learning and complementary strategies has been used in function optimization and engineering design problems [19]. A multilayered gravitational search algorithm has been used in function optimization and real-world problems [20]. An artificial bee colony algorithm was improved by scale-free networks [21]. A chaotic local search-based differential evolution algorithm has been applied to function optimization and real-world optimization problems [22].

Complex-valued encoding heuristic algorithms have also been proposed according to the characteristics of some algorithms. These complex-valued encoding intelligent optimization algorithms have proven feasible and have been used in practical engineering. For instance, a complex-valued encoding dragonfly algorithm optimized power systems [23]. A gray wolf optimizer based on complex encoding optimized a filter model [24]. A complex-valued encoding satin bowerbird optimization algorithm solved benchmark functions [25]. A complex-valued encoding-driven optimization algorithm optimized the 0-1 knapsack problem [26]. A complex-valued encoding symbiotic organism search algorithm was proposed for global optimization [27]. A complex-valued encoding flower pollination algorithm optimized constrained engineering optimization problems [28]. A comprehensive survey of complex-valued encoding metaheuristic optimization algorithms has also been offered [24].

Dai et al. proposed the SOA in 2006 [29]; its goal is to mimic the behavior of human seekers and the way they exchange information in order to solve practical optimization problems. The SOA has since been used in many fields, such as unconstrained optimization problems [30], optimal reactive power dispatch [31], a challenging set of benchmark problems [32], digital filter design [33], optimizing parameters of artificial neural networks [34], optimizing the model and structures of fuel cells [35], the novel human group optimizer algorithm [36], and several practical applications [37].

However, although the SOA converges quickly in the initial stage of an optimization run, once all individuals draw near to the best individual, the population loses diversity and falls into premature convergence.

To overcome the shortcomings of the SOA, various improvement strategies exist, such as the best empirical parameter strategy, the dynamic adaptive Gaussian variation of the empirical parameter, the Chebyshev chaos of order three, the real-coded double-chain strategy, the complex-valued encoding strategy, and the complex-valued encoding multichain strategy. After improving the SOA with these strategies and comparing them experimentally, this paper selects the strategies with the best results and combines them. In this article, complex-valued encoding and a multichain strategy are used to enhance the global optimization and the local search, and we propose the complex-valued encoding multichain seeker optimization algorithm (CMSOA). The multichain strategy comprises the complex-valued multichain and the stochastic complex multichain strategies. The CMSOA has been tested on fifteen benchmark functions, four PID controller parameter optimization problems, and six engineering optimization problems taken from the literature. Compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA, the CMSOA finds better solutions, with better precision and robustness. The complex-valued encoding and multichain methods enhance the diversity of individuals and avert premature convergence, thereby overcoming the premature convergence of the SOA. The advantages of the CMSOA are summarized as follows:
(1) The CMSOA is proposed to enhance the precision and robustness of optimization.
(2) With the complex-coded multichain strategy, the real part, imaginary part, and converted real number of the complex-valued coding are used as parallel individual variables to solve the objective function problem.
(3) The stochastic multichain strategy is introduced into the SOA: following the initial-solution generation rule of the complex number coding, a real part, an imaginary part, and a real number are randomly generated as additional parallel individual variables to solve the objective function.
(4) Together, the complex-coded strategy, the multichain strategy, and the stochastic multichain strategy improve the diversity of individuals, enhance the local search, and avert premature convergence.

The rest of the article is organized as follows. Section 2 presents the SOA and the algorithm improvement strategies. Section 3 describes the CMSOA. Section 4 presents the optimization experiments, results, and analyses. Finally, Section 5 gives some conclusions.

2. The Basic SOA and Algorithm Improvement Strategies

The SOA is based on an in-depth study of human search behavior. It treats optimization as a search for an optimal solution by a search team in a search space, taking the search team as the population and each searcher's site as a candidate solution. The "experience gradient" determines the search direction, and uncertainty reasoning determines the search step size; together, the direction and step size update each searcher's position in the search space and drive the optimization.

2.1. Key Update Points for SOA

The SOA has three main updating steps. In this section, i indexes the ith searcher and j the dimension; s is the total number of individuals; D is the dimensionality of the variable; t is the current generation; and itermax is the maximum number of generations. xij(t) and xij(t + 1) denote the searcher's position at generations t and t + 1, respectively.

2.1.1. Search Direction

The forward direction of a search is determined by the experience gradient obtained from the individual's own movement and from the evaluation of other individuals' historical search positions. The egoistic direction $\vec{d}_{i,\mathrm{ego}}(t)$, altruistic direction $\vec{d}_{i,\mathrm{alt}}(t)$, and preemptive direction $\vec{d}_{i,\mathrm{pro}}(t)$ of the ith individual in any dimension can be obtained as

$$\vec{d}_{i,\mathrm{ego}}(t)=\vec{p}_{i,\mathrm{best}}-\vec{x}_i(t),\quad \vec{d}_{i,\mathrm{alt}}(t)=\vec{g}_{i,\mathrm{best}}-\vec{x}_i(t),\quad \vec{d}_{i,\mathrm{pro}}(t)=\vec{x}_i(t_1)-\vec{x}_i(t_2). \tag{1}$$

The searcher uses a random weighted average to obtain the search direction:

$$\vec{f}_i(t)=\operatorname{sign}\bigl(\omega\,\vec{d}_{i,\mathrm{pro}}(t)+\psi_1\,\vec{d}_{i,\mathrm{ego}}(t)+\psi_2\,\vec{d}_{i,\mathrm{alt}}(t)\bigr), \tag{2}$$

where t1, t2 ∈ {t, t − 1, t − 2}; $\vec{g}_{i,\mathrm{best}}$ is the historical best position in the neighborhood of the ith searcher; $\vec{p}_{i,\mathrm{best}}$ is the best position the ith searcher has reached so far; ψ1 and ψ2 are random numbers in [0, 1]; and ω is the inertia weight.

2.1.2. Search Step Size

The SOA draws on the approximation capability of fuzzy reasoning: through computer language, it encodes simple natural-language rules and can thereby simulate intelligent human search behavior. The fuzzy rule used to approximate the objective optimization problem is that a greater fitness corresponds to a greater search step length and a smaller fitness to a smaller search step length. A Gaussian membership function is adopted to describe the search step size:

$$\mu(\alpha)=e^{-\alpha^{2}/(2\delta^{2})}, \tag{3}$$

where α and δ are the parameters of the membership function.

According to equation (3), the probability of the output variable exceeding [−3δ, 3δ] is less than 0.0111; therefore, μmin = 0.0111. Under normal circumstances, the optimal individual has μmax = 1.0 and the worst individual has μ = 0.0111. However, to accelerate convergence and give the optimal individual an uncertain step size, μmax is set to 0.9 in this paper. The following functions define the fuzzy variable for a "small" objective function value:

$$\mu_i=\mu_{\max}-\frac{s-I_i}{s-1}\left(\mu_{\max}-\mu_{\min}\right), \tag{4}$$

$$\mu_{ij}=\operatorname{rand}(\mu_i,1),\qquad j=1,2,\ldots,D, \tag{5}$$

where μij is determined by equations (4) and (5) and Ii is the rank of the current individual xi(t) when the population is sorted from high to low by function value. The function rand(μi, 1) returns a uniform random real number in the interval [μi, 1]; equation (5) thus simulates the random search behavior of human beings. The step size in the jth dimension of the search space is determined by

$$\alpha_{ij}=\delta_{ij}\sqrt{-\ln\left(\mu_{ij}\right)}, \tag{6}$$

where δij is a parameter of the Gaussian distribution function, defined by

$$\delta_{ij}=\omega\left|x_{\min}-x_{\max}\right|, \tag{7}$$

$$\omega=0.9-0.8\,\frac{t}{\mathrm{iter}_{\max}}, \tag{8}$$

where ω is the inertia weight, which decreases linearly from 0.9 to 0.1 as the generation increases, and x_min and x_max are, respectively, the positions of the individuals with the minimum and maximum function values.

2.1.3. Individual Location Updates

After obtaining the search direction and search step size of an individual, the position update is given by

$$x_{ij}(t+1)=x_{ij}(t)+\alpha_{ij}(t)\,f_{ij}(t), \tag{9}$$

where fij(t) and αij(t) are, respectively, the searcher's search direction and search step size at generation t.
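To make the three update steps concrete, the following Python sketch assembles equations (1)–(9) into one generation of the basic SOA. It is an illustrative reading of the formulas above, not the authors' reference implementation: the preemptive direction is replaced by a random stand-in because a single-step function has no position history, and the neighborhood best is taken as the global best.

```python
import numpy as np

def soa_step(X, f, p_best, g_best, t, iter_max, rng,
             mu_max=0.9, mu_min=0.0111):
    """One generation of the basic SOA: a sketch of equations (1)-(9).

    X      : (s, D) array of current positions
    f      : objective function mapping a (D,) vector to a float
    p_best : (s, D) personal best positions (p_i,best)
    g_best : (D,) neighborhood best position (here: global best)
    """
    s, D = X.shape
    w = 0.9 - 0.8 * t / iter_max                 # inertia weight, eq. (8)

    # Search direction, eqs. (1)-(2).
    d_ego = p_best - X                           # egoistic direction
    d_alt = g_best - X                           # altruistic direction
    d_pro = rng.uniform(-1, 1, X.shape)          # stand-in for x(t1) - x(t2)
    psi1 = rng.random((s, 1))
    psi2 = rng.random((s, 1))
    F = np.sign(w * d_pro + psi1 * d_ego + psi2 * d_alt)

    # Fuzzy step size, eqs. (3)-(7).
    fit = np.array([f(x) for x in X])
    order = np.argsort(-fit)                     # sort high-to-low by function value
    I = np.empty(s)
    I[order] = np.arange(1, s + 1)               # rank I_i (worst individual = 1)
    mu_i = mu_max - (s - I) / (s - 1) * (mu_max - mu_min)      # eq. (4)
    mu_ij = rng.uniform(mu_i[:, None], 1.0, X.shape)           # eq. (5)
    delta = w * np.abs(X[np.argmin(fit)] - X[np.argmax(fit)])  # eq. (7)
    alpha = delta * np.sqrt(-np.log(mu_ij))      # eq. (6)

    return X + alpha * F                         # position update, eq. (9)
```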

2.2. The Algorithm Improvement Strategies

Five strategies for improving the algorithm are listed in this paper.

2.2.1. The Best Empirical Parameter Strategy

The first strategy changes an empirical parameter. In the basic SOA, equation (8) is replaced by equation (10), in which the empirical value C is fixed; through a large number of experimental tests, C = 0.2 was chosen. The individual position update is still given by equation (9). In equation (10), δij is a parameter of the Gaussian membership function [38, 39] and x_min is the position of the individual with the minimum function value.

2.2.2. The Dynamic Adaptive Gaussian Variation of Empirical Parameter

In this strategy, equation (8) of the SOA is replaced by equation (11), and the empirical value C1 becomes adaptive, varying between 0.1 and 0.5 with the optimization generation according to equation (12). The individual position update is still given by equation (9). In equation (11), δij is a parameter of the Gaussian membership function [38, 39] and x_min is the position of the individual with the minimum function value.

2.2.3. The Chebyshev Chaos of Order Three

The Chebyshev map of order k is defined as

$$x_{ij}(t+1)=\cos\bigl(k\arccos(x_{ij}(t))\bigr),\qquad x_{ij}\in[-1,1],$$

where, for k ≥ 2 (k = 3 in this paper), the sequence xij is chaotic and ergodic and has orthogonality. In this case, no matter how close two different initial values are, the sequences derived from repeated iteration are uncorrelated with each other.
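For illustration, a minimal sketch of the order-3 Chebyshev map follows; the initial values and sequence length are arbitrary choices of ours.

```python
import numpy as np

def chebyshev_sequence(x0, n, k=3):
    """Iterate the order-k Chebyshev map x <- cos(k * arccos(x)).

    x0 must lie in [-1, 1]; for k >= 2 the sequence is chaotic and ergodic.
    """
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = np.cos(k * np.arccos(x))
        seq[i] = x
    return seq

# Two nearby starting points decorrelate after a few iterations:
a = chebyshev_sequence(0.3000, 10)
b = chebyshev_sequence(0.3001, 10)
```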

2.2.4. The Multichain Strategy/the Double-Chain Strategy

The multichain strategy consists of taking the real and imaginary parts of the complex values as separate parallel solutions and of randomly generating additional parallel solutions according to the complex number coding law.

In this paper, the multichain strategy means that a single individual variable of the original SOA is converted into six parallel individual variables when the CMSOA optimizes a problem. In complex-valued coding there are the real part XR, the imaginary part XI, and the converted real number XK. In each iteration, XR, XI, and XK are adjusted to satisfy the range of X (Xmin = Ak, Xmax = Bk) and are each taken as relative optimal solution variables to evaluate the objective function. Secondly, a group of randomly generated variables XR_Random, XI_Random, and XK_Random, produced according to formulas (9)–(11) and also satisfying the range of X (Xmin = Ak, Xmax = Bk), is added in each cycle and likewise evaluated as relative optimal solution variables. At the end of each iteration, the respective optimal solutions are saved, and after comparing them, the global optimal value is saved as the current optimum. The next-generation optimal solution variables XR, XI, and XK are updated according to formulas (13)–(15), while the next-generation variables XR_Random, XI_Random, and XK_Random are generated randomly according to formulas (9)–(11). In other words, a single individual variable X of the original SOA becomes six individual variables XR, XI, XK, XR_Random, XI_Random, and XK_Random when solved by the CMSOA, as shown in Figure 1. Thus, instead of solving along one main chain, the algorithm solves along six parallel chains. The multichain strategy adds to the diversity of individuals, enhances the local search, and averts premature convergence.

For a real-coded SOA, the real-coded population is one chain and a randomly generated real-valued population is another chain, so a double chain is made up of a real-coded chain and a randomly generated real-valued chain.

2.2.5. The Complex-Valued Encoding

(1) Initial Population Generation. In light of the variable intervals [Ak, Bk], k = 1, 2, …, 2s − 1, 2s, the modules ρk, the phase angles θk, and the complex numbers are produced [40] as follows:

(2) Individual Location Updates. The real part is updated by equation (18), where fR and αR are, respectively, the search direction and search step size of the real parts, and XR represents the positions of the real parts.

The imaginary part is updated by equation (19), where fI and αI are, respectively, the search direction and search step size of the imaginary parts, and XI represents the positions of the imaginary parts.

(3) Fitness Evaluation Method. When calculating fitness values using the SOA, complex numbers are converted to real numbers in two steps: (1) take the modulus of the complex number as the magnitude of the real number (equation (20)); (2) determine the sign according to the phase angle (equation (21)), where Xk is the resulting real number.
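Since equations (16)–(21) are referenced but not reproduced above, the sketch below follows the conversion scheme commonly used in complex-valued encoding metaheuristics [40]: modules are drawn from [0, (Bk − Ak)/2] and phase angles from [−2π, 2π]; for fitness evaluation, the modulus supplies the magnitude and the phase angle the sign, after which the value is shifted to the center of [Ak, Bk]. The paper's exact formulas may differ in detail.

```python
import numpy as np

def init_complex_population(s, A, B, rng):
    """Initial complex-valued population (a sketch of the initialization step).

    A, B : (D,) arrays of lower/upper variable bounds.
    Returns real parts XR and imaginary parts XI, each of shape (s, D).
    """
    rho = rng.uniform(0.0, (B - A) / 2, (s, len(A)))        # modules rho_k
    theta = rng.uniform(-2 * np.pi, 2 * np.pi, rho.shape)   # phase angles theta_k
    return rho * np.cos(theta), rho * np.sin(theta)

def complex_to_real(XR, XI, A, B):
    """Convert complex individuals to real variables (a sketch of eqs. (20)-(21))."""
    rho = np.sqrt(XR**2 + XI**2)                 # modulus as magnitude
    ratio = np.divide(XI, rho, out=np.zeros_like(XI), where=rho > 0)
    sign = np.sign(np.sin(ratio))                # sign from the phase angle
    return rho * sign + (A + B) / 2              # shift into [A, B]
```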

3. The CMSOA Process

The chromosomes of complex organisms are regarded as double-stranded or multistranded structures. Since a complex value is made up of a real part and an imaginary part [26, 41–43], a complex value can be represented as a double chain. A double chain represents a chromosome pair, and the strands that make up the double chain have the same length. This two-strand framework enhances the diversity of individuals and gives the algorithm better search and computation capacity.

The CMSOA is based on a multiple-population evolution model: three populations are evolved by the SOA, and three other populations are generated at random. The populations use information-sharing mechanisms to realize coevolution. Algorithm 1 shows the primary process of the CMSOA.

(1)t ← 0
(2)Initialization: generate the initial population based on formulas (15)–(17).
(3)Convert plural into real numbers based on formulas (20) and (21).
(4)Adjust XR_CMSOA,G, XI_CMSOA,G, and XCMSOA,G to satisfy the range of X.
(5)Evaluate each seeker. Compute the fitness.
(6)While stopping condition is not satisfied
(6.1) Running process of the CMSOA
(6.1.1)  Renew the real parts by formula (18), XR_CMSOA,G.
(6.1.2)  Renew the imaginary parts based on formula (19), XI_CMSOA,G.
(6.1.3)  Convert plural into real number based on formulas (20) and (21), XCMSOA,G.
(6.1.4)  Adjust XR_CMSOA,G, XI_CMSOA,G, and XCMSOA,G to satisfy the range of X.
(6.1.5)  Apply the seeker strategy to obtain the search direction and search step size.
(6.1.6)  Calculate the fitness values f(XR_CMSOA,G), f(XI_CMSOA,G), f(XCMSOA,G).
(6.1.7)  Identify the best solution XCMSOAbest,G
   FCMSOA,G = min[f(XR_CMSOA,G), f(XI_CMSOA,G), f(XCMSOA,G)]
   XCMSOAbest,G = min(FCMSOA,G)
(6.2) Random generation and calculation
(6.2.1)  Generate a random initial population according to formulas (15)–(17).
(6.2.2)  Convert complex numbers into real numbers according to formulas (20) and (21).
(6.2.3)  Adjust XR_Random,G, XI_Random,G, and XRandom,G to satisfy the range of X.
(6.2.4)  Calculate the fitness values f(XR_Random,G), f(XI_Random,G), f(XRandom,G).
(6.2.5)  Identify the best value XRandombest,G
   FRandom,G = min[f(XR_Random,G), f(XI_Random,G), f(XRandom,G)]
   XRandombest,G = min(FRandom,G)
(6.3) Confirm the global best value Xbest
  If f(XCMSOAbest,G) ≤ f(XRandombest,G)
   Xbest = XCMSOAbest,G
  else Xbest = XRandombest,G
  end if
(7)t ← t + 1
(8)If t < itermax, then go to step (3); else stop.
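Algorithm 1 can be summarized in code as follows. The sketch mirrors the pseudocode: three chains (real parts, imaginary parts, and the converted real numbers) are evolved by the SOA while three counterpart chains are regenerated at random each generation, and the best of the six candidates becomes the global best. Here `soa_update(X, f, t)` stands in for the SOA renewal of steps (6.1.1)–(6.1.2) and is assumed to be supplied by the caller; `init_complex_population` and `complex_to_real` are the helpers sketched in Section 2.2.5.

```python
import numpy as np

def cmsoa(f, A, B, soa_update, s=30, iter_max=1000, seed=0):
    """Minimal sketch of the CMSOA main loop (Algorithm 1)."""
    rng = np.random.default_rng(seed)
    XR, XI = init_complex_population(s, A, B, rng)        # step (2)
    x_best, f_best = None, np.inf

    for t in range(iter_max):
        # (6.1) SOA-evolved chains; clip to the range of X as in (6.1.4).
        XR = np.clip(soa_update(XR, f, t), A, B)
        XI = np.clip(soa_update(XI, f, t), A, B)
        X = np.clip(complex_to_real(XR, XI, A, B), A, B)  # (6.1.3)

        # (6.2) Randomly regenerated chains.
        XR_r, XI_r = init_complex_population(s, A, B, rng)
        X_r = np.clip(complex_to_real(XR_r, XI_r, A, B), A, B)

        # (6.1.7), (6.2.5), (6.3): keep the best of all six chains.
        for chain in (XR, XI, X, XR_r, XI_r, X_r):
            fit = np.array([f(x) for x in chain])
            i = int(np.argmin(fit))
            if fit[i] < f_best:
                f_best, x_best = fit[i], chain[i].copy()

    return x_best, f_best
```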

4. Experimental Results

4.1. Experimental Setup

The algorithms in this paper were run under MATLAB R2016a on a computer with an Intel(R) Core(TM) i7-7500U CPU @ 2.70 GHz (2.90 GHz) and 8 GB of memory, running Windows 10.

4.2. Algorithm Performance Comparison in Benchmark Functions

To ensure a fair comparison of these algorithms, the population size of every algorithm is 30 and the number of generations is 1000. To further ensure fairness and reduce the effect of randomness, the results of 30 independent runs of each of the seven algorithms were compared.

4.2.1. The Benchmark Functions

In this field, it is common to assess the capability of algorithms on mathematical functions whose global optima are known. Fifteen benchmark functions from the literature are used as the comparative test platform [7, 10, 44–46]. Table 1 shows the functions used in the experiment. The number of variables is set to one thousand.

4.2.2. Algorithm Performance Comparison of the SOA with Different Improvement Methods

In this paper, the SOA is improved by six different methods: the parameter changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on the Chebyshev chaos of order three (CCSOA), the SOA based on real coding double-link (DSOA), the SOA based on complex-valued encoding (CSOA), and the complex-valued encoding multichain seeker optimization algorithm (CMSOA).

(1) Parameter Setting of the SOA with Different Improvement Methods. This section introduces the parameter settings of the improved SOAs used in the experiments. Dai et al. have done extensive research on the parameter settings of the SOA [33], and we performed many practical tests and comparative studies on the parameters. The specific parameters of the improved SOAs are shown in Table 2. In the next section, we use these improved SOAs for an experimental comparison and choose the relatively best improved algorithm to compare with other advanced intelligent algorithms.

(2) Improved Algorithm Performance Comparison in the Benchmark Functions. To test the performance of the six improved SOAs listed above, each improved algorithm was run on the fifteen functions in Table 1, with 30 independent runs per algorithm and function. The performance of the SOA and the improved SOAs was compared by the mean (Mean), standard deviation (Std.), best fitness (Best), program running time (Time), and best fitness rank (Rank) over the 30 runs. The best fitness reflects the optimization accuracy of an algorithm, the mean and standard deviation reflect its robustness, and the running time reflects the cost of the program. The results for functions f1–f15 are displayed in Table 3, where values in bold italics mark the better results.

Based on Table 3, for the benchmark functions f1–f15, the comparison between the six improved SOAs and the original SOA shows that the CMSOA obtains the best results: its mean (Mean), standard deviation (Std.), best fitness (Best), and best fitness rank (Rank) over 30 independent runs are the best. The total program running time of the CMSOA over f1–f15 ranks fifth among the seven algorithms compared; its running time is thus longer than that of most of the other algorithms. From the perspective of optimization accuracy and robustness, the CMSOA has the best optimization performance among the improved SOAs in this paper. Section 4.2.3 compares the CMSOA with other widely used intelligent optimization algorithms.

(3) Search History of the CMSOA. Figure 2 shows the surface of function f1, the convergence curve, the initial population's positions, and the search history, in which the positions visited by the seekers are marked with red "+" signs. As Figure 2 shows, the convergence curve of the CMSOA on f1 descends quickly. The search history shows that the seekers move extensively toward promising regions of the search space, exploring with changing step sizes and directions over time; this strengthens the local search, helps escape local optima, and avoids premature convergence.

Similarly, Figure 3 shows the corresponding plots for function f10. The convergence curve of the CMSOA is again fast, and the search history again shows the seekers moving extensively toward promising regions with changing step sizes and directions.

Figure 4 shows the corresponding plots for function f14, with the same behavior: fast convergence and an extensive, well-spread search history.

4.2.3. The Algorithm Performance Comparison of Different Algorithms in the Benchmark Functions

To test the performance of the CMSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA on the fifteen benchmark functions [7, 10, 44–46] in Table 1, which have been widely used for testing.

(1) The Parameter Setting of Different Algorithms. In this section, the parameter settings of the PSO [47], SA_GA [48], GSA [6], SCA [8], MVO [9], SOA [29], and CMSOA are presented. Following references [6, 8, 23, 29, 47, 48], we performed many practical tests and comparative studies, and the parameters of the seven algorithms were set to suitable values based on practical experience. Table 4 lists the parameters used in the test.

(2) The Result Comparison of Different Algorithms in Benchmark Functions. This section uses the same fifteen functions as in Table 1, with the dimensionality of the variables expanded to 1000. The mean values, standard deviations, best fitness values, and best fitness ranks of the algorithms over 30 independent runs on functions f1–f15 are shown in Table 5, where values in bold italics mark the better results.

For the benchmark functions f1–f15, based on Table 5, the best value of the CMSOA is better than that of the other algorithms except on f4, f7, f9, f10, f11, f14, and f15. On f9, the best value of the CMSOA reaches the theoretical optimum, although it is inferior to that of the PSO and the GSA. On f7, the best value of the CMSOA is worse only than that of the PSO. On f4, it is worse than that of the PSO, GSA, and SOA. On f15, it is worse only than that of the PSO; on f14, worse than that of the PSO and SOA; on f11, worse only than that of the SCA; and on f10, worse than that of the SOA and the MVO.

For the benchmark functions f1–f15, based on Table 5, the standard deviation of the CMSOA is better than that of the other algorithms except on f2, f3, f4, f7, f9, f10, f11, f12, and f14. On f9, the standard deviation of the CMSOA is worse only than that of the PSO; on f7, worse than that of the PSO, GSA, and SOA; on f4, worse than that of the SCA, GSA, SA_GA, PSO, and MVO; on f3, worse than that of the MVO and SOA; and on f2, worse only than that of the SOA. On f2, the PSO, SA_GA, and SCA produce no solution, and the GSA and MVO have an infinite standard deviation, so the CMSOA is better than these. On f14, the CMSOA is worse than the PSO and the GSA; on f11 and f12, worse than the PSO, SA_GA, GSA, MVO, and SOA; and on f10, worse than the PSO, SA_GA, GSA, SCA, and MVO.

For the benchmark functions f1–f15, based on Table 5, the mean value of the CMSOA is better than that of the other algorithms except on f3, f4, f7, f9, f10, and f14. On f9, the mean of the CMSOA reaches the theoretical optimum, although it is worse than that of the PSO. On f7, the mean of the CMSOA is worse than that of the PSO; on f4, worse than that of the PSO, GSA, and SOA; on f3, worse only than that of the GSA and SOA; on f10, worse than that of the MVO and SOA; and on f14, worse only than that of the PSO.

According to the mean rank of the best fitness values and the overall rank results in Table 5, the CMSOA finds good solutions and has strong optimization ability and robustness on the benchmark functions.

(3) Convergence Curve Comparison of Algorithms in the Benchmark Functions. Figure 5 shows the best-fitness convergence curves for the benchmark functions f1–f15 (D = 1000). As seen from Figure 5, compared with the other six algorithms, the CMSOA converges faster and more precisely, except on f4, f7, f9, f10, f14, and f15. Although the CMSOA is worse than the PSO on f9 in convergence and precision, it still reaches the theoretical optimum; on f7, the CMSOA is worse only than the PSO; on f4, worse than the PSO, GSA, and SOA; on f15, worse only than the SOA; on f14, worse than the SOA and the PSO; and on f10, worse than the MVO and SOA. Because the multichain strategy augments the diversity of individuals and the intensity of the local search, the CMSOA has better optimization performance.

(4) ANOVA Test Comparison of Algorithms in Benchmark Functions. Figure 6 shows the ANOVA of the global best values for the benchmark functions f1–f15 (D = 1000). As seen from Figure 6, the CMSOA is the most robust, except on f3, f4, f7, f10, f12, and f14. On f7, the ANOVA result of the CMSOA is worse only than that of the PSO; on f4, worse than that of the PSO, GSA, and SOA; on f3, worse only than that of the GSA and SOA; on f10, worse only than that of the MVO; and on f12 and f14, worse than that of the PSO and SOA. Overall, the CMSOA shows better robustness.

4.2.4. Complexity Analysis

The computational complexity of the basic SOA is O(N·D·M), where N is the number of individuals, D is the number of dimensions, and M is the maximum number of generations. The SOA phase of the algorithm costs O(N·D·M); the complex-valued coding strategy contributes O(N·D·M); and the multichain strategy contributes O(N·D·M). The overall complexity of the CMSOA is therefore O(N·D·M + N·D·M + N·D·M). Based on the principles of Big-O notation [49], if the number of generations is large (M ≫ N, D), this simplifies to O(N·D·M). Therefore, the overall computational complexity of the CMSOA is almost the same as that of the basic SOA.

4.2.5. Run Time Comparison of Algorithms in Benchmark Functions

In this section, we recorded the running time of each algorithm under the same conditions: population size 30, 1000 generations, and 30 independent runs on the fifteen benchmark functions f1–f15 (D = 1000). The running times on the fifteen functions were then summed to obtain each algorithm's total over the 30 independent runs, together with the ranking by total time, as shown in Table 6. As seen from Table 6, the PSO has the shortest running time, followed by the SCA; the CMSOA ranks sixth, with a relatively long running time; and at the bottom of the list is the SA_GA, which takes the longest.

To examine the running times of the seven algorithms on the fifteen functions more closely, a bar chart (Figure 7) was made of each algorithm's total time over 30 independent runs. From Figure 7, the running time of the PSO is the shortest and that of the SA_GA the longest, while the running time of the CMSOA, although relatively long, is less than half that of the SA_GA.

4.2.6. Exploration and Exploitation in Benchmark Functions

According to [50–52], formulas (22)–(25) quantify the exploration and exploitation capability of an algorithm, where median xj is the median of dimension j over the whole swarm, xij is dimension j of swarm individual i, n is the size of the swarm, Divj is the average distance of all individuals from the median in dimension j, Div is the diversity of the swarm in an iteration, Divmax is the maximum diversity over all iterations, and Xpl% and Xpt% are the exploration and exploitation percentages for an iteration, respectively.
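A direct transcription of these diversity measures into code, with variable names of our choosing, is as follows.

```python
import numpy as np

def exploration_exploitation(history):
    """Xpl% and Xpt% over iterations (a transcription of formulas (22)-(25)).

    history : list of (n, D) population snapshots, one per iteration.
    """
    div = np.array([
        np.mean(np.abs(np.median(X, axis=0) - X))    # mean over j of Div_j, eqs. (22)-(23)
        for X in history
    ])
    div_max = div.max()                              # Div_max over all iterations
    xpl = 100.0 * div / div_max                      # exploration percentage, eq. (24)
    xpt = 100.0 * np.abs(div - div_max) / div_max    # exploitation percentage, eq. (25)
    return xpl, xpt
```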

Figure 8 shows the exploration and exploitation behavior of the CMSOA as the number of iterations increases on the benchmark functions f1–f15. As observed from the curves plotted in Figure 8, the CMSOA maintains a good balance between the exploration and exploitation ratios as the number of iterations increases.

4.2.7. Performance Profiles of Algorithms in Benchmark Functions

The average fitness was selected as the capability index. Algorithmic capability is expressed in performance profiles, which are calculated by

$$r_{g,f}=\frac{m_{g,f}}{\min\{m_{g,f}\colon g\in G\}}, \tag{26}$$

$$\rho_g(\tau)=\frac{1}{n_f}\,\bigl|\{f\in F\colon r_{g,f}\le\tau\}\bigr|, \tag{27}$$

where g represents an algorithm, G is the algorithm set, f represents a function, F is the function set, ng is the number of algorithms in the experiment, nf is the number of functions in the experiment, m_{g,f} is the average fitness after algorithm g solves function f, r_{g,f} is the capability ratio, ρg(τ) is the algorithmic capability, and τ is a factor of the best probability [53].
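Concretely, with the mean fitness of each algorithm on each function in hand, the profile value ρg(τ) is the fraction of functions on which algorithm g's ratio rg,f is within a factor τ of the best. The sketch below assumes minimization with positive mean fitness values.

```python
import numpy as np

def performance_profiles(mean_fitness, taus):
    """Performance profiles (a sketch of formulas (26)-(27)).

    mean_fitness : (n_g, n_f) array, mean fitness of algorithm g on function f.
    taus         : iterable of threshold factors tau.
    Returns an (n_g, len(taus)) array of profile values rho_g(tau).
    """
    ratios = mean_fitness / mean_fitness.min(axis=0)          # r_{g,f}, eq. (26)
    return np.array([[np.mean(row <= tau) for tau in taus]    # rho_g(tau), eq. (27)
                     for row in ratios])
```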

Figure 9 shows the capability ratios of the mean fitness for the seven algorithms on the benchmark functions f1–f15 (D = 1000); the results are displayed on a log2 scale. As shown in Figure 9, the CMSOA has the highest probability. When τ = 1, the CMSOA is at about 0.6, better than the others. When τ = 4, the CMSOA is at about 0.87, the PSO at 0.53, the SOA at 0.40, and the GSA, MVO, SCA, and SA_GA each at 0.067. When τ = 12, the CMSOA is at 0.87, the PSO at 0.73, the SOA at 0.80, the GSA at 0.33, the MVO at 0.33, the SCA at 0.27, and the SA_GA at 0.2. The capability curve of the CMSOA lies above the others and reaches about 0.87 for τ ≥ 4. Thus, the CMSOA performs better than the other algorithms.

4.3. Algorithm Performance Comparison in PID Controller Parameter Optimization Problems

In this section, we use four test control system models, optimizing their PID parameters to test the capability of the CMSOA. For the first three models, the population size of all algorithms is 20 and the maximum number of generations is 20; the step response time is set to 10 s for two of these models and to 30 s for the third. For the fourth model, the population size of all algorithms is 50, the maximum number of generations is 50, and the step response time is set to 50 s.

4.3.1. Control System Models

Equations (28)–(31) define the test control system models whose PID parameters are optimized in our experiment. Figure 10 shows the process diagram for optimizing the PID parameters of a test control system with the CMSOA, and Figure 11 shows the structure of the PID parameter optimization model for the test control systems.
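The experiment pipeline of Figures 10 and 11 can be sketched as follows. Since the transfer functions (28)–(31) and the paper's fitness function are not reproduced here, the code uses a placeholder plant and an ITAE criterion as assumptions, and it relies on the third-party python-control package; none of these choices are taken from the paper.

```python
import numpy as np
import control  # third-party python-control package (an assumption)

def pid_fitness(params, plant, T=np.linspace(0, 10, 1000)):
    """Fitness of PID gains on a plant: ITAE of the unit-step error (illustrative)."""
    Kp, Ki, Kd = params
    pid = control.tf([Kd, Kp, Ki], [1, 0])       # C(s) = Kd*s + Kp + Ki/s
    closed = control.feedback(pid * plant, 1)    # unity-feedback closed loop
    t, y = control.step_response(closed, T)
    return np.trapz(t * np.abs(1.0 - y), t)      # ITAE of the step error

# Placeholder second-order plant; NOT one of the paper's models g1-g4:
plant = control.tf([1], [1, 1, 1])
# An optimizer such as the CMSOA would minimize pid_fitness over (Kp, Ki, Kd):
# fitness = pid_fitness([1.0, 0.5, 0.2], plant)
```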

4.3.2. Result Comparison of Algorithms in PID Controller Parameter Optimization

To test the capability of the CMSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA on the PID controller parameter optimization problems. The mean values, standard deviations, best fitness values, and best fitness ranks over 30 independent runs on the four models are displayed in Table 7, where values in bold italics mark the better results.

For the PID controller parameter optimization problems, according to Table 7, the best fitness of the CMSOA is better than that of the other algorithms on all but two of the models: on one model the CMSOA is worse only than the SA_GA, and on the g4 model it is worse only than the PSO. In terms of standard deviation, the CMSOA is better than the others on all but two of the models, on which it is worse only than the SA_GA. In terms of mean, the CMSOA is better than the others on every model. According to the mean rank of the best fitness values and the overall rank results in Table 7, the CMSOA finds good solutions and is very robust on the PID controller parameter optimization problems.

4.3.3. The Convergence Curve Comparison of Algorithms in PID Controller Parameter Optimization

Figure 12 shows the fitness convergence curves of the PID controller parameter optimization for the four models. As shown in Figure 12, compared with the other six algorithms, the CMSOA converges quickly and with the best precision, and it can find the optimal value.

4.3.4. ANOVA Test Comparison of Algorithms in the PID Controller Parameter Optimization

Figure 13 shows the ANOVA of the global best values of the PID controller parameter optimization for the four models. As seen from Figure 13, the CMSOA is the most robust of the compared algorithms.

4.3.5. The Unit Step Functions of PID Controller Parameter Optimization

Figure 14 shows the unit step responses of the four models under the PID controller parameters optimized by the CMSOA. As seen from Figure 14, the step responses settle quickly and accurately.

Therefore, the CMSOA is an effective and feasible solution for optimizing the PID parameters of control system models.

4.4. Algorithm Performance Comparison in Constrained Engineering Optimization Problems

We use six engineering problems to test the capability of the CMSOA further. These problems are very popular in the literature. A penalty function is used to handle the constraints. The parameter settings of all the heuristic algorithms follow Table 4 of Section 4.2.3. The formulations of these problems are given in the Appendix.
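The article states only that a penalty function is used; a common static form, sketched below with a penalty coefficient of our choosing, adds a large multiple of the squared constraint violations to the objective so that an unconstrained optimizer such as the CMSOA can be applied directly.

```python
def penalized(f, constraints, rho=1e6):
    """Static penalty wrapper for constraints g_i(x) <= 0 (illustrative).

    f           : objective function to minimize
    constraints : list of functions g_i; x is feasible iff all g_i(x) <= 0
    rho         : penalty coefficient (an assumed value; not from the paper)
    """
    def fp(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + rho * violation
    return fp

# Example: minimize x0^2 + x1^2 subject to x0 + x1 >= 1:
fp = penalized(lambda x: x[0]**2 + x[1]**2, [lambda x: 1.0 - x[0] - x[1]])
```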

4.4.1. Welded Beam Design Problem

This is a fabrication cost minimization problem with four parameters and seven constraints. The parameters of the structure are shown in Figure 15 [9]. Previous work on this problem includes the GSA [6], MFO [7], MVO [9], coevolutionary particle swarm optimization (CPSO) [54], and harmony search (HS) [55]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 8.

In Table 8, the CMSOA is better than the GSA, MFO, MVO, GA, CPSO, and HS algorithms. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.

4.4.2. Pressure Vessel Design Problem

This is also a fabrication cost minimization problem, with four parameters and four constraints. The parameters of the structure are shown in Figure 16 [9]. Previous work includes the MFO [7], evolution strategies (ES) [56], differential evolution (DE) [57], ant colony optimization (ACO) [58], and the GA [59]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 9.

In Table 9, the CMSOA is better than the MFO, ES, DE, ACO, and GA. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.

4.4.3. Cantilever Beam Design Problem

This problem is determined by five parameters and is subject only to bound constraints on the variables. The parameters of the structure are shown in Figure 17 [7]. Previous work includes the MFO [7], the cuckoo search algorithm (CS) [60], the generalized convex approximation (GCA) [61], the method of moving asymptotes (MMA) [61], and the symbiotic organism search (SOS) [62]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 10.

In Table 10, the CMSOA is better than the MFO, CS, GCA, MMA, and SOS. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.

4.4.4. Gear Train Design Problem

This is a gear ratio minimization problem with four variables, subject only to bound constraints. Figure 18 shows the schematic diagram [63]. Previous work includes the MFO [7], MVO [9], CS [60], artificial bee colony (ABC) [64], and the mine blast algorithm (MBA) [64]. In this paper, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 11.

In Table 11, the CMSOA proves better than the MFO, MVO, CS, ABC, and MBA, and better than the SCA, MVO, and SOA. Although the best fitness of the CMSOA is nominally worse than that of the SA_GA, GSA, and PSO, it reaches the theoretical best value, and the CMSOA finds a new solution. Therefore, the CMSOA can solve this problem.

4.4.5. Three-Bar Truss Design Problem

This is a weight minimization problem under stress, which has two variables and is subject only to bound constraints on the variables. The schematic diagram of the structure [63] is shown in Figure 19 [9].

Previous work includes the MFO [7], MVO [9], CS [60], MBA [64], and differential evolution with dynamic stochastic selection (DEDSS) [65]. In this paper, the problem is solved by the CMSOA, which is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 12.

In Table 12, except for the MVO and PSO, the CMSOA is better than the others. Although the best fitness of the CMSOA is nominally worse than that of the MVO and the PSO, it reaches the theoretical best value. Therefore, the CMSOA can solve this problem.

4.4.6. I-Beam Design Problem

This is a vertical deflection minimization problem with four variables and one constraint. Figure 20 shows the design diagram [7]. Previous work includes the MFO [7], CS [60], SOS [62], the adaptive response surface method (ARSM) [66], and the improved adaptive response surface method (IARSM) [66]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are provided in Table 13.

In Table 13, except for the MFO, GSA, SOA, and SA_GA, the CMSOA is better than the others. The fitness of the MFO is the best. Although the minimum vertical deflection found by the CMSOA is not as good as that of the GSA, SOA, and SA_GA, it is very close to those relative optima. Therefore, the CMSOA is an effective and feasible solution to the I-beam design optimization problem.

In brief, the CMSOA proves better than the other algorithms in most of these case studies, and it can solve such practical problems.

5. Conclusion

This article has presented the CMSOA, which combines a complex-valued encoding method with a multichain strategy. The CMSOA was tested in four stages from different perspectives: benchmark functions, PID controller parameters, and constrained engineering problems. In addition, the CMSOA was compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA.

In the first phase, the SOA was improved in six different ways (PCSOA, PAGTSOA, CCSOA, DSOA, CSOA, and CMSOA), and each improved algorithm was run on the fifteen benchmark functions. The result is that the CMSOA is feasible on the benchmark functions. In this phase, we considered the mean values, standard deviations, best fitness values, and best fitness ranks over 30 independent runs, the convergence curves of functions f1, f10, and f14, and the populations' search histories on those functions.

In the second phase, the fifteen benchmark function optimization problems were used to test the CMSOA further against the PSO, SA_GA, GSA, SCA, MVO, and SOA. The CMSOA is feasible on the benchmark functions. This phase again considered the mean values, standard deviations, best fitness values, best fitness ranks, convergence curves, and variance tests of the global minimum values over 30 independent runs. In the benchmark function optimization problems, the convergence curves of the CMSOA agree well with the theoretical optimal solutions, and the accuracy of the CMSOA is better. The ANOVA of the global best values shows that the CMSOA is the most robust algorithm. The complexity analysis shows that the CMSOA is an efficient algorithm. The run time comparison of the seven algorithms shows that the CMSOA takes relatively more running time and is not optimal in this respect. The exploration and exploitation analysis shows that the CMSOA maintains a good balance between the two as the number of iterations increases. From the performance profiles of the average solutions of the seven algorithms, the optimization probability of the CMSOA is the highest.

In the third test phase, four PID controller parameter optimization problems were used to test the CMSOA further in practice: a second-order model without time delay, a first-order model with a micro time delay, a first-order model with a significant time delay, and a high-order model without time delay. This phase considered the mean values, standard deviations, best fitness values, and best fitness ranks over 30 independent runs, the convergence curves, and the ANOVA. From the results of the PID parameter optimization problems, compared with the other six algorithms, the CMSOA is effective and feasible on practical problems.

Finally, six engineering problems tested the CMSOA further, against various algorithms. The results show that the CMSOA is a highly competitive algorithm for practical optimization problems.

According to the comparative analysis of the experiments, the conclusions are as follows:
(i) The complex-valued encoding and the multichain strategy enlarge each seeker's search region and help avoid convergence to local optima.
(ii) Among the six improved algorithms (PCSOA, PAGTSOA, CCSOA, DSOA, CSOA, and CMSOA), the CMSOA performed best in the benchmark function tests.
(iii) Among the seven algorithms (PSO, SA_GA, GSA, SCA, MVO, SOA, and CMSOA), the CMSOA has the highest optimization capability on the benchmark functions.
(iv) On the benchmark functions, the CMSOA has almost the same computational complexity as the SOA.
(v) The running time of the CMSOA on the benchmark functions is relatively long; among the seven compared algorithms, it is shorter only than that of the SA_GA.
(vi) The CMSOA can solve challenging real problems, such as PID controller parameter optimization problems and real constrained engineering optimization problems.
(vii) Further improvements and applications can be pursued in future studies.

Appendix

A. Welded Beam Design Problem

where the intermediate quantities τ(x), σ(x), δ(x), and Pc(x) are defined as in the standard formulation of the problem [9], with P = 6000 lb, L = 14 in, E = 30 × 10^6 psi, G = 12 × 10^6 psi, τmax = 13600 psi, σmax = 30000 psi, and δmax = 0.25 in.

B. Pressure Vessel Design Problem

C. Cantilever Design Problem

D. Gear Train Design Problem

E. Three-Bar Truss Design Problem

F. I-Beam Design Problem

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This study was supported by the National Natural Science Foundation of China (grant nos. 51766005 and 52166001) and Science and Technology Project of Yunnan Tobacco Company of China (grant no. 2019530000241019).