Abstract
This article proposes a complex-valued encoding multichain seeker optimization algorithm (CMSOA) for engineering optimization problems. A complex-valued encoding strategy and a multichain strategy are introduced into the seeker optimization algorithm (SOA). These strategies enhance the diversity of individuals, strengthen the local search, avoid falling into local optima, and are effective global optimization strategies. Fifteen benchmark functions, four proportional-integral-derivative (PID) control parameter models, and six constrained engineering problems are chosen as tests. The experimental results show that the CMSOA is applicable to benchmark functions, to PID control parameter optimization, and to constrained engineering optimization problems. Compared to particle swarm optimization (PSO), simulated annealing based on the genetic algorithm (SA_GA), the gravitational search algorithm (GSA), the sine cosine algorithm (SCA), the multiverse optimizer (MVO), and the seeker optimization algorithm (SOA), the CMSOA shows better optimization ability and robustness.
1. Introduction
Recently, heuristic algorithms have received much attention. Such algorithms use randomized methods to tackle many optimization problems. According to the no free lunch (NFL) theorem, no single optimization algorithm can solve all problems [1]. Therefore, researchers propose new algorithms or enhance current ones to deal with optimization problems. Current algorithms include the genetic algorithm (GA) [2], particle swarm optimization (PSO) [3], simulated annealing (SA) [4], harmony search (HS) [5], the gravitational search algorithm (GSA) [6], moth-flame optimization (MFO) [7], the sine cosine algorithm (SCA) [8], the multiverse optimizer (MVO) [9], the seeker optimization algorithm (SOA) [10], monarch butterfly optimization (MBO) [11], the slime mould algorithm (SMA) [12], the moth search algorithm (MSA) [13], hunger games search (HGS) [14], the Runge–Kutta method (RUN) [15], and Harris hawks optimization (HHO) [16].
However, some optimization algorithms still perform poorly on many optimization problems, suffering from premature convergence, low optimization precision, convergence to local optima, slow convergence speed, and insufficient robustness. To overcome these issues, several improved algorithms have proven feasible and have been applied in practical engineering. For instance, evolutionary algorithms have been improved by adaptive parameter control methods [17]. A simulated annealing algorithm based on the particle swarm algorithm was adopted to optimize the extraction of multiple tests [18]. A whale optimization algorithm based on a hybrid framework with learning and complementary strategies was applied to function optimization and engineering design problems [19]. A multilayered gravitational search algorithm was applied to function optimization and real-world problems [20]. An artificial bee colony algorithm was improved by scale-free networks [21]. A chaotic local-search-based differential evolution algorithm was applied to function optimization and real-world optimization problems [22].
Complex-valued encoding heuristic algorithms have also been proposed according to the characteristics of some algorithms. These complex-valued encoding intelligent optimization algorithms have proven to be feasible and have been used in practical engineering. For instance, a complex-valued encoding dragonfly algorithm optimized power systems [23]. A gray wolf optimization based on complex-valued encoding optimized a filter model [24]. A complex-valued encoding satin bowerbird optimization algorithm solved benchmark functions [25]. A complex-valued encoding driven optimization solved the 0-1 knapsack problem [26]. A complex-valued encoding symbiotic organism search algorithm was proposed for global optimization [27]. A complex-valued encoding flower pollination algorithm solved constrained engineering optimization problems [28]. A comprehensive survey of complex-valued encoding metaheuristic optimization algorithms has also been presented [24].
Dai et al. proposed the SOA in 2006 [29]; its goal is to mimic the behavior of human seekers, the way they exchange information, and thereby to solve practical optimization problems. Recently, the SOA has been used in many fields, such as unconstrained optimization problems [30], optimal reactive power dispatch [31], a challenging set of benchmark problems [32], the design of a digital filter [33], optimizing the parameters of artificial neural networks [34], optimizing the model and structure of fuel cells [35], the novel human group optimizer algorithm [36], and several practical applications [37].
In the initial stage of solving optimization problems, the SOA converges faster than other algorithms. However, when all individuals are near the best individual, they lose diversity and the algorithm falls into premature convergence.
To overcome the shortcomings of the SOA, various improvement strategies can be applied, such as the best empirical parameter strategy, the dynamic adaptive Gaussian variation of the empirical parameter, the Chebyshev chaos of order three, the real-coded double-chain strategy, the complex-valued encoding strategy, and the complex-valued encoding multichain strategy. After improving the SOA with each of these strategies and comparing the results experimentally, this paper combines the strategies with the best results. In this article, complex-valued encoding and a multichain strategy are used to enhance global optimization and local search, and we propose the complex-valued encoding multichain seeker optimization algorithm (CMSOA). The multichain strategy includes the complex-valued multichain and the stochastic complex multichain strategies. The CMSOA has been tested on fifteen benchmark functions, four PID control parameter optimization problems, and six engineering optimization problems taken from the literature. In comparison with the PSO, SA_GA, GSA, SCA, MVO, and SOA, the CMSOA finds better solutions, with better precision and robustness. The complex-valued encoding and multichain methods enhance the diversity of individuals and avert premature convergence, so the CMSOA overcomes the premature convergence of the SOA. The advantages of the CMSOA are summarized as follows:
(1) The CMSOA is proposed to enhance the precision and robustness of optimization.
(2) With the complex-coded multichain strategy, the real part, the imaginary part, and the real number of the complex-valued encoding are used as parallel individual variables to solve the objective function problem.
(3) The stochastic multichain strategy is introduced into the SOA. According to the initial solution generation rule of the complex-valued encoding, the real part, the imaginary part, and the real number are randomly generated as parallel individual variables to solve the objective function.
(4) The complex-coded strategy, the multichain strategy, and the stochastic multichain strategy improve the diversity of individuals, enhance local search, and avert premature convergence.
The rest of the article is organized as follows. Section 2 presents the SOA and the algorithm improvement strategies. Section 3 describes the CMSOA. Section 4 shows the algorithm optimization experiments, the results, and the analyses. Finally, Section 5 gives some conclusions.
2. The Basic SOA and Algorithm Improvement Strategies
The SOA is based on an in-depth study of human search behavior. It treats optimization as a search for an optimal solution by a search team in the search space, taking the search team as the population and a searcher's site as a candidate solution. An "experience gradient" determines the search direction, and uncertainty reasoning determines the search step size; the search direction and step size together update the searchers' positions in the search space to reach the optimal solution.
2.1. Key Update Points for SOA
The SOA has three main updating steps. In this section, i denotes the ith searcher and j denotes the dimension; s is the total number of individuals; D is the total number of dimensions of the variable; t denotes the current generation; and iter_{max} denotes the maximum number of generations. x_{ij}(t) and x_{ij}(t + 1), respectively, denote the searcher's position at generations t and (t + 1).
2.1.1. Search Direction
The forward direction of a search is defined by the experience gradient obtained from the individual's own movement and from evaluating the historical positions of other individuals. The egoistic direction, the altruistic direction, and the pre-emptive direction of the ith individual in any dimension can thus be obtained.
The searcher obtains the search direction by a random weighted average of these directions, where t_{1}, t_{2} ∈ {t, t − 1, t − 2}; x_{i}(t_{1}) and x_{i}(t_{2}) are the best positions of the ith searcher at generations t_{1} and t_{2}, respectively; g_{i,best} is the historical best position in the neighborhood of the ith searcher; p_{i,best} is the best position the ith searcher has found up to the current one; ψ_{1} and ψ_{2} are random numbers in [0, 1]; and ω is the inertia weight.
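The combination of the three empirical directions described above can be sketched as follows; the sign-based formulas are the standard SOA forms, and the function name and argument layout are illustrative:

```python
import numpy as np

def search_direction(x, p_best, g_best, x_t1, x_t2, omega, rng):
    """Random weighted average of the three empirical directions (a sketch).

    x      : current position of one seeker (1-D array)
    p_best : seeker's own best position (egoistic direction)
    g_best : neighbourhood best position (altruistic direction)
    x_t1, x_t2 : best positions at earlier generations t1, t2 (pre-emptive)
    omega  : inertia weight
    """
    d_ego = np.sign(p_best - x)      # egoistic direction
    d_alt = np.sign(g_best - x)      # altruistic direction
    d_pro = np.sign(x_t1 - x_t2)     # pre-emptive direction
    psi1, psi2 = rng.random(2)       # random weights psi_1, psi_2 in [0, 1]
    return np.sign(omega * d_pro + psi1 * d_ego + psi2 * d_alt)
```

Each component of the result is −1, 0, or +1, so the direction only tells the seeker which way to move along each dimension; the distance comes from the step size of the next subsection.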
2.1.2. Search Step Size
The SOA draws on the approximation ability of fuzzy reasoning. Through computer language, the SOA describes elements of natural human language and can thereby simulate human intelligent reasoning in search behavior. By expressing a simple fuzzy rule, the algorithm adapts to best approximate the objective optimization problem. A larger fitness corresponds to a larger search step size, while a smaller fitness corresponds to a smaller search step size. The Gaussian distribution function is adopted to describe the search step size, where α and δ are parameters of the membership function.
According to equation (3), the probability of the output variable exceeding [−3δ, 3δ] is less than 0.0111; therefore, μ_{min} = 0.0111. Normally, the optimal position of an individual has μ_{max} = 1.0 and the worst position has 0.0111. However, to accelerate convergence and give the optimal individual an uncertain step size, μ_{max} is set to 0.9 in this paper. The following function is selected as the fuzzy variable with a "small" objective function value, where μ_{ij} is determined by equations (4) and (5) and I_{i} is the rank of the current individual's sequence x_{i}(t) when sorted from high to low by function value. The function rand(μ_{i}, 1) returns a real number in the interval [μ_{i}, 1]. Equation (4) thus simulates the random search behavior of human beings. The step size in the jth dimension of the search space is determined by the parameter δ_{ij} of the Gaussian distribution function, where ω is the inertia weight, which decreases linearly from 0.9 to 0.1 as the generation number increases, and x_{min} and x_{max} are, respectively, the positions of the individuals with the minimum and maximum function values.
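The step-size computation described above can be sketched as follows. The inverse-membership form α_ij = δ_ij·sqrt(−ln μ_ij) and the linear interpolation of μ_i from the rank are the usual SOA choices; the function signature is illustrative:

```python
import numpy as np

def step_size(rank, s, x_best, x_worst, omega, rng,
              mu_max=0.9, mu_min=0.0111):
    """Fuzzy-reasoning step size of one seeker (a sketch of eqs. (4)-(8)).

    rank    : 1-based rank of the seeker when the population is sorted
              from best to worst objective value
    s       : population size
    x_best, x_worst : positions of the best and worst individuals
    omega   : linearly decreasing inertia weight
    """
    # Membership degree: the best individual gets mu_max, the worst mu_min.
    mu_i = mu_max - (rank - 1) / (s - 1) * (mu_max - mu_min)
    mu_ij = rng.uniform(mu_i, 1.0, size=x_best.shape)   # rand(mu_i, 1)
    delta_ij = omega * np.abs(x_best - x_worst)         # Gaussian parameter
    return delta_ij * np.sqrt(-np.log(mu_ij))           # inverse membership
```

Because μ_ij is drawn randomly from [μ_i, 1], even the best individual keeps an uncertain, nonzero step, which is exactly why μ_max is capped at 0.9 above.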
2.1.3. Individual Location Updates
After obtaining the search direction and search step size of the individual, the position update is represented by equation (9), where f_{ij}(t) and α_{ij}(t), respectively, represent the searcher's search direction and search step size at time t.
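A minimal sketch of this position update (equation (9)), assuming positions are clipped back into the search interval after each move (the clipping is an assumption, not stated in the equation itself):

```python
import numpy as np

def update_position(x, f, alpha, lower, upper):
    """Move each seeker along its direction f (in {-1, 0, 1}) by its
    step size alpha (>= 0), then clip back into [lower, upper]."""
    x_new = x + alpha * f
    return np.clip(x_new, lower, upper)
```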
2.2. The Algorithm Improvement Strategies
Five strategies for improving the algorithm are listed in this paper.
2.2.1. The Best Empirical Parameter Strategy
The first strategy changes an empirical parameter. In the basic SOA, equation (8) is changed to equation (10), and the empirical value C becomes a fixed empirical value. Through a large number of experimental tests, the empirical value is set to C = 0.2. The individual position update remains as in equation (9), where δ_{ij} is a parameter of the Gaussian membership function [38, 39] and x_{min} is the position of the individual with the minimum function value.
2.2.2. The Dynamic Adaptive Gaussian Variation of Empirical Parameter
In the SOA, equation (8) is changed to equation (11), and the empirical value C_{1} becomes adaptive, varying between 0.1 and 0.5 with the generation number according to equation (12). The individual position update remains as in equation (9), where δ_{ij} is a parameter of the Gaussian membership function [38, 39] and x_{min} is the position of the individual with the minimum function value.
2.2.3. The Chebyshev Chaos of Order Three
The Chebyshev map of order three is defined as follows: within its chaotic regime, x_{ij} is chaotic and ergodic and has orthogonality. In this case, no matter how close two initial values are, the sequences derived from repeated iteration are uncorrelated with each other.
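The order-3 Chebyshev map has the closed form x_{k+1} = cos(3·arccos(x_k)), equivalently the polynomial 4x³ − 3x, and stays in [−1, 1]. A short sketch:

```python
import math

def chebyshev3(x):
    """One step of the order-3 Chebyshev map:
    x_{k+1} = cos(3 * arccos(x_k)), chaotic and ergodic on [-1, 1]."""
    return math.cos(3.0 * math.acos(x))

def chaotic_sequence(x0, n):
    """Iterate the map n times starting from x0 in (-1, 1)."""
    seq = [x0]
    for _ in range(n - 1):
        seq.append(chebyshev3(seq[-1]))
    return seq
```

Two nearby initial values in (−1, 1) quickly produce uncorrelated sequences, which is the property the chaos-based improvement strategy exploits.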
2.2.4. The Multichain Strategy/the DoubleChain Strategy
The multichain strategy consists of taking the real and imaginary parts of the complex values as separate parallel solutions and of randomly generating further parallel solutions according to the complex-valued encoding law.
In this paper, the multichain strategy means that a single individual variable in the original SOA is converted into six parallel individual variables when the CMSOA optimizes a problem. In complex-valued encoding there are a real part X_{R}, an imaginary part X_{I}, and a real number X_{K}. In each iteration, X_{R}, X_{I}, and X_{K} are adjusted to satisfy the range of X (X_{min} = A_{k}, X_{max} = B_{k}) and are taken as parallel candidate solution variables to solve the objective function. Secondly, a group of variables X_{R_Random}, X_{I_Random}, and X_{K_Random}, randomly generated according to formulas (9)–(11) and satisfying the range of X (X_{min} = A_{k}, X_{max} = B_{k}), is added in each iteration and likewise used as candidate solution variables. At the end of each iteration, the respective optimal solutions are saved, and after comparing them, the global optimum is saved as the current optimal value. The next-generation solution variables of X_{R}, X_{I}, and X_{K} are updated according to formulas (13)–(15), while those of X_{R_Random}, X_{I_Random}, and X_{K_Random} are regenerated randomly according to formulas (9)–(11). In other words, a single individual variable X in the original SOA is converted into six individual variables X_{R}, X_{I}, X_{K}, X_{R_Random}, X_{I_Random}, and X_{K_Random} when solved by the CMSOA, as shown in Figure 1. So, instead of solving along one main chain, the algorithm solves along six parallel chains. This multichain strategy increases the diversity of individuals, enhances the local search, and averts premature convergence.
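The six-chain construction can be sketched as follows. The decoding of X_R and X_I from modulus and phase is the usual complex-encoding convention, and the function names are illustrative, not from the paper:

```python
import numpy as np

def multichain_candidates(rho, theta, lower, upper, rng):
    """Build the six parallel chains of one complex-encoded individual.

    rho, theta : modulus and phase-angle vectors of the complex encoding
    Returns X_R, X_I, X_K plus three randomly regenerated counterparts,
    each clipped to the search interval [lower, upper].
    """
    x_r = np.clip(rho * np.cos(theta), lower, upper)   # real-part chain
    x_i = np.clip(rho * np.sin(theta), lower, upper)   # imaginary-part chain
    x_k = np.clip(rho, lower, upper)                   # real-number chain
    # Three stochastic chains regenerated at every iteration.
    rand = [rng.uniform(lower, upper, size=rho.shape) for _ in range(3)]
    return [x_r, x_i, x_k] + rand

def best_of_chains(chains, objective):
    """Evaluate every chain and keep the best as the current optimum."""
    fitness = [objective(c) for c in chains]
    return chains[int(np.argmin(fitness))], min(fitness)
```

Evaluating six chains per individual instead of one is where the extra diversity comes from: even if the deterministic chains stagnate, the three random chains keep probing the interval.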
For the real-coded SOA, the real-number coding forms one chain, and a randomly generated real-number population forms another chain. Thus, a double chain is made up of a real-coded chain and a randomly generated real-number chain.
2.2.5. The Complex-Valued Encoding
(1) Initial Population Generation. Given the variable interval [A_{k}, B_{k}], k = 1, 2, …, 2s − 1, 2s, the modules ρ_{k}, the phase angles θ_{k}, and the complex numbers are generated [40] as follows:
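A sketch of this initialization, assuming the ranges commonly used in complex-valued encoding (moduli in [0, (B_k − A_k)/2], phase angles in [−2π, 2π]); the exact ranges of [40] may differ:

```python
import numpy as np

def init_complex_population(A, B, size, rng):
    """Generate an initial complex-encoded population for the interval
    [A_k, B_k]: random moduli and phase angles combined into complex
    numbers rho * e^(i*theta)."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    rho = rng.uniform(0.0, (B - A) / 2.0, size=(size, A.size))    # modules
    theta = rng.uniform(-2 * np.pi, 2 * np.pi, size=(size, A.size))
    return rho * np.exp(1j * theta)   # complex individuals
```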
(2) Individual Location Updates. The real part is updated as follows, where f_{R} represents the search direction of the real part, α_{R} is the search step size of the real part, and X_{R} represents the location of the real part.
The imaginary part is updated as follows, where f_{I} represents the search direction of the imaginary part, α_{I} is the search step size of the imaginary part, and X_{I} represents the location of the imaginary part.
(3) Fitness Evaluation Method. When calculating fitness values with the SOA, complex numbers are converted to real numbers as follows: (1) take the modulus of the complex number as the magnitude of the real number; (2) determine the sign according to the phase angle, where X_{k} is the resulting real number.
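The two conversion steps above can be sketched as follows. The sign-from-phase rule (sign of sin θ) and the shift by (B_k + A_k)/2 are forms commonly used in complex-valued encoding papers and are stated here as assumptions, since the paper's own equations are not reproduced:

```python
import numpy as np

def complex_to_real(z, A, B):
    """Convert complex variables to real ones before evaluating fitness:
    magnitude from the modulus, sign from the phase angle, then a shift
    into the variable interval [A_k, B_k] (assumed form)."""
    rho = np.abs(z)                         # step 1: modulus as magnitude
    sign = np.sign(np.sin(np.angle(z)))     # step 2: sign from phase angle
    return rho * sign + (np.asarray(B) + np.asarray(A)) / 2.0
```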
3. The CMSOA Process
The chromosomes of complex organisms can be regarded as double-stranded or multistranded structures. Since a complex value is made up of a real part and an imaginary part [26, 41–43], the complex value is represented as a double chain. A double chain represents a chromosome pair, and the individuals that make up the double chain have the same length. This two-strand framework enhances the diversity of individuals and gives the algorithm better search and computation capacity.
The CMSOA is based on a multipopulation evolution model: three populations evolve by the SOA, and three other populations evolve by random generation. The groups of individuals use information-sharing mechanisms to realize coevolution. Algorithm 1 shows the main procedure of the CMSOA.

4. Experimental Results
4.1. Experimental Setup
The algorithms in this paper were run under MATLAB R2016a on a computer with an Intel(R) Core(TM) i7-7500U CPU @ 2.70 GHz (up to 2.90 GHz) and 8 GB of memory, running Windows 10.
4.2. Algorithm Performance Comparison in Benchmark Functions
To ensure a fair comparison, the population size of every algorithm is 30 and the number of generations is 1000. To further ensure fairness and reduce the effect of randomness, the results of the seven algorithms over 30 independent runs were compared.
4.2.1. The Benchmark Functions
In this field, it is common to assess the capability of algorithms on mathematical functions whose global optima are known. Fifteen benchmark functions from the literature are used as the comparative test platform [7, 10, 44–46]. Table 1 shows the functions used in the experiment. The dimension of the variables is set to one thousand.
4.2.2. Algorithm Performance Comparison of the SOA with Different Improvement Methods
In this paper, the SOA is improved by six different methods: the parameter-changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on the Chebyshev chaos of order three (CCSOA), the SOA based on the real-coded double chain (DSOA), the SOA based on complex-valued encoding (CSOA), and the complex-valued encoding multichain seeker optimization algorithm (CMSOA).
(1) Parameter Setting of the SOA with Different Improvement Methods. This section introduces the parameter settings of the improved SOAs used in the experiments. Dai et al. have done extensive research on the parameter settings of the SOA [33], and we performed many practical tests and comparative studies of the parameters. The specific parameters of the improved SOAs are shown in Table 2. In the next section, we compare these improved SOAs experimentally and choose the relatively best one to compare with other advanced intelligent algorithms.
(2) Improved Algorithm Performance Comparison in the Benchmark Functions. The SOA is improved in the six ways listed above: the PCSOA, PAGTSOA, CCSOA, DSOA, CSOA, and CMSOA. To test their performance, each improved algorithm optimized the fifteen functions in Table 1, and each algorithm was run on each function independently 30 times. The performance of the SOA and the improved SOAs on the fifteen functions was compared by the mean (Mean), standard deviation (Std.), best fitness (Best), program running time (Time), and best fitness rank (Rank) over the 30 runs. The best fitness reflects the optimization accuracy of the algorithm, the mean and standard deviation reflect its robustness, and the running time reflects the computational cost. The results for functions f_{1}–f_{15} are displayed in Table 3, where values in bold italics indicate the best result.
Based on Table 3, for the benchmark functions f_{1}–f_{15}, the comparison of the six improved SOAs with the original SOA shows that the CMSOA achieves the best optimization results: its mean (Mean), standard deviation (Std.), best fitness (Best), and best fitness rank (Rank) over 30 independent runs are the best. Its total program running time for f_{1}–f_{15} ranks fifth among the seven algorithms compared, so the CMSOA runs longer than most of the other algorithms. From the perspective of optimization accuracy and robustness, however, the CMSOA has the best performance among the improved SOAs in this paper. Section 4.2.3 compares the CMSOA with other widely used intelligent optimization algorithms.
(3) Search History of the CMSOA. Figure 2 shows, for the optimized function f_{1}, the function graph, the convergence curves, the initial population's positions, and the search history; the positions visited by the seekers are marked with red + marks. Based on Figure 2, the convergence curve of the CMSOA on f_{1} is fast. The search history shows that the seekers of the CMSOA move extensively towards promising regions of the search space, exploring it with changing step sizes and directions over time; this increases the local search ability, escapes local optima, and avoids premature convergence. Similarly, Figures 3 and 4 show the corresponding function graphs, convergence curves, initial positions, and search histories for the functions f_{10} and f_{14}, respectively, where the CMSOA exhibits the same fast convergence and wide-ranging search behavior.
4.2.3. The Algorithm Performance Comparison of Different Algorithms in the Benchmark Functions
To test the performance of the CMSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA on the fifteen widely used benchmark functions [7, 10, 44–46] in Table 1.
(1) The Parameter Setting of Different Algorithms. In this section, the parameter settings of the PSO [47], SA_GA [48], GSA [6], SCA [8], MVO [9], SOA [29], and CMSOA are presented. Following references [6, 8, 23, 29, 47, 48], we performed many practical tests and comparative studies for the parameter settings. The parameters of the seven algorithms are chosen from practical experience. Table 4 lists the parameters used in the test.
(2) The Result Comparison of Different Algorithms in Benchmark Functions. This section uses the same fifteen functions as in Table 1, with the dimension of the variables expanded to 1000. The mean values, standard deviations, best fitness values, and best fitness ranks of the algorithms over 30 independent runs for functions f_{1}–f_{15} are shown in Table 5, where values in bold italics indicate the best result.
For the benchmark functions f_{1}–f_{15}, based on Table 5, the best fitness of the CMSOA is better than that of the other algorithms except on f_{4}, f_{7}, f_{9}, f_{10}, f_{11}, f_{14}, and f_{15}. On f_{9}, the best value of the CMSOA reaches the theoretical optimum, although it is inferior to those of the PSO and GSA. On f_{7}, the best value of the CMSOA is worse only than that of the PSO. On f_{4}, it is worse than those of the PSO, GSA, and SOA. On f_{15}, it is worse only than that of the PSO; on f_{14}, worse than those of the PSO and SOA; on f_{11}, worse only than that of the SCA; and on f_{10}, worse than those of the SOA and MVO.
For the benchmark functions f_{1}–f_{15}, based on Table 5, the standard deviation of the CMSOA is better than that of the other algorithms except on f_{2}, f_{3}, f_{4}, f_{7}, f_{9}, f_{10}, f_{11}, f_{12}, and f_{14}. On f_{9}, the standard deviation of the CMSOA is worse only than that of the PSO; on f_{7}, worse than those of the PSO, GSA, and SOA; on f_{4}, worse than those of the SCA, GSA, SA_GA, PSO, and MVO; on f_{3}, worse than those of the MVO and SOA; and on f_{2}, worse only than that of the SOA. On f_{2}, the PSO, SA_GA, and SCA found no solution, and the GSA and MVO have an infinite standard deviation, so the CMSOA is better than these. On f_{14}, the CMSOA is worse than the PSO and GSA. On f_{11} and f_{12}, it is worse than the PSO, SA_GA, GSA, MVO, and SOA, and on f_{10} it is worse than the PSO, SA_GA, GSA, SCA, and MVO.
For the benchmark functions f_{1}–f_{15}, based on Table 5, the mean value of the CMSOA is better than that of the other algorithms except on f_{3}, f_{4}, f_{7}, f_{9}, f_{10}, and f_{14}. On f_{9}, the mean of the CMSOA reaches the theoretical optimum, although it is worse than that of the PSO. On f_{7}, the mean of the CMSOA is worse than that of the PSO; on f_{4}, worse than those of the PSO, GSA, and SOA; and on f_{3}, worse only than those of the GSA and SOA. On f_{10}, the CMSOA is worse than the MVO and SOA, and on f_{14} it is worse only than the PSO.
According to the best-fitness mean rank and the overall rank results in Table 5, the CMSOA finds good solutions and has strong optimization ability and robustness on the benchmark functions.
(3) Convergence Curve Comparison of Algorithms in the Benchmark Functions. Figure 5 shows the curves of the best fitness for the benchmark functions f_{1}–f_{15} (D = 1000). As seen from Figure 5, compared with the other six algorithms, the CMSOA converges faster and more precisely, except on f_{4}, f_{7}, f_{9}, f_{10}, f_{14}, and f_{15}. Although on f_{9} the CMSOA is worse than the PSO in convergence and precision, it still reaches the theoretical optimum; on f_{7}, the CMSOA is worse only than the PSO, and on f_{4} it is worse than the PSO, GSA, and SOA. On f_{15}, the CMSOA is worse only than the SOA in convergence and precision; on f_{14}, worse than the SOA and PSO; and on f_{10}, worse than the MVO and SOA. Because the multichain strategy augments the diversity of individuals and the intensity of the local search, the CMSOA has better optimization performance.
(4) ANOVA Test Comparison of Algorithms in Benchmark Functions. Figure 6 shows the ANOVA of the global best values for the benchmark functions f_{1}–f_{15} (D = 1000). As seen from Figure 6, the CMSOA is the most robust algorithm except on f_{3}, f_{4}, f_{7}, f_{10}, f_{12}, and f_{14}. On f_{7}, the ANOVA result of the CMSOA is worse only than that of the PSO; on f_{4}, worse than those of the PSO, GSA, and SOA; and on f_{3}, worse only than those of the GSA and SOA. On f_{10}, the result of the CMSOA is worse only than that of the MVO, and on f_{12} and f_{14} it is worse than those of the PSO and SOA. Overall, the CMSOA shows better robustness.
4.2.4. Complexity Analysis
The computational complexity of the basic SOA is O(N·D·M), where N is the number of individuals, D is the number of dimensions, and M is the maximum number of generations. The first phase of the SOA has complexity O(N·D·M); the complex-valued encoding strategy adds O(N·D·M); and the multichain strategy adds another O(N·D·M). So the overall complexity of the CMSOA is O(N·D·M + N·D·M + N·D·M). By the rules of big-O notation [49], if the number of generations is large (M ≫ N, D), this simplifies to O(N·D·M). Therefore, the overall computational complexity of the CMSOA is almost the same as that of the basic SOA.
4.2.5. Run Time Comparison of Algorithms in Benchmark Functions
In this section, we recorded the running time of each algorithm under the same conditions: a population size of 30, 1000 generations, and 30 independent runs on the fifteen benchmark functions f_{1}–f_{15} (D = 1000). The running times of the fifteen functions were then summed to obtain each algorithm's total over the 30 independent runs, together with the ranking of the total time, as shown in Table 6. As seen from Table 6, the PSO has the shortest running time, followed by the SCA; the CMSOA ranks sixth, with a relatively long running time; and the SA_GA takes the longest.
To learn more about the running times of the seven algorithms on the fifteen functions, a bar chart (Figure 7) was made of each algorithm's total time over the 30 independent runs. From Figure 7, the running time of the PSO is the shortest, that of the SA_GA is the longest, and the running time of the CMSOA, although relatively large, is less than half that of the SA_GA.
4.2.6. Exploration and Exploitation in Benchmark Functions
According to [50–52], formulas (22)–(25) measure the exploration and exploitation capability of an algorithm, where median x^{j} is the median of dimension j over the whole swarm, x_{i}^{j} is the jth dimension of swarm individual i, n is the size of the swarm, Div_{j} is the average distance over all individuals in dimension j, Div is the diversity of the swarm in an iteration, Div_{max} is the maximum diversity over all iterations, and Xpl% and Xpt% are, respectively, the exploration and exploitation percentages for an iteration.
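The diversity-based measurement just described can be sketched as follows, assuming the dimension-wise median diversity of [50–52]; the function name and input layout are illustrative:

```python
import numpy as np

def exploration_exploitation(history):
    """Sketch of formulas (22)-(25): per-iteration exploration (Xpl%) and
    exploitation (Xpt%) percentages from a run's population history.

    history : list of (n, D) population matrices, one per iteration.
    """
    divs = []
    for pop in history:
        # Div_j: mean distance of individuals to the median of dimension j.
        div_j = np.mean(np.abs(np.median(pop, axis=0) - pop), axis=0)
        divs.append(np.mean(div_j))                  # Div: average over j
    divs = np.array(divs)
    div_max = divs.max()                             # Div_max over all iters
    xpl = 100.0 * divs / div_max                     # exploration percentage
    xpt = 100.0 * np.abs(divs - div_max) / div_max   # exploitation percentage
    return xpl, xpt
```

A spread-out population gives Xpl% near 100 (exploring); a collapsed one gives Xpt% near 100 (exploiting), which is what the balance curves in Figure 8 plot.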
Figure 8 shows how the exploration and exploitation abilities of the CMSOA change as the number of iterations increases on the benchmark functions f_{1}–f_{15}. As observed from the curves in Figure 8, the CMSOA maintains a good balance between the exploration and exploitation ratios throughout the run.
4.2.7. Performance Profiles of Algorithms in Benchmark Functions
The average fitness was selected as the capability index. The algorithmic capability is expressed through performance profiles, which are calculated by the following formulas, where g represents an algorithm, G is the algorithm set, f represents a function, F is the function set, n_{g} is the number of algorithms in the experiment, n_{f} is the number of functions in the experiment, t_{f,g} is the average fitness after algorithm g solves function f, r_{f,g} is the capability ratio, ρ_{g}(τ) is the algorithmic capability, and τ is a factor of the best probability [53].
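As an illustration, the performance-profile computation can be sketched in Python (a minimal sketch following the Dolan–Moré construction that these formulas describe; the matrix `T` of average fitness values is an assumed input):

```python
import numpy as np

def performance_profile(T, taus):
    """Performance profiles over a set of functions and algorithms.

    T: (n_f, n_g) array; T[f, g] is the average fitness of algorithm g
       on function f (lower is better, values assumed positive).
    taus: sequence of tau factors.
    Returns rho: (len(taus), n_g); rho[k, g] is the fraction of functions
    on which algorithm g's ratio r_{f,g} is within taus[k] of the best.
    """
    r = T / T.min(axis=1, keepdims=True)              # capability ratio r_{f,g}
    rho = np.array([(r <= tau).mean(axis=0) for tau in taus])
    return rho
```

Plotting each column of `rho` against τ (often on a log₂ axis, as in Figure 9) gives the capability curves.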
Figure 9 shows the capability ratios of the mean fitness for the seven algorithms on the benchmark functions f_{1}–f_{15} (D = 1000). The results are displayed on a log_{2} scale. As shown in Figure 9, the CMSOA has the highest probability. When τ = 1, the CMSOA reaches about 0.6, better than the others. When τ = 4, the CMSOA reaches about 0.87, the PSO 0.53, the SOA 0.40, and the GSA, MVO, SCA, and SA_GA each 0.067. When τ = 12, the CMSOA is 0.87, the PSO 0.73, the SOA 0.80, the GSA 0.33, the MVO 0.33, the SCA 0.27, and the SA_GA 0.2. The capability curve of the CMSOA lies above the others, and the CMSOA reaches about 0.87 when τ ≥ 4. Thus, the performance of the CMSOA is better than that of the other algorithms.
4.3. Algorithm Performance Comparison in PID Controller Parameter Optimization Problems
In this section, we use four test control system models to evaluate the capability of the CMSOA in optimizing PID parameters. For the first group of models, the population size of all algorithms is 20 and the maximum number of generations is 20; the step response time is set to 10 s for one model and 30 s for another. For the remaining model, the population size of all algorithms is 50, the maximum number of generations is 50, and the step response time is set to 50 s.
4.3.1. Control System Models
Equations (28)–(31) give the test control system models used in our experiments on PID parameter optimization. Figure 10 shows the process diagram for optimizing the PID parameters of the test control systems with the CMSOA. Figure 11 shows the structure of the PID parameter optimization model for the test control systems.
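Since equations (28)–(31) are not reproduced here, the following sketch only illustrates how a fitness evaluation of this kind can be implemented: it assumes an illustrative second-order plant G(s) = 1/(s² + s + 1) and an ITAE cost, integrated with a simple Euler scheme; the paper's actual models and cost function may differ.

```python
def pid_itae(kp, ki, kd, t_end=10.0, dt=0.001):
    """ITAE fitness of a PID loop on an assumed plant y'' + y' + y = u,
    tracking a unit step reference."""
    n = int(t_end / dt)
    y = yd = integ = 0.0
    prev_e = 1.0                # reference is a unit step, so e(0) = 1 - 0
    itae = 0.0
    for k in range(n):
        t = k * dt
        e = 1.0 - y
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv   # PID control law
        ydd = u - yd - y                       # plant dynamics
        yd += ydd * dt                         # Euler step
        y += yd * dt
        itae += t * abs(e) * dt                # integral of t*|e|
        prev_e = e
    return itae
```

An optimizer such as the CMSOA would then search over (kp, ki, kd) to minimize this fitness, exactly as in Figure 10.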
4.3.2. Result Comparison of Algorithms in PID Controller Parameter Optimization
To test the capability of the CMSOA, it is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA on the PID controller parameter optimization problems. The mean values, standard deviation values, best fitness values, and best fitness value ranks of the algorithms over 30 independent runs for the four models are displayed in Table 7. Values in bold and italics indicate the better optimal results.
For the PID controller parameter optimization problems, according to Table 7, the CMSOA achieves the best fitness on all but two models: its optimal fitness value is worse than that of the SA_GA algorithm on one model and worse than that of the PSO algorithm on model g4. In terms of standard deviation, the CMSOA is better than the other algorithms on all but two models, where it is worse only than the SA_GA. In terms of mean, the CMSOA is better than the others. According to the optimal fitness value mean rank and the overall rank results in Table 7, the CMSOA can find good solutions and has very strong robustness on the PID controller parameter optimization problems.
4.3.3. The Convergence Curve Comparison of Algorithms in PID Controller Parameter Optimization
Figure 12 shows the fitness curves of PID controller parameter optimization for the four models. As shown in Figure 12, compared with the other six algorithms, the CMSOA converges fastest and achieves the best precision; it can find the optimal value.
4.3.4. ANOVA Test Comparison of Algorithms in the PID Controller Parameter Optimization
Figure 13 shows the ANOVA of the global best values of PID controller parameter optimization for the four models. As seen from Figure 13, the CMSOA is the most robust of the compared algorithms.
4.3.5. The Unit Step Functions of PID Controller Parameter Optimization
Figure 14 shows the unit step responses of the PID controllers optimized for the four models. As seen from Figure 14, when the CMSOA optimizes the PID controller parameters, the unit step responses stabilize quickly and accurately.
Therefore, the CMSOA is an effective and feasible PID parameter optimization solution for the control system model.
4.4. Algorithm Performance Comparison in Constrained Engineering Optimization Problems
We use six engineering problems to test the capability of the CMSOA further. These problems are very popular in the literature. A penalty function is used to handle the constraints. The parameter settings for all of the heuristic algorithms still follow Table 4 of Section 4.2.3. The formulations of these problems are available in the Appendix.
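The penalty-function treatment of constraints mentioned above can be sketched as follows (a minimal static-penalty scheme; the paper does not specify its penalty form or coefficients, so the quadratic penalty and the value of `lam` are assumptions):

```python
def penalized(f, gs, x, lam=1e6):
    """Static penalty: add lam * sum(max(0, g_i(x))^2) for violated
    inequality constraints g_i(x) <= 0, so an unconstrained optimizer
    can be applied to the constrained problem."""
    violation = sum(max(0.0, g(x)) ** 2 for g in gs)
    return f(x) + lam * violation
```

Feasible points are unaffected, while infeasible points are pushed away from the optimum in proportion to their constraint violation.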
4.4.1. Welded Beam Design Problem
This is a minimum fabrication cost problem with four parameters and seven constraints. The parameters of the structural system are shown in Figure 15 [9]. Previous work on this problem includes the GSA [6], MFO [7], MVO [9], coevolutionary particle swarm optimization (CPSO) [54], and harmony search (HS) [55]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 8.
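For reference, the cost function of the widely used welded beam formulation (as in [9] and the cited studies) can be written as a short sketch; the variable names follow the common (h, l, t, b) convention, and the seven constraints are omitted here:

```python
def welded_beam_cost(x):
    """Fabrication cost of the standard welded beam formulation
    (weld thickness h, weld length l, bar height t, bar thickness b)."""
    h, l, t, b = x
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)
```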
In Table 8, the CMSOA is better than the GSA, MFO, MVO, GA, CPSO, and HS algorithms. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.
4.4.2. Pressure Vessel Design Problem
This is also a minimum fabrication cost problem, with four parameters and four constraints. The parameters of the structural system are shown in Figure 16 [9]. Previous work on this problem includes the MFO [7], evolution strategies (ESs) [56], differential evolution (DE) [57], ant colony optimization (ACO) [58], and the GA [59]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 9.
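A sketch of the widely used pressure vessel formulation follows (cost and the four inequality constraints, each required to satisfy g ≤ 0; the coefficients are those of the standard formulation in the cited literature):

```python
import math

def pressure_vessel(x):
    """Cost and constraints of the standard pressure vessel formulation
    (shell thickness Ts, head thickness Th, inner radius R, length L)."""
    ts, th, r, l = x
    cost = (0.6224 * ts * r * l + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l + 19.84 * ts ** 2 * r)
    g = [
        -ts + 0.0193 * r,                                     # g1 <= 0
        -th + 0.00954 * r,                                    # g2 <= 0
        -math.pi * r ** 2 * l - (4 / 3) * math.pi * r ** 3 + 1296000,
        l - 240,                                              # g4 <= 0
    ]
    return cost, g
```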
In Table 9, the CMSOA is better than the MFO, ES, DE, ACO, and GA. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.
4.4.3. Cantilever Beam Design Problem
This problem is a weight minimization over five parameters, with one functional constraint in addition to the variable bounds. The parameters of the structural system are shown in Figure 17 [7]. Previous work on this problem includes the MFO [7], cuckoo search algorithm (CS) [60], generalized convex approximation (GCA) [61], method of moving asymptotes (MMA) [61], and symbiotic organism search (SOS) [62]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 10.
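The common cantilever beam formulation from the cited literature can be sketched as follows (weight objective and single inequality constraint, g ≤ 0):

```python
def cantilever(x):
    """Weight and constraint of the common cantilever beam formulation
    (five hollow-block heights x1..x5)."""
    weight = 0.0624 * sum(x)
    g = (61 / x[0] ** 3 + 37 / x[1] ** 3 + 19 / x[2] ** 3
         + 7 / x[3] ** 3 + 1 / x[4] ** 3) - 1            # g <= 0
    return weight, g
```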
In Table 10, the CMSOA is better than the MFO, CS, GCA, MMA, and SOS. The CMSOA is also better than the PSO, SA_GA, GSA, SCA, MVO, and SOA. Therefore, the CMSOA is an effective and feasible solution to the problem.
4.4.4. Gear Train Design Problem
This is a gear ratio minimization problem with four integer variables, constrained only by the variable bounds. Figure 18 shows the schematic diagram [63]. Previous work on this problem includes the MFO [7], MVO [9], CS [60], artificial bee colony (ABC) [64], and mine blast algorithm (MBA) [64]. In this paper, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 11.
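The standard gear train objective from the cited literature is compact enough to state in full: the squared error between the target ratio 1/6.931 and the ratio produced by four integer teeth counts.

```python
def gear_train(x):
    """Squared error between the target gear ratio 1/6.931 and the
    ratio produced by integer teeth counts (Ta, Tb, Td, Tf)."""
    ta, tb, td, tf = x
    return (1.0 / 6.931 - (tb * td) / (ta * tf)) ** 2
```

The best known solutions drive this error down to roughly 2.7 × 10⁻¹².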
In Table 11, the CMSOA proves to be better than the MFO, MVO, CS, ABC, and MBA. Among the compared algorithms, the CMSOA is better than the SCA, MVO, and SOA, although its optimal fitness value is worse than that of the SA_GA, GSA, and PSO. Nevertheless, the optimal fitness value of the CMSOA reaches the theoretical best value, and the CMSOA finds a new solution. Therefore, the CMSOA can solve this problem.
4.4.5. Three-Bar Truss Design Problem
This is a weight minimization problem under stress constraints, with two variables. The schematic diagram of the components [63] is shown in Figure 19 [9].
Previous work on this problem includes the MFO [7], MVO [9], CS [60], MBA [64], and differential evolution with dynamic stochastic selection (DEDSS) [65]. In this paper, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 12.
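The common three-bar truss formulation from the cited literature can be sketched as follows (volume objective and the three stress constraints, each required to satisfy g ≤ 0; l, P, and σ take their usual values):

```python
import math

def three_bar_truss(x, l=100.0, P=2.0, sigma=2.0):
    """Volume and stress constraints of the common three-bar truss
    formulation (cross-section areas x1, x2)."""
    x1, x2 = x
    volume = (2 * math.sqrt(2) * x1 + x2) * l
    denom = math.sqrt(2) * x1 ** 2 + 2 * x1 * x2
    g = [
        (math.sqrt(2) * x1 + x2) / denom * P - sigma,  # g1 <= 0
        x2 / denom * P - sigma,                        # g2 <= 0
        1.0 / (math.sqrt(2) * x2 + x1) * P - sigma,    # g3 <= 0
    ]
    return volume, g
```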
In Table 12, the CMSOA is better than all compared algorithms except the MVO and PSO. Although its optimal fitness value is worse than that of the MVO and the PSO, it reaches the theoretical best value. Therefore, the CMSOA can solve this problem.
4.4.6. I-Beam Design Problem
This is a vertical deflection minimization problem with four variables and one constraint. Figure 20 shows the design diagram [7]. Previous work on this problem includes the MFO [7], CS [60], SOS [62], the adaptive response surface method (ARSM) [66], and the improved adaptive response surface method (IARSM) [66]. For this problem, the CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA; the best obtained values are given in Table 13.
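The common I-beam formulation from the cited literature can be sketched as follows (deflection objective and the cross-section area constraint, g ≤ 0; the load and length constants are folded into the numerator as in the standard statement):

```python
def i_beam(x):
    """Vertical deflection and area constraint of the common I-beam
    formulation (height h, flange width b, web thickness tw,
    flange thickness tf)."""
    h, b, tw, tf = x
    inertia_term = (tw * (h - 2 * tf) ** 3 / 12
                    + b * tf ** 3 / 6
                    + 2 * b * tf * ((h - tf) / 2) ** 2)
    deflection = 5000.0 / inertia_term
    g = 2 * b * tf + tw * (h - 2 * tf) - 300   # cross-section area <= 300
    return deflection, g
```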
In Table 13, the CMSOA is better than all compared algorithms except the MFO, GSA, SOA, and SA_GA. The MFO achieves the best fitness. Although the smallest vertical deflection found by the CMSOA is not as good as that of the GSA, SOA, and SA_GA, it is very close to those relative optima. Therefore, the CMSOA is an effective and feasible solution to the I-beam design optimization problem.
In brief, the CMSOA proves to be better than the other algorithms on most of these practical problems. The CMSOA can solve such practical problems.
5. Conclusion
A CMSOA is presented, combining a complex-valued encoding method with a multichain strategy. The CMSOA was tested in four stages from different perspectives: benchmark functions, PID control parameters, and constrained engineering problems. In addition, the CMSOA was compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA.
In the first phase, the SOA is improved in six different ways: the parameter-changing SOA (PCSOA), the parameter adaptive Gaussian transform SOA (PAGTSOA), the SOA based on the Chebyshev chaos of order three (CCSOA), the SOA based on real-coding double-link (DSOA), the SOA based on complex-valued encoding (CSOA), and the complex-valued encoding multichain seeker optimization algorithm (CMSOA). Each improved algorithm was applied to the fifteen benchmark functions. The result is that the CMSOA is feasible on the benchmark functions. In this phase, we consider the mean values, standard deviation values, best fitness values, and best fitness value ranks over 30 independent runs, the convergence curves of functions f_{1}, f_{10}, and f_{14}, and the population's position search history on functions f_{1}, f_{10}, and f_{14}.
In the second phase, fifteen benchmark function optimization problems are used to test the CMSOA further. The CMSOA is compared with the PSO, SA_GA, GSA, SCA, MVO, and SOA for verification, and it proves feasible on the benchmark functions. This phase also covers the mean values, standard deviation values, best fitness values, and best fitness value ranks over 30 independent runs, the convergence curves, and the variance tests for the global minimum values. On the benchmark function optimization problems, the optimal solution curves obtained by the CMSOA agree well with the theoretical optimal solution curves, and the accuracy of the CMSOA is higher. The ANOVA of the global best values on the benchmark functions shows that the CMSOA is the most robust algorithm. The complexity analysis shows that the CMSOA is an efficient algorithm. The run time comparison of the seven algorithms on the benchmark functions shows that the CMSOA has a relatively long running time and is not optimal in this respect. The exploration and exploitation abilities of the CMSOA on the benchmark functions are studied, and the CMSOA maintains a good balance between them as the number of iterations increases. From the performance ratios of the average solution for the seven algorithms, the optimization probability of the CMSOA is the highest.
In the third test phase, four PID control parameter optimization problems are used to test the CMSOA further in practice: a parameter optimization model of a second-order PID controller without time delay, a PID controller parameter optimization model for a first-order plant with a small time delay, a parameter optimization model of a first-order PID controller with a significant time delay, and a parameter optimization model of a high-order PID controller without time delay. This phase also considered the mean values, standard deviation values, best fitness values, and best fitness value ranks over 30 independent runs, the convergence curves, and the ANOVA. From the results of the PID parameter optimization problems, compared with the other six algorithms, the CMSOA is effective and feasible on these practical problems.
Finally, in the last test phase, six engineering problems further tested the CMSOA, which was compared with various algorithms. The results show that the CMSOA is a highly competitive algorithm for practical optimization problems.
According to the comparative analysis of the experiments, the conclusions are as follows:
(i) We use the complex-valued encoding and the multichain strategy for each seeker to enlarge the search region and avoid convergence to a local optimum.
(ii) Among the six improved algorithms (PCSOA, PAGTSOA, CCSOA, DSOA, CSOA, and CMSOA), the CMSOA performed best in the benchmark function tests.
(iii) Among the seven algorithms (PSO, SA_GA, GSA, SCA, MVO, SOA, and CMSOA), the CMSOA has the highest optimization capability on the benchmark functions.
(iv) On the benchmark functions, the CMSOA has almost the same computational complexity as the SOA.
(v) The running time of the CMSOA on the benchmark functions is relatively long; among the seven compared algorithms, it is shorter only than that of the SA_GA.
(vi) The CMSOA can solve challenging real-world problems, such as the PID control parameter optimization problems and the constrained engineering optimization problems.
(vii) Further improvements and applications can be explored in future studies.
Appendix
A. Welded Beam Design Problem
where P = 6000 lb, L = 14 in, E = 30 × 10^{6} psi, G = 12 × 10^{6} psi, τ_{max} = 13,600 psi, σ_{max} = 30,000 psi, and δ_{max} = 0.25 in.
B. Pressure Vessel Design Problem
C. Cantilever Design Problem
D. Gear Train Design Problem
E. Three-Bar Truss Design Problem
F. I-Beam Design Problem
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.
Acknowledgments
This study was supported by the National Natural Science Foundation of China (grant nos. 51766005 and 52166001) and Science and Technology Project of Yunnan Tobacco Company of China (grant no. 2019530000241019).