Abstract

To solve the premature convergence problem of the standard chicken swarm optimization (CSO) algorithm in dealing with multimodal optimization problems, an improved chicken swarm optimization (ICSO) algorithm is proposed by referring to the ideas of the bacterial foraging algorithm (BFA) and the particle swarm optimization (PSO) algorithm. First, in order to improve the depth search ability of the algorithm, considering that the chicks have the weakest optimization ability in the whole chicken swarm, the replication operation of BFA is introduced: in the role update process of the chicken swarm, the chicks are replaced by the same number of chickens with the strongest optimization ability. Then, to maintain the population diversity, the elimination-dispersal operation is introduced to disperse the chicks that have performed the replication operation to random positions in the search space with a certain probability. Finally, the PSO algorithm is integrated to improve the global optimization ability of the algorithm. The experimental results on the CEC2014 benchmark function test suite show that the proposed algorithm performs well on most test functions, and its optimization accuracy and convergence performance are also better than those of BFA, the artificial fish swarm algorithm (AFSA), the genetic algorithm (GA), and the PSO algorithm. In addition, the ICSO algorithm is also applied to the welded beam design problem, and the experimental results indicate that the proposed algorithm has obvious advantages over the other comparison algorithms. Its disadvantage is that it is not suitable for dealing with large-scale optimization problems.

1. Introduction

The multimodal optimization problem is a complex function optimization problem with multiple local extrema [1]. In practical applications, there are many multimodal optimization problems, such as parameter estimation and identification of models [2, 3], engineering structure optimization, welded beam design [4], and medical diagnosis [5]. It is difficult to find the global optimum because there are many local extrema in a multimodal optimization problem. Therefore, it is of great importance to study efficient and reasonable algorithms [6, 7].

The swarm intelligence optimization algorithm is a kind of bionic random search algorithm which can solve complex optimization problems by imitating ecosystem mechanisms in nature. Because of its strong global search ability, high efficiency, and insensitivity to initial values, it has attracted wide attention from researchers [8–10]. At present, hundreds of algorithms based on swarm intelligence have emerged, such as the genetic algorithm (GA) [11], the particle swarm optimization (PSO) algorithm [12], the bacterial foraging algorithm (BFA) [13], and the artificial fish swarm algorithm (AFSA) [14]. In a swarm intelligence optimization algorithm, no individual exercises centralized control, and the interaction between individuals is extremely simple, yet the swarm can solve complex problems in a short time, which makes such algorithms very suitable for solving complex optimization problems in practice. Moreover, the swarm intelligence optimization algorithm does not need the gradient information of the optimization problem, so it belongs to the class of nongradient optimization algorithms. Therefore, it has a wide range of applications [15, 16]. At present, it has been applied in optimization calculation [17, 18], workshop scheduling [19, 20], image engineering [21], network structure optimization [22], the vehicle routing problem [23], control of teleoperating systems, and other fields [24].

As a kind of swarm intelligent algorithm, the chicken swarm optimization (CSO) algorithm was proposed by Meng et al. in 2014 [25]. Due to its good stability and global search ability, the algorithm has attracted extensive attention since it was proposed and has been successfully applied in some fields [26, 27]. Liang et al. combined the CSO algorithm with the pulse-coupled neural network for image segmentation. The experimental results show that this method has obvious advantages in convergence speed and segmentation accuracy [28]. Cristin et al. combined the CSO algorithm with the deep neural network for the classification of brain tumors and achieved good performance in terms of accuracy, specificity, sensitivity, and so on [29].

Although the CSO algorithm has shown good performance in solving many benchmark problems and practical problems, it also has some inherent shortcomings, such as premature convergence and falling into local extrema. Therefore, researchers have proposed many improved algorithms. At present, the improvements of the CSO algorithm can be classified into three main categories:

(1) Combination with other swarm intelligent algorithms. For example, Li et al. improved the search ability of the CSO algorithm by integrating the operations of the gray wolf optimization algorithm and the PSO algorithm into the CSO algorithm. The experimental results show that the algorithm is superior to other basic swarm intelligent algorithms in accuracy, convergence speed, etc. [30]. Sampath fine-tuned the solutions of the CSO algorithm by introducing the differential evolution algorithm to avoid premature convergence and applied the proposed algorithm to solve routing problems [31].

(2) Modification of the position update formulas of the chicken swarm. For example, Gu et al. improved the CSO algorithm by introducing an adaptive update factor and designing the inertia weight and successfully applied it to parameter estimation of the Richards model [32]. Considering that the imbalance between the diversification and intensification of the population may affect the performance of the CSO algorithm, Lin et al. proposed an improved CSO algorithm by modifying the position update formulas of the roosters and hens. Experiments show that this algorithm has obvious advantages over other swarm intelligent algorithms in terms of optimization accuracy and robustness [33].

(3) Multiobjective CSO algorithms. In order to solve the configuration problem of electric vehicle charging stations, Deb et al. proposed a new hybrid multiobjective CSO algorithm based on Pareto optimization. The effectiveness of the algorithm is verified on some multiobjective benchmark problems and the electric vehicle charging station configuration problem [34].

The aforementioned improved swarm intelligence optimization algorithms raise performance to a certain extent, but some disadvantages remain. For example, the literature [18] introduced the differential evolution strategy and quantum behavior into the bird swarm algorithm; although the convergence speed was enhanced, the problem of premature convergence still existed. In the literature [30], several improvement factors were introduced into the position update formulas of the chicken swarm, which improved the ability of the algorithm to jump out of local extrema to a certain degree, but the optimization accuracy still needs further improvement.

To solve the abovementioned problems, in this paper, an improved chicken swarm optimization (ICSO) algorithm is proposed which combines the idea of PSO with the replication and elimination-dispersal operations of BFA. More specifically, our contributions are as follows:

(1) The replication operation of BFA is applied to the chicks, which have the weakest optimization ability, so that they inherit the optimal food sources in the whole chicken swarm. This helps enhance the depth search ability of the algorithm.

(2) The chicks are dispersed to random positions in the search space with a certain probability by using the elimination-dispersal operation of BFA, which is beneficial to improving the population diversity.

(3) The idea of PSO is integrated to improve the global search ability of the algorithm.

The experimental results on the CEC2014 standard function test suite preliminarily show that this algorithm has obvious advantages over other swarm intelligent algorithms in optimization accuracy and convergence performance [35]. In addition, compared with other comparison algorithms, it also obtains competitive results in solving the welded beam design problem.

2. CSO Algorithm

The CSO algorithm is a metaheuristic optimization algorithm that simulates the foraging behavior of roosters, hens, and chicks in nature. The characteristics of the algorithm are as follows:

(1) Correspondence with the optimization problem. The algorithm regards several randomly generated positions in the solution space of the optimization problem as chickens, and the food source of each chicken is measured by the fitness function value of the corresponding optimization problem.

(2) Hierarchical order, that is, the role assignment of the chicken swarm. The whole chicken swarm is composed of four roles, namely, the roosters, hens, chicks, and mother hens. The role assignment is based on the content of food sources: the roosters have the best food sources, the hens take second place, the food sources of the chicks are the worst, and the mother hens are randomly selected from the hens.

(3) Subgroup division. During the whole foraging process, the chicken swarm is divided into several subgroups. The number of subgroups is determined by the number of roosters, because each subgroup is composed of one randomly selected rooster, several hens, and several chicks, and there is at least one hen in each subgroup.

(4) Relation of dependence. In the foraging process, the chicks follow their mothers (the mother hens) and the hens follow the roosters in their subgroups to forage for food. They can also randomly steal the good food sources found by other subgroups.

(5) Information exchange. The hierarchical order of the chicken swarm and the mother-child relationship between the mother hens and the chicks are updated after every several iterations. The information exchange between subgroups is realized through this continuous role assignment.

(6) Parallel optimization. The whole chicken swarm realizes parallel optimization through the division-of-labor and cooperation mechanism between subgroups, quickly finds the best food sources, and then obtains the solution to the optimization problem.

The formulas used by chickens with different roles in foraging are given below.

The formula used by the roosters in foraging is as follows:

x_{i,j}^{t+1} = x_{i,j}^{t} \times (1 + Randn(0, \sigma^2)), (1)

\sigma^2 = 1 if f_i \le f_k; otherwise \sigma^2 = \exp((f_k - f_i) / (|f_i| + \varepsilon)), k \in [1, rNum], k \ne i, (2)

where x_{i,j}^{t} represents the position of the ith rooster in the jth dimension in the t-th iteration, j \in [1, dim], and dim is the dimension of the optimization problem. Randn(0, \sigma^2) is a normal distribution with a mean value of 0 and a standard deviation of \sigma^2. \varepsilon is a very small number that can be expressed by the computer, which is used to avoid the situation that the denominator is 0 in the formula. f_k represents the content of the food source of any other rooster k which is different from the ith rooster. rNum represents the number of roosters.

The formula used by the hens in foraging is as follows:

x_{i,j}^{t+1} = x_{i,j}^{t} + S1 \times Rand \times (x_{r1,j}^{t} - x_{i,j}^{t}) + S2 \times Rand \times (x_{r2,j}^{t} - x_{i,j}^{t}), (3)

S1 = \exp((f_i - f_{r1}) / (|f_i| + \varepsilon)), (4)

S2 = \exp(f_{r2} - f_i), (5)

where x_{i,j}^{t} is the position of the ith hen in the t-th iteration. Rand is a random number function with a value range of [0, 1]. x_{r1,j}^{t} is the position of the rooster which is in the same subgroup as the ith hen. x_{r2,j}^{t} is the position of any chicken (rooster or hen) randomly selected from the whole chicken swarm, which is different from the ith hen, and r1 ≠ r2.

The formula used by the chicks in foraging is as follows:

x_{i,j}^{t+1} = x_{i,j}^{t} + FL \times (x_{m,j}^{t} - x_{i,j}^{t}), (6)

where x_{i,j}^{t} represents the position of the ith chick in the t-th iteration, and x_{m,j}^{t} is the position of the chick's mother. FL \in [0, 2] is a following coefficient with which the chick follows its mother to search for food.

The corresponding basic flowchart is shown in Figure 1.
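As a concrete illustration, the three foraging rules above can be sketched in NumPy. This is a minimal sketch, not the authors' implementation: the function names are ours, fitness values are passed in as scalars, and the subgroup and index bookkeeping of the full algorithm is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = np.finfo(float).tiny  # smallest positive float, plays the role of epsilon


def rooster_update(x_i, f_i, f_k):
    """Rooster move: x * (1 + Randn(0, sigma^2)); sigma^2 depends on a
    randomly chosen rival rooster k with fitness f_k (minimization)."""
    sigma2 = 1.0 if f_i <= f_k else np.exp((f_k - f_i) / (abs(f_i) + eps))
    return x_i * (1.0 + rng.normal(0.0, np.sqrt(sigma2), size=x_i.shape))


def hen_update(x_i, x_r1, x_r2, f_i, f_r1, f_r2):
    """Hen follows its group rooster (r1) and a random chicken (r2)."""
    s1 = np.exp((f_i - f_r1) / (abs(f_i) + eps))
    s2 = np.exp(f_r2 - f_i)
    return (x_i
            + s1 * rng.random(x_i.shape) * (x_r1 - x_i)
            + s2 * rng.random(x_i.shape) * (x_r2 - x_i))


def chick_update(x_i, x_m, fl):
    """Chick follows its mother with following coefficient FL in [0, 2]."""
    return x_i + fl * (x_m - x_i)
```

With FL = 0.5, for example, a chick moves halfway toward its mother's position in a single step.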

3. The Proposed Algorithm

Aiming at the premature convergence of the standard CSO algorithm in solving multimodal optimization problems, an ICSO algorithm is proposed in this paper. Considering that the chicks have the weakest optimization ability in the whole chicken swarm, the algorithm improves the foraging behavior of the chicks by drawing on the reproduction and elimination-dispersal operations of BFA. At the same time, considering that the PSO algorithm has good global search ability, a hybrid CSO algorithm is constructed on the basis of the division-of-labor and cooperation mechanism of the CSO algorithm, integrating the PSO algorithm to enhance the global search ability.

3.1. CSO Algorithm with Reproduction and Elimination-Dispersal Operations (RECSO Algorithm)

In the standard CSO algorithm, chicks are the most vulnerable group. Therefore, in order to improve the depth search ability of the algorithm, this paper introduces the reproduction and elimination-dispersal operations of BFA into the chicks’ foraging behavior, and a RECSO algorithm is proposed. Firstly, in the role update process of the chicken swarm, the replication operation of BFA is introduced to replace the chicks with the same number of chickens with the strongest optimization ability. Through this behavior, the depth optimization speed of the chicken swarm can be accelerated. At the same time, in order to maintain the population diversity, the elimination-dispersal operation is introduced to disperse the chicks that have performed the replication operation to any position in the search space according to a certain probability. Through this operation, the chicken swarm can avoid falling into the local extrema. The specific reproduction and elimination-dispersal operations are as follows:

3.1.1. Reproduction Operation

According to the fitness function values of the chicken swarm, the chicks with poor optimization ability inherit the positions of the chickens with the strongest optimization ability. The formula is as follows:

x_{i}^{t} = x_{i-(rNum+hNum)}^{t}, i = rNum + hNum + 1, ..., pop, (7)

where rNum is the number of roosters, hNum is the number of hens, and the chickens are sorted by fitness so that the first rNum + hNum individuals are the best ones.
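Assuming, as in the role assignment of CSO, that the population array is sorted by fitness with the best individuals first (roosters, then hens, then chicks), the reproduction operation can be sketched as follows (a sketch; the function name is ours):

```python
import numpy as np


def reproduction(x, r_num, h_num):
    """Chicks (the worst-ranked rows) inherit the positions of the equally
    many best-ranked chickens. Assumes rows of x are sorted best-first."""
    x = x.copy()
    c_num = len(x) - r_num - h_num   # number of chicks
    x[r_num + h_num:] = x[:c_num]    # chick i copies chicken i - (rNum + hNum)
    return x
```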

3.1.2. Elimination-Dispersal Operation

The chicks are dispersed to random positions in the search space with the probability Ped, where Ped ∈ [0, 1]. The concrete contents are given as follows:

for i = (rNum + hNum + 1) : pop do
    if rand < Ped then
        x_i = lb + rand × (ub − lb)
    end
end

Here, pop is the population size. lb and ub are the lower and upper bounds of search range, respectively.
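Using the same sorted-population convention as above, the elimination-dispersal step can be sketched as follows (a sketch under our assumptions, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)


def elimination_dispersal(x, r_num, h_num, p_ed, lb, ub):
    """With probability Ped, scatter each chick to a uniformly random point
    in [lb, ub]; roosters and hens are left untouched."""
    x = x.copy()
    for i in range(r_num + h_num, len(x)):        # chick row indices
        if rng.random() < p_ed:
            x[i] = lb + rng.random(x.shape[1]) * (ub - lb)
    return x
```

With Ped = 0 the population is unchanged; with Ped = 1 every chick is re-drawn uniformly inside the bounds.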

3.2. CSO Algorithm Based on PSO (CSO-PSO Algorithm)

To improve the global optimization ability of the CSO algorithm, in this section, we construct a CSO-PSO algorithm to enhance the ability of the algorithm to jump out of the local extrema by integrating PSO into the CSO algorithm. The flowchart of the CSO-PSO algorithm is shown in Figure 2.

The main steps are described as follows:

(1) Population initialization. This mainly involves the parameter settings and the determination of the initial individuals for the CSO and PSO algorithms.

(2) Fitness evaluation. The initial swarm optimal value is recorded on the bulletin board.

(3) Subgroup division. The initial population is divided into two parts of the same size: subgroup 1 and subgroup 2.

(4) CSO. We assign roles to subgroup 1 according to the CSO algorithm and perform the foraging behavior of the chicken swarm to search for the global optimal value.

(5) PSO. The velocities and positions of the particles in subgroup 2 are updated according to the PSO algorithm to search for the global optimal value.

(6) Information exchange. We merge subgroup 1 and subgroup 2 to realize information exchange.

(7) Updating the swarm optimal value. The swarm optimal value is updated according to the subgroup optimal values obtained in steps (4) and (5).

(8) We repeat steps (3)–(7) until the maximum number of iterations is reached, and then the optimal value is output.
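The steps above can be sketched as a compact loop. This is only a structural sketch with names of our own choosing: `cso_step` stands in for one CSO foraging iteration over its half of the population (any implementation of the rules in Section 2), and the PSO personal bests are simplified to the current positions.

```python
import numpy as np

rng = np.random.default_rng(2)


def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0):
    """Standard PSO velocity and position update (learning factors c1 = c2 = 2)."""
    v = (v + c1 * rng.random(x.shape) * (pbest - x)
           + c2 * rng.random(x.shape) * (gbest - x))
    return x + v, v


def cso_pso(f, pop, dim, lb, ub, iters, cso_step):
    """Hybrid loop: split the population in half, run CSO on one half and PSO
    on the other, merge, update the bulletin-board optimum, and repeat."""
    x = lb + rng.random((pop, dim)) * (ub - lb)
    v = np.zeros((pop, dim))
    gbest = min(x, key=f).copy()                 # step (2): initial optimum
    for _ in range(iters):
        idx = rng.permutation(pop)               # step (3): random equal split
        half = pop // 2
        x[idx[:half]] = cso_step(x[idx[:half]])  # step (4): CSO on subgroup 1
        sub2 = idx[half:]                        # step (5): PSO on subgroup 2
        x[sub2], v[sub2] = pso_step(x[sub2], v[sub2], x[sub2], gbest)
        np.clip(x, lb, ub, out=x)
        best = min(x, key=f)                     # steps (6)-(7): merge, update
        if f(best) < f(gbest):
            gbest = best.copy()
    return gbest
```

Because the bulletin-board optimum is only replaced when a strictly better point is found, the recorded best value is monotonically non-increasing over the iterations.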

3.3. ICSO Algorithm

In view of the premature convergence problem of the standard CSO algorithm in dealing with multimodal optimization problems, an ICSO algorithm is proposed in this section. Considering that the chicks have the weakest optimization ability in the whole chicken swarm, the RECSO algorithm in Section 3.1 is introduced to improve the depth search ability of the algorithm. At the same time, the CSO-PSO algorithm in Section 3.2 is introduced to improve the global search ability of the algorithm. The flowchart of the ICSO algorithm is shown in Figure 3. The green part is the improvement strategy of reproduction and elimination-dispersal operations introduced in Section 3.1 and the yellow part is the improvement strategy of the PSO algorithm integrated in Section 3.2.

The corresponding detailed steps are as follows:

(1) Parameter settings. This mainly involves the maximum number of iterations M, the population size pop, the dimension of the solution space dim, and the limited number of role updates G.

(2) Population initialization. pop solutions are randomly generated in the solution space of the optimization problem and used as the initial positions of the population. According to the fitness function of the optimization problem, the fitness value of each position is calculated as its food source.

(3) Fitness evaluation. By comparing the food source contents of the whole initial population, the optimal food source and the corresponding individual position are recorded.

(4) Subgroup division. The whole population is randomly divided into two parts, namely, subgroup 1 and subgroup 2.

(5) RECSO. Subgroup 1 is optimized according to the RECSO algorithm. The details are summarized as follows:

(a) Judgment of the role update condition. The role update condition of the whole algorithm is mod(t, G) == 1, where t is the current iteration number and mod is the remainder function. If the condition is false, we jump to step (d); otherwise, we judge whether it is the first iteration of the algorithm: if so, we go to step (c); if not, we go to step (b).

(b) Reproduction and elimination-dispersal operations. We perform the reproduction and elimination-dispersal operations described in Section 3.1 on the chicks.

(c) Role assignment and subgroup division. According to the fitness function of the current chicken swarm, we update the hierarchical order and mother-child relationship of the whole chicken swarm. After that, we divide the subgroups and determine the number of subgroups according to the number of roosters. Each rooster belongs to a different subgroup. The hens and chicks are randomly assigned to the subgroups, but it is necessary to ensure that there is at least one hen in each subgroup.

(d) Foraging behavior. According to (1), (3), and (6), the foraging behaviors are performed by chickens with different roles.

(e) Update of the optimal food source. At the end of each iteration, the situations of chickens with different roles change accordingly. We calculate the food source content of the current chicken swarm according to the fitness function and record the optimal food source and its corresponding position by comparing them with the previous ones.

(6) PSO. The velocities and positions of the particles in subgroup 2 are updated according to the PSO algorithm. The optimal food source and its corresponding position are recorded.

(7) Information interaction. We merge subgroups 1 and 2 to realize the information interaction.

(8) Update of the swarm optimal value. The swarm optimal value is updated according to the subgroup optimal values obtained in steps (e) and (6).

(9) Judgment of the termination condition. If the maximum number of iterations is reached, the optimal value is output and the program ends; otherwise, we jump to step (4).
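The branch in step 5(a) — a role update fires when mod(t, G) == 1, with reproduction and elimination-dispersal inserted before every update except the first — can be expressed as a small helper (a sketch; the name and return labels are ours):

```python
def role_phase(t, G=10):
    """Which step-5 actions run at iteration t (t starts at 1).
    Role updates fire when mod(t, G) == 1, i.e. at t = 1, G + 1, 2G + 1, ...;
    from the second update onward, reproduction and elimination-dispersal
    (step (b)) run before the role assignment (step (c))."""
    if t % G != 1:
        return "forage"                               # step (d) only
    if t == 1:
        return "assign, forage"                       # steps (c), (d)
    return "reproduce, disperse, assign, forage"      # steps (b), (c), (d)
```

With G = 10 the schedule is thus: iteration 1 assigns roles, iterations 2 through 10 only forage, iteration 11 performs reproduction and dispersal before reassigning roles, and so on.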

4. Simulation Experiment

4.1. Experimental Setup
4.1.1. The Experimental Environment

The experimental environment of this paper is as follows: Windows 7 operating system, CPU: 3.5 GHz, RAM: 12 GB, and the programming environment is MATLAB R2016a. In order to verify the effectiveness and superiority of the ICSO algorithm, we conducted experiments on the CEC2014 function test suite, which provides 30 test functions, including 3 unimodal functions (f1–f3), 13 simple multimodal functions (f4–f16), 6 hybrid functions (f17–f22), and 8 composition functions (f23–f30). The search range of all test functions is [−100, 100]. In order to understand the CEC2014 function test suite more intuitively, the function types, numbers, names, and theoretical global optimal values are given in Table 1.

4.1.2. The Parameter Settings

To make a more reasonable comparison, for all algorithms involved in this paper, the population size is set to 100, the maximum number of iterations is 10000, and the dimension of the solution space is 10. All algorithms are independently run 30 times on each test function, and then the mean values are calculated. The other parameter settings involved in this experiment are as follows:

(1) Parameter settings of the ICSO and CSO algorithms. The limited number of role updates in the chicken swarm is G = 10. The ratios of roosters and hens in the whole chicken swarm, rPercent and hPercent, are 0.15 and 0.7, respectively. The proportion of mother hens among the hens is mPercent = 0.5. In addition, the elimination-dispersal probability of the ICSO algorithm is Ped = 0.25.

(2) Parameter settings of BFA. The number of chemotactic steps Nc = 100, the number of reproduction steps Nre = 10, the number of elimination-dispersal steps Ned = 10, and the length of a swim Ns = 4.

(3) Parameter settings of the PSO algorithm. The two learning factors c1 and c2 are both 2.

(4) Parameter settings of AFSA. The visual field of the artificial fish visual = 2.5, the step length step = 0.3, and the maximum tentative number try_number = 5.

(5) Parameter settings of GA. The number of binary digits PRECI = 20, the generation gap GGAP = 0.9, the crossover probability Pc = 0.7, and the mutation probability Pm = 0.01.

In the aforementioned parameter settings, the parameters of CSO, BFA, and AFSA are set according to the literature [25], [13], and [14], respectively, where these algorithms were originally proposed. It is worth noting that Nre and Ned are set to 10 in BFA to ensure that the maximum number of iterations is 10000 (Nc × Nre × Ned = 100 × 10 × 10 = 10000). The parameter settings of PSO and GA are derived from the comparison algorithms in the literature [28, 32], respectively.

In the ICSO algorithm, if the value of Ped is too large, the chicks easily degenerate into a random exhaustive search; if the value of Ped is too small, it is not conducive to maintaining population diversity, which weakens the ability of the algorithm to escape local extrema. Therefore, in this section, to choose an appropriate value of Ped, we select four typical functions f3, f6, f20, and f27 from the CEC2014 function test suite for experiments, where f3 is a unimodal function, f6 is a simple multimodal function, f20 is a hybrid function, and f27 is a composition function. We run the ICSO algorithm on each of these four functions 30 times independently and calculate the mean values. The experimental results are shown in Table 2.

In Table 2, fi* (i = 3, 6, 20, 27) represents the theoretical global optimal value of each function. The value of Ped is taken from 0.05 to 0.85 with an interval of 0.2. ΔPed represents the absolute value of the difference between the theoretical global optimal value and the actual mean value obtained on each function, where Ped = 0.05, 0.25, ..., 0.85. It is easy to see from Table 2 that when Ped = 0.25, the value of ΔPed is the smallest; that is, the actual mean optimal values obtained by the ICSO algorithm on the four functions are the closest to the theoretical ones. Furthermore, as the value of Ped increases from 0.25 to 0.85 with an interval of 0.2, the deviation is always larger than that at Ped = 0.25. Therefore, in the experiments, we choose Ped = 0.25.
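The selection rule behind Table 2 — choose the Ped that minimizes the total deviation |actual mean − theoretical optimum| over the four functions — can be written as a one-liner. The numbers below are made up purely to illustrate the rule; they are not the paper's measurements.

```python
def best_ped(deviations):
    """deviations maps each candidate Ped to its list of
    |mean result - theoretical optimum| values over the test functions;
    return the Ped with the smallest total deviation."""
    return min(deviations, key=lambda p: sum(deviations[p]))


# Hypothetical deviations on f3, f6, f20, f27 for the grid 0.05, 0.25, ..., 0.85:
grid = {0.05: [4.1, 0.9, 2.2, 1.5],
        0.25: [1.2, 0.4, 0.8, 0.6],
        0.45: [2.0, 0.7, 1.1, 0.9],
        0.65: [2.8, 1.0, 1.6, 1.2],
        0.85: [3.9, 1.3, 2.4, 1.8]}
```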

4.2. The Effectiveness Test of ICSO, RECSO, and CSO-PSO

To verify the effectiveness of the three improved algorithms proposed in this paper, namely, ICSO, RECSO, and CSO-PSO, relative to the standard CSO algorithm, in this section, these four algorithms are tested on the CEC2014 function test suite, and the experimental comparison is made in terms of optimization accuracy and convergence performance. The experimental parameter settings are shown in Section 4.1.

4.2.1. The Effectiveness Test for Optimization Accuracy

To verify the effectiveness of the three improved algorithms (ICSO, RECSO, and CSO-PSO) in terms of optimization accuracy, in this section, we use ICSO, RECSO, CSO-PSO, and CSO to run all 30 functions of the CEC2014 test suite independently 30 times to obtain their mean values. The results are shown in Table 3, where the symbols “>,” “=,” and “<” indicate that the experimental results of the comparison algorithms are superior, equal, and inferior to the CSO algorithm, respectively. The optimal results are shown in bold.

It can be clearly seen from Table 3 that the ICSO algorithm obtains the largest number of optimal values, followed by CSO-PSO, and the CSO algorithm is the worst. More specifically, the results of the ICSO algorithm are better than those of the CSO algorithm on 28 functions and the same on the remaining 2 functions. The results of the RECSO algorithm are better than those of the CSO algorithm on 19 functions, worse on 6 functions, and the same on the other 5 functions. The results of the CSO-PSO algorithm are similar to those of the ICSO algorithm: its results are better than those of the CSO algorithm on 28 functions and the same on 2 functions. However, in terms of the number of optimal values obtained, the ICSO algorithm is far superior to the CSO-PSO algorithm. This shows the effectiveness of the three improvement strategies in terms of optimization accuracy compared with the CSO algorithm, and the performance of the ICSO algorithm is the best among the four algorithms.

4.2.2. The Effectiveness Test for Convergence Performance

To verify the effectiveness of the abovementioned three improved algorithms in terms of convergence performance compared to the CSO algorithm, Figure 4 shows the average convergence curves of the four algorithms independently running 30 times on 30 functions. In order to show the convergence effect of each algorithm more clearly, the average fitness function values are logarithmically processed in Figure 4. In addition, the convergence curves of some functions contain subgraphs, such as those of f4, f8, and f13, which are locally magnified renderings.

It can be seen from Figure 4 that the convergence performance of the ICSO algorithm on functions f1, f5, f9, f10, f11, f17, f18, f25, and f27 is significantly better than that of the other three algorithms. On functions f2, f3, f6, f8, f12, f16, f22, and f24, ICSO and CSO-PSO algorithms have similar convergence performance. The former is slightly better than the latter, and both of them are better than CSO and RECSO algorithms (only on function f16, CSO-PSO algorithm is slightly better than the ICSO algorithm). On functions f4, f7, f13, f14, f15, f19, f20, f21, f28, f29, and f30, the four algorithms have comparable convergence performance, and ICSO is slightly superior. On functions f23 and f26, the RECSO algorithm has the best convergence performance.

To sum up, the convergence performance of the ICSO algorithm on 27 functions is the best, and it is better than that of the CSO algorithm on all 30 functions. The CSO-PSO algorithm has the best convergence performance on one function, and its convergence performance on 28 functions is better than that of the CSO algorithm. The RECSO algorithm has the best convergence performance on two functions, and only the convergence performance on functions f8, f13, and f17 is slightly inferior to that of the CSO algorithm. This fully shows the effectiveness of the three improved algorithms in terms of convergence performance compared with the CSO algorithm, and the ICSO algorithm has the best convergence performance.

It can be concluded that the reason why the ICSO algorithm is superior to the other three algorithms may be that it integrates the ideas of BFA and PSO, which improves not only the depth search ability of the algorithm but also its breadth search ability.

4.3. The Superiority Comparison among Several Swarm Intelligent Algorithms

To verify the superiority of the ICSO algorithm proposed in this paper, in this section, we compare the performance of seven algorithms, namely, ICSO, CSO, ICSOII proposed in the literature [30] (named ICSOII to distinguish it from the ICSO algorithm), BFA, PSO, AFSA, and GA from the aspects of optimization accuracy and convergence performance.

4.3.1. The Superiority Comparison in Optimization Accuracy

To verify the superiority of the ICSO algorithm in terms of optimization accuracy, this section presents the experimental results of the abovementioned seven algorithms on the CEC2014 function test suite. The data in Table 4 are the mean values of 30 independent runs of each algorithm on each function. The bold data in Table 4 are the optimal values.

It can be seen from Table 4 that for functions f1 and f2, the ICSO algorithm directly reduces the order of magnitude of the optimization error from 5 and 7 to 3 and 2, respectively. In addition, in terms of the number of optimal values found, CSO and GA each find 2 optimal values, AFSA and PSO find 3 and 9 optimal values, respectively, and ICSOII and BFA each find 6. The ICSO algorithm finds 18 optimal values, which shows its superiority in optimization accuracy.

4.3.2. The Superiority Comparison in Convergence Performance

To verify the superiority of the ICSO algorithm in convergence performance, in this section, we give the average convergence curves of 30 test functions on the CEC2014 function test suite optimized by the abovementioned seven algorithms (each algorithm is independently run 30 times on each test function). The parameter settings in this section are shown in Section 4.1.2. As mentioned in Section 4.2.2, the ordinates in Figure 5 are the logarithms of the average fitness function values, and subgraphs in some convergence curves are locally magnified renderings, so as to show the convergence effect of each algorithm more clearly.

It can be seen from Figure 5 that the convergence performance of the ICSO algorithm is the best on functions f1, f5–f7, f11, f14, f15, f17, f19, f21, f22, f24, f25, and f27, among which, on functions f5, f6, f11, f25, and f27, the convergence advantage of the ICSO algorithm is particularly obvious. On functions f2, f3, f8, f9, and f20, the convergence performance of the ICSO and PSO algorithms is very close, the performance of the PSO algorithm is slightly better, and both outperform the other algorithms. On functions f13 and f26, the convergence performance of the ICSO and CSO algorithms is the best, and the performance of the ICSO algorithm is slightly inferior to that of the CSO algorithm, ranking second. On function f16, ICSOII has the best convergence performance and the ICSO algorithm ranks second. Only on functions f4, f12, f18, f23, and f28–f30 is the convergence performance of the ICSO algorithm not as good as that of BFA, AFSA, and GA, ranking 3rd or 4th. From this, we can see the superiority of the ICSO algorithm proposed in this paper in terms of convergence performance.

4.3.3. Friedman Test of Algorithms

To compare the performance of the various algorithms more reasonably, this section uses the Friedman test to assess the performance of the abovementioned 7 algorithms (ICSO, ICSOII, PSO, CSO, GA, BFA, and AFSA) from a statistical point of view. The Friedman test is a nonparametric test method, which is often used to compare algorithms due to its simple operation and loose requirements on the test data [32, 33, 36]. For a minimization problem, the smaller the average ranking of an algorithm, the better its performance. Table 5 presents the Friedman test results of the 7 algorithms on the 30 functions.

It can be seen from Table 5 that the average ranking of the ICSO algorithm is 1.90, ranking first; it is 0.87 lower than that of ICSOII, 2.70 lower than that of the CSO algorithm, and 1.38 lower than that of the PSO algorithm, which fully demonstrates the effectiveness of the improvement strategies in this paper.
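The average rankings in Table 5 follow the standard Friedman procedure: rank the algorithms on each function (1 = best), average the ranks per algorithm, and compute the chi-square statistic. A NumPy sketch, assuming no ties; the toy data below is illustrative, not the paper's 7 × 30 result matrix:

```python
import numpy as np


def friedman(results):
    """Friedman test for k algorithms on n problems.
    `results` is an (n, k) array of scores (lower = better, as for a
    minimization problem). Returns (average ranks, chi-square statistic)."""
    n, k = results.shape
    # Double argsort gives each entry's within-row rank (0-based, no ties).
    ranks = np.argsort(np.argsort(results, axis=1), axis=1) + 1.0
    avg = ranks.mean(axis=0)
    chi2 = 12.0 * n / (k * (k + 1)) * np.sum((avg - (k + 1) / 2.0) ** 2)
    return avg, chi2


# Toy example: algorithm A always beats B, which always beats C, on 10 problems.
base = np.random.default_rng(0).random(10)
scores = np.column_stack([base, base + 0.5, base + 1.0])
avg, chi2 = friedman(scores)
```

Under this perfect ordering the average ranks are [1, 2, 3] and the statistic is 20, well above the chi-square critical value with k − 1 = 2 degrees of freedom (5.99 at the 0.05 level), so the ranking difference is significant.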

To sum up, the reason why the ICSO algorithm outperforms CSO, PSO, and BFA may be that the cooperation between the chicken swarm and the particle swarm enables information interaction that greatly enhances the algorithm's ability to jump out of local extrema, while the integration of the replication and elimination-dispersal operations of BFA improves its depth search ability. The reason why the ICSO algorithm outperforms ICSOII, GA, and AFSA may be that ICSO has a mechanism of subgroup division and multiswarm cooperation between the chicken swarm and the particle swarm, which realizes parallel optimization through group cooperation. Although the ICSOII algorithm also has subgroup division, its population cooperation is limited to cooperation within the chicken swarm, so its optimization ability is weaker than that of the ICSO algorithm.

4.4. Experimental Comparison between ICSO and ICSOII

To further compare the performance of the ICSO algorithm proposed in this paper with that of the ICSOII algorithm proposed in the literature [30], in this section we set the parameters of the algorithm according to [30]. The statistical results of the abovementioned two algorithms over 51 independent runs on the CEC2014 function test suite are shown in Table 6. The experimental data of the ICSOII algorithm come from its corresponding literature.

It can be clearly seen from Table 6 that the ICSOII algorithm obtains the optimal value on 18 functions and the theoretical optimal value on 10 of them, while the ICSO algorithm obtains the optimal value on 21 functions and the theoretical optimal value on 12 of them.

4.5. Experimental Comparison between ICSO and a State-of-the-Art Algorithm

To further verify the performance of the ICSO algorithm, DMSDL-QBSA, a state-of-the-art algorithm proposed in the literature [18], is also compared with the ICSO algorithm in this section. To make the experimental comparison fairer and more reasonable, the parameter settings of the ICSO algorithm are the same as those of DMSDL-QBSA; that is, the population size is set to 30 and the maximum number of iterations to 100000. The other parameter settings can be found in Section 4.1.2.

The experimental data, including the maximum (Max), minimum (Min), mean (Mean), and variance (Var) values, are shown in Table 7, where the optimal results are shown in bold. It is worth noting that the experimental data of DMSDL-QBSA are extracted from its corresponding literature.
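The per-algorithm summary statistics reported in tables of this kind are straightforward to reproduce from the raw per-run results; the sketch below shows the computation, with hypothetical run values as placeholders for the actual measurements.

```python
# Hedged sketch: Max/Min/Mean/Var over independent runs (hypothetical data).
import numpy as np

# Hypothetical best-fitness values from independent runs of one algorithm.
runs = np.array([1.2e-8, 3.4e-8, 8.9e-9, 2.1e-8, 5.5e-8])

stats = {
    "Max": runs.max(),
    "Min": runs.min(),
    "Mean": runs.mean(),
    "Var": runs.var(),  # population variance; pass ddof=1 for sample variance
}
for name, value in stats.items():
    print(f"{name}: {value:.3e}")
```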

By analyzing the data in Table 7, we can see that our ICSO algorithm obtains all the best values on functions f1–f3, f5–f9, f12–f21, f26, and f29. On functions f4, f11, f22, f24, f25, and f30, the ICSO algorithm achieves the best results in terms of the maximum, minimum, and mean values but is slightly inferior to DMSDL-QBSA in the variance value. DMSDL-QBSA only achieves relatively good results on functions f23, f27, and f28. To sum up, our ICSO algorithm has advantages on most test functions due to the improvement of its global and deep search abilities.

4.6. Welded Beam Design Problem

To verify the performance of the ICSO algorithm on practical optimization problems, the welded beam design problem is considered, which has been described in detail in the literature [4, 37, 38]. The problem is a minimization problem over the design variables x = (x1, x2, x3, x4), which can be formulated as follows:

minimize f(x) = 1.10471x1²x2 + 0.04811x3x4(14.0 + x2)

subject to

g1(x) = τ(x) − τmax ≤ 0,
g2(x) = σ(x) − σmax ≤ 0,
g3(x) = x1 − x4 ≤ 0,
g4(x) = 0.10471x1² + 0.04811x3x4(14.0 + x2) − 5.0 ≤ 0,
g5(x) = 0.125 − x1 ≤ 0,
g6(x) = δ(x) − δmax ≤ 0,
g7(x) = P − Pc(x) ≤ 0,

where

τ(x) = sqrt(τ′² + 2τ′τ″x2/(2R) + τ″²), τ′ = P/(√2 x1x2), τ″ = MR/J,
M = P(L + x2/2), R = sqrt(x2²/4 + ((x1 + x3)/2)²),
J = 2{√2 x1x2[x2²/12 + ((x1 + x3)/2)²]},
σ(x) = 6PL/(x4x3²), δ(x) = 4PL³/(Ex3³x4),
Pc(x) = (4.013E·sqrt(x3²x4⁶/36)/L²)(1 − (x3/(2L))·sqrt(E/(4G))),

and P = 6000 lb, L = 14 in., E = 30 × 10⁶ psi, G = 12 × 10⁶ psi, τmax = 13600 psi, σmax = 30000 psi, δmax = 0.25 in.
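A runnable sketch of this benchmark's objective and constraint functions (following the standard welded beam formulation from the cited literature) may make the problem concrete; the sample design point at the end is a commonly reported near-optimal solution for this benchmark, included purely for illustration.

```python
# Hedged sketch of the standard welded beam design benchmark.
import math

P, L = 6000.0, 14.0            # load (lb) and beam length (in.)
E, G = 30e6, 12e6              # Young's and shear moduli (psi)
TAU_MAX, SIG_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def cost(x):
    """Fabrication cost f(x) to be minimized; x = (x1, x2, x3, x4)."""
    x1, x2, x3, x4 = x
    return 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

def constraints(x):
    """Return g1..g7; a design is feasible when all values are <= 0."""
    x1, x2, x3, x4 = x
    tau_p = P / (math.sqrt(2.0) * x1 * x2)                     # primary shear
    M = P * (L + x2 / 2.0)
    R = math.sqrt(x2**2 / 4.0 + ((x1 + x3) / 2.0) ** 2)
    J = 2.0 * (math.sqrt(2.0) * x1 * x2
               * (x2**2 / 12.0 + ((x1 + x3) / 2.0) ** 2))
    tau_pp = M * R / J                                         # torsional shear
    tau = math.sqrt(tau_p**2 + tau_p * tau_pp * x2 / R + tau_pp**2)
    sigma = 6.0 * P * L / (x4 * x3**2)                         # bending stress
    delta = 4.0 * P * L**3 / (E * x3**3 * x4)                  # end deflection
    p_c = (4.013 * E * math.sqrt(x3**2 * x4**6 / 36.0) / L**2) * (
        1.0 - x3 / (2.0 * L) * math.sqrt(E / (4.0 * G))
    )                                                          # buckling load
    return [
        tau - TAU_MAX,
        sigma - SIG_MAX,
        x1 - x4,
        0.10471 * x1**2 + 0.04811 * x3 * x4 * (14.0 + x2) - 5.0,
        0.125 - x1,
        delta - DELTA_MAX,
        P - p_c,
    ]

# A commonly reported near-optimal design for this benchmark.
x = (0.205730, 3.470489, 9.036624, 0.205730)
print(cost(x))              # ≈ 1.7249
print(max(constraints(x)))  # active constraints (g1, g2, g7) are close to 0
```

Any candidate solution produced by an optimizer can be checked against `constraints` in the same way before its cost is compared with the values in Tables 8 and 9.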

The comparison results of optimal solutions obtained by different algorithms are shown in Table 8. The statistical results are shown in Table 9, where “Worst,” “Mean,” “Best,” and “SD” stand for the worst, mean, best, and standard deviation values obtained by 30 independent runs, respectively. In addition, the optimal results are shown in bold. It is worth noting that the experimental results of comparison algorithms are extracted from their corresponding literature.

For HFPSO [4] and EPSO [37], because the optimal solutions of the four parameters are not given in the literature [4, 37], they are not listed in Table 8. It can be seen from Tables 8 and 9 that ICSO and MBA [38] have obvious advantages over the other two algorithms in terms of the worst value, mean value, standard deviation, etc. Although the stability of the ICSO algorithm is slightly inferior to that of MBA [38], it achieves higher optimization accuracy, which preliminarily shows that ICSO can be used to solve the welded beam design problem.

5. Conclusion and Future Directions

To overcome the premature convergence problem of the standard CSO algorithm when solving complex optimization problems, an ICSO algorithm is proposed in this paper. For the chicks, which have the weakest optimization ability, we introduce the replication and elimination-dispersal operations of BFA to improve the deep search ability of the CSO algorithm. In addition, to improve the global convergence speed, the theory of PSO is integrated to construct a hybrid CSO algorithm. The experimental results show that the proposed ICSO algorithm significantly improves both optimization accuracy and convergence performance.

The disadvantage of the proposed algorithm is that its optimization ability decreases as the dimension of the optimization problem increases, which makes it unsuitable for large-scale optimization problems. Therefore, two questions still need further research: how to dynamically adjust the limited number of role updates in the chicken swarm according to the number of iterations, and how to improve the individual position update formula for the hens, whose search ability is relatively weak, so as to further improve the optimization ability of the algorithm. In addition, we will also consider applying the ICSO algorithm to practical problems, such as path planning in logistics distribution, workshop scheduling, and land use forecasting [39].

Data Availability

All data generated or analyzed during this study are included in this published article. Color versions of one or more of the figures in this paper are available from the corresponding authors upon reasonable request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was partially supported by the Hainan Provincial Natural Science Foundation of China (620QN230), the Scientific Research Project of Colleges and Universities in Hainan Province of China (Hnky2020-3), and the National Natural Science Foundation of China (61877038).