Advances in Civil Engineering

Volume 2019, Article ID 6902428, 16 pages

https://doi.org/10.1155/2019/6902428

## Truss Structure Optimization Based on Improved Chicken Swarm Optimization Algorithm

Correspondence should be addressed to Shiwen Wang; tumuwangshiwen@163.com

Received 23 May 2019; Revised 26 July 2019; Accepted 27 August 2019; Published 13 October 2019

Academic Editor: Dimitris Rizos

Copyright © 2019 Yancang Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

To improve the efficiency of structural optimization design in truss calculation, an improved chicken swarm optimization algorithm was proposed for truss structure optimization. Chicken swarm optimization is a novel swarm intelligence algorithm. In the basic chicken swarm optimization algorithm, a combination of the chaos strategy and the reverse learning strategy was introduced in the initialization to ensure the global search ability, and the inertia weight factor and the learning factors were introduced into the chick position update process to better combine global and local search. Finally, the positions of all individuals were further optimized by the differential evolution algorithm. The improved algorithm was tested on multipeak functions and applied to truss simulation experiments. The study provides a new method for truss structure optimization.

#### 1. Introduction

Engineering structure optimization is a problem that has occupied scholars for many years. It combines mechanical constraints with optimization methods. According to the engineering requirements, some parameters of the project enter the optimization calculation as variables, thus forming the solution domain of the optimized structure. Then, based on the constraints of the project, a mathematical model is established, and a mathematical solution is used to find the most reasonable design scheme in accordance with the design requirements. In 1974, Schmit and Farshi proposed the use of approximation concepts in structural optimization [1]. After that, with the application of computer technology, modern structural optimization became a real possibility.

Traditional structural optimization methods had certain defects, and there was no effective optimization design method. At first, designs were judged directly by people, for example, with the limit stress method and the ultimate strain method. With these methods it was difficult to find the best results, and there was no corresponding theoretical basis. Later, mathematical theory supporting optimization appeared, but it existed only in theory and still could not describe actual engineering structure problems. Faced with complex engineering optimization problems, more and more scholars proposed bionic intelligent algorithms based on the habits of biological evolution and survival in nature. Structural optimization mainly focuses on reducing the total weight of the structure, improving design rationality, and reducing engineering cost [2–5]; examples include the application of an improved artificial fish swarm algorithm to truss structure optimization, truss optimization with the artificial bee colony algorithm, and the application of the firefly algorithm to truss structure optimization [6–8].

After years of research and development, intelligent optimization algorithms have been widely used in various fields. Fiore et al. proposed an improved differential evolution algorithm to optimize the structure of flat steel trusses: the weight of the steel truss was taken as the objective function to be minimized, square hollow sections were used as the design variables, and the design optimization involved size, shape, and topology optimization [9]. Karaboga et al. proposed an improved artificial bee colony algorithm (ABC) based on chaos theory, which was applied to truss structure examples to verify the effectiveness of the improved algorithm for truss engineering structure optimization [10]. Kayabekir explored a new intelligent algorithm, the flower pollination algorithm (FPA), and discussed its practical applications in civil engineering, mechanical engineering, electronic communication, and chemistry. In the civil engineering field, the FPA was applied to a 25-bar truss structure to optimize the structure, and the feasibility of the algorithm was verified [11]. Chen et al. proposed a hybrid particle swarm optimization (PSO) algorithm based on an improved Nelder–Mead algorithm (NMA). The improved NMA selected part of the *n*-simplex subplane for optimization and used the modal strain energy-based index (MSEBI) to locate damage, which effectively improved the convergence speed and accuracy of the PSO algorithm and greatly improved computational efficiency in the field of structural damage detection (SDD) [12]. Caponio et al. proposed a metaheuristic algorithm based on population distribution theory for practical engineering structure optimization; the proposed algorithm converged well on nonlinear and nondifferentiable optimization problems, its optimization process was stable, and it provided a new way of thinking [13].

In 2014, the Chinese scholar Meng proposed the chicken swarm optimization (CSO) [14], a population-based stochastic optimization algorithm. Its advantage is that it is simple to implement; its disadvantages are that it easily falls into local optima, converges slowly, and has low precision. Scholars have done a lot of research on this algorithm. In 2015, Fei et al. proposed an improved chicken swarm optimization that drew on the weighting factors of the particle swarm optimization algorithm. An adaptive weight was introduced into the individual position update, which addressed two problems: in the early stage of the algorithm, the search space is large and the optimal solution is easily skipped over; in the late stage, the search space is small and convergence is slow. A term in which the chick learns from the cock was also added, giving the chick a more comprehensive learning mechanism [15]. Yu et al. improved the chicken swarm optimization in 2016 using a variety of hybrid methods: reverse learning was used to initialize the population and improve its quality; a mutation idea was introduced in the boundary handling to enhance population diversity and improve the global optimization ability; and, finally, the idea of simulated annealing was used to accept inferior solutions with a certain probability, enhancing the ability of the algorithm to jump out of local optima [16]. These improvements effectively raised the optimization performance of the algorithm, and the function-optimization results proved them effective.
Existing research shows that the chicken swarm optimization (CSO) has been successfully applied to resource scheduling [17], engineering optimization design [14], cluster analysis, and the optimization of classifier coefficients [18, 19]. In 2016, Hafez et al. proposed a feature selection system based on CSO, which selects features in a wrapper mode, searching the feature space for the best combination of features so as to maximize classification performance while minimizing the number of selected features [20]. Shayokh et al. used CSO to solve the wireless sensor network (WSN) node location problem [21]. In 2016, Roslina et al. proposed an improved CSO for ANFIS performance, which solves the ANFIS network training and classification problem more accurately [22]. In 2017, Awais et al. combined CSO with energy optimization for home users, enabling them to reduce power costs, power consumption, and the peak-to-average ratio [23, 24]. Ahmed et al. improved the CSO search ability by applying logistic and tent chaotic maps to help the algorithm better explore the search space [25]. These successful applications show that the chicken swarm optimization has good development and application prospects. However, in these application studies the algorithm is still imperfect: for example, the population is not specially initialized, the position update does not prevent individuals from leaving the search bounds, and there is no optimization over the whole set of individuals.

Based on the above research, this paper improved the chicken swarm optimization and applied it to truss structure optimization. A combination of the chaos strategy and the reverse learning strategy was introduced in the initialization to ensure the global search ability, and the inertia weight factor and the learning factors were introduced into the chick position update process to better combine global and local search. Finally, the positions of all individuals were further optimized by the differential evolution algorithm. The improved algorithm was tested on multipeak functions and applied to truss simulation experiments.

#### 2. Chicken Swarm Optimization Algorithm

The traditional chicken swarm optimization treats the optimization problem as the process of chickens searching for food. The whole chicken swarm is divided into several flocks, each of which has one cock, several hens, and several chicks. There is competition between flocks, and the best individuals are obtained through this competition. The simplified process is as follows:

1. In each chicken swarm, there are several subswarms, each of which includes one cock, several hens, and several chicks.
2. The chicken swarm is divided into subswarms according to the fitness values of the individuals. The individuals with the best fitness values act as cocks, each of which leads a subswarm; the individuals with the worst fitness act as chicks; and the rest are hens. Each hen randomly follows a cock, and the mother-child relationship between hens and chicks is formed randomly.
3. The dominance relationship, hierarchy, and mother-child relationship in the swarm remain unchanged between regroupings; the chickens regroup and update roles every *G* generations.
4. Each subswarm searches for food around its cock, the chicks search for food around the hens, and better individuals have an advantage in the competition for food. Cocks, hens, and chicks perform different search behaviors.
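The role-assignment step described above can be sketched as follows. This is a minimal illustration, not the paper's code; the function name, the NumPy usage, and the fixed random seed are assumptions:

```python
import numpy as np

def assign_roles(fitness, n_roosters, n_chicks):
    """Split a flock into cocks (roosters), hens, and chicks by fitness
    rank (minimization: lower fitness is better)."""
    order = np.argsort(fitness)            # best individuals first
    roosters = order[:n_roosters]          # best -> subswarm leaders
    chicks = order[-n_chicks:]             # worst -> chicks
    hens = order[n_roosters:-n_chicks]     # the rest are hens
    # each hen randomly follows a cock; each chick randomly follows a hen
    rng = np.random.default_rng(0)
    mate = {h: rng.choice(roosters) for h in hens}
    mother = {c: rng.choice(hens) for c in chicks}
    return roosters, hens, chicks, mate, mother
```

In a full implementation this function would be re-run every *G* generations to regroup the swarm.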

Individuals in the chicken swarm move according to their own rules until they find the best position. The position of an individual in the swarm corresponds to a solution of the optimization problem, and the best position is the optimal solution. In the whole chicken swarm optimization, the number of individuals is set to $N$, and the position of each individual is represented by $x_{i,j}^{t}$, which denotes the position of the $i$-th individual in the $j$-th dimension at the $t$-th iteration. There are different position-update rules for the three types of chickens; that is, the position update of an individual depends on its type. The cock has the best fitness value in each subgroup and can search for and locate food in a wide range of the space.

The position corresponding to the cock is updated as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t}\left(1 + \mathrm{Randn}\left(0, \sigma^{2}\right)\right),$$

$$\sigma^{2} = \begin{cases} 1, & f_{i} \le f_{k}, \\ \exp\left(\dfrac{f_{k} - f_{i}}{\left|f_{i}\right| + \varepsilon}\right), & f_{i} > f_{k}, \end{cases}$$

where $\mathrm{Randn}\left(0, \sigma^{2}\right)$ produces a Gaussian random number with mean 0 and variance $\sigma^{2}$, $\varepsilon$ is an extremely small number that prevents the denominator from being zero, $f_{i}$ is the fitness value of individual $i$, and $f_{k}$ is the fitness value of individual $k$. Individual $k$ is randomly selected from the cock population, and $k \ne i$.

The location of the hen is updated as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + S_{1} \cdot \mathrm{Rand} \cdot \left(x_{r_{1},j}^{t} - x_{i,j}^{t}\right) + S_{2} \cdot \mathrm{Rand} \cdot \left(x_{r_{2},j}^{t} - x_{i,j}^{t}\right),$$

$$S_{1} = \exp\left(\frac{f_{i} - f_{r_{1}}}{\left|f_{i}\right| + \varepsilon}\right), \quad S_{2} = \exp\left(f_{r_{2}} - f_{i}\right),$$

where $\mathrm{Rand}$ is a random number between 0 and 1, $r_{1}$ is the spouse cock of the $i$-th hen, $r_{2}$ is any individual among all the cocks and hens in the flock, and $r_{1} \ne r_{2}$.

The location corresponding to the chick is updated as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \mathrm{FL}\left(x_{m,j}^{t} - x_{i,j}^{t}\right), \tag{3}$$

where $x_{m,j}^{t}$ represents the hen corresponding to the $i$-th chick and $\mathrm{FL}$ is the follow coefficient, which means that the chick follows its mother hen to find food.
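The three update rules of this section can be sketched in code. This is a minimal illustration of the standard CSO rules for minimization; the function names, the shared random generator, and the $\varepsilon$ constant are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
EPS = 1e-30  # tiny constant that keeps denominators nonzero

def rooster_update(x_i, f_i, f_k):
    """Cock move: multiplicative Gaussian perturbation whose variance
    grows when a randomly picked rival cock k is fitter."""
    sigma2 = 1.0 if f_i <= f_k else np.exp((f_k - f_i) / (abs(f_i) + EPS))
    return x_i * (1.0 + rng.normal(0.0, np.sqrt(sigma2), size=x_i.shape))

def hen_update(x_i, f_i, x_r1, f_r1, x_r2, f_r2):
    """Hen move: follow the spouse cock r1 and a random neighbor r2."""
    s1 = np.exp((f_i - f_r1) / (abs(f_i) + EPS))
    s2 = np.exp(f_r2 - f_i)
    return (x_i + s1 * rng.random(x_i.shape) * (x_r1 - x_i)
                + s2 * rng.random(x_i.shape) * (x_r2 - x_i))

def chick_update(x_i, x_m, fl):
    """Chick move: step toward the mother hen with follow coefficient FL."""
    return x_i + fl * (x_m - x_i)
```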

#### 3. Improved Chicken Swarm Optimization Algorithm

##### 3.1. Initial Selection

As can be seen from the above, the population in the traditional chicken swarm optimization is not initialized in any special way, which brings disadvantages: there is no guarantee that the initial individuals are evenly distributed in the search space, which limits the efficiency and reduces the performance of the algorithm. The chaos strategy and the reverse learning strategy were therefore combined in the initialization of the algorithm [26–28]. Firstly, the chaotic strategy was used to make the initial states nonrepetitive and ergodic; then, the reverse learning strategy was used to reduce the blindness of the search and expand the initial search range of the whole population, ensuring the global detection capability of the algorithm so that it does not fall into a local optimum. The process was as follows:

1. Select the chicken swarm population, in which each individual is $x_{i,j}$, $i = 1, 2, \ldots, N$.
2. Generate chaotic sequences by iterating the logistic map of equation (4):
$$z_{n+1} = \mu z_{n}\left(1 - z_{n}\right), \tag{4}$$
where the parameter $\mu \in (0, 4]$.
3. Find the inverse solution $x_{i,j}^{\prime}$ corresponding to individual $x_{i,j}$ by equation (5), thus obtaining the inverse population:
$$x_{i,j}^{\prime} = k\left(a_{j} + b_{j}\right) - x_{i,j}, \tag{5}$$
where $\left[a_{j}, b_{j}\right]$ is the dynamic boundary of the search space and $k$ is a random number subject to a uniform distribution.
4. Merge the original population and the inverse population, calculate the fitness of each individual, and select the individuals with the best fitness as the initial population.

By introducing the chaos strategy and the reverse learning strategy, the chicken swarm optimization can find the optimal solution in a larger search space and guide the individuals to evolve toward the optimal solution, so that the overall convergence speed is improved.
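The chaotic/reverse-learning initialization above can be sketched as follows. This is an illustrative implementation only: the function name, the choice $\mu = 4$ (the fully chaotic regime), the number of warm-up iterations, and the NumPy usage are assumptions, not the paper's code:

```python
import numpy as np

def chaotic_opposition_init(n, dim, lo, hi, mu=4.0, seed=1):
    """Build an initial population from a logistic-map chaotic sequence
    mapped into [lo, hi], plus its reverse-learning (opposition) twin.
    The caller keeps the n fittest individuals of the merged set."""
    rng = np.random.default_rng(seed)
    z = rng.uniform(0.1, 0.9, size=(n, dim))  # avoid the map's fixed points
    for _ in range(50):                       # iterate z <- mu * z * (1 - z)
        z = mu * z * (1.0 - z)
    pop = lo + z * (hi - lo)                  # map chaos values into the box
    k = rng.random((n, 1))                    # uniform k, one per individual
    opp = k * (lo + hi) - pop                 # reverse-learning population
    return pop, opp
```

A caller would evaluate the fitness of both `pop` and `opp` and retain the best `n` individuals, as in step (4) above.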

##### 3.2. Chick Location Update

In the traditional chicken swarm optimization, the position update of the chick is related only to the position of the hen, not to the position of the cock with the best fitness in the algorithm [29, 30]. This can cause the algorithm to fall into a local optimum to a certain extent, lowering the overall efficiency. It is not difficult to see that the chicks in the chicken swarm optimization and the particles in particle swarm optimization have similarities [31–33]. To improve the chick position update, we referred to the way particle swarm optimization combines local and global particle positions, so that a chick learns both the local position at the hen's side and the optimal position of the cock-guided group. In this paper, the inertia weight value and the learning factors of the particle swarm optimization algorithm were used to improve the chick position update. Equation (3) was improved as follows:

$$x_{i,j}^{t+1} = \omega x_{i,j}^{t} + c_{1}\,\mathrm{FL}\left(x_{m,j}^{t} - x_{i,j}^{t}\right) + c_{2}\left(x_{r,j}^{t} - x_{i,j}^{t}\right),$$

where $x_{m,j}^{t}$ is the individual in the subgroup corresponding to the hen; $x_{r,j}^{t}$ is the individual of the cock corresponding to the chick in the subgroup; $c_{1}$ and $c_{2}$ are the two learning factors, respectively indicating the degree to which the chick learns from the hen and from the cock; and $\omega$ is the inertia weight value.

###### 3.2.1. Setting the Weight Value

The setting of the weight value determines whether the algorithm can balance local and global search, which in turn affects the diversity of the population in the later stage. This paper introduced a nonlinear inertia weight value $\omega$, where $\omega_{\max}$ is the maximum inertia weight, $\omega_{\min}$ is the minimum inertia weight, $T_{\max}$ is the maximum number of iterations, and a rounding function is used to ensure that the corresponding term is an integer.

After the nonlinear weight value is introduced, the individuals can perform a global search in the early stage of the algorithm and a strengthened local search in the later stage, thereby improving the accuracy of the algorithm.
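As an illustration of this idea only, one widely used nonlinear decreasing schedule can be sketched as follows. The quadratic form, the function name, and the default bounds are assumptions here and are not the paper's own formula:

```python
def inertia_weight(t, t_max, w_max=0.9, w_min=0.4):
    """Nonlinear (quadratic) decrease from w_max to w_min over t_max
    iterations: a large early weight favors global search, a small late
    weight favors local search."""
    return w_min + (w_max - w_min) * ((t_max - t) / t_max) ** 2
```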

###### 3.2.2. Setting the Learning Factor

The learning factors $c_{1}$ and $c_{2}$, appearing in the chick position update, indicate the degree to which chicks learn from hens and cocks, much like the individual and social learning factors in particle swarm optimization [33]. From the learning factors in particle swarm optimization it can be inferred that, in the initial stage of the search, a larger $c_{1}$ makes the chicks more likely to search around the hens, which favors finding the optimal solution globally; in the later stage of the search, a larger $c_{2}$ makes the chicks more likely to search around the cocks, which is a local search near the optimal solution [34]. This paper introduced nonlinear learning factors that allow the chicks to perform both local and global searches.

##### 3.3. Select the Best Individual

In the selection of the optimal position in the chicken swarm optimization, a differential evolution algorithm was used, which consists of three processes: mutation, crossover, and selection [35, 36]. The main formulas are as follows:

1. *Mutation Process*. Two individuals $x_{r_{1}}^{t}$ and $x_{r_{2}}^{t}$ of the same generation are randomly selected, and the mutation operation is performed according to the following equation:
$$v_{i}^{t} = x_{i}^{t} + F\left(x_{r_{1}}^{t} - x_{r_{2}}^{t}\right),$$
where $v_{i}^{t}$ is the mutated individual and $F$ is a random scaling factor that controls the degree of amplification of the difference vector.
2. *Crossover Process*. The crossover probability factor $CR$ is introduced, and the crossover operation is performed according to the following equation:
$$u_{i,j}^{t} = \begin{cases} v_{i,j}^{t}, & \mathrm{rand}(j) \le CR, \\ x_{i,j}^{t}, & \text{otherwise}. \end{cases}$$
3. *Selection Process*. The fitness function value of the trial individual $u_{i}^{t}$ is compared with that of the parent $x_{i}^{t}$; if the trial individual is better, it enters the next iteration; otherwise, the parent remains unchanged.
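The three processes can be sketched together as one differential-evolution pass. This is a standard DE sketch under the assumptions above (minimization, greedy selection); the function name and the fixed `f_scale` and `cr` defaults are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(7)

def de_step(pop, fitness, objective, f_scale=0.5, cr=0.9):
    """One DE pass (mutation, crossover, greedy selection) over the
    whole population; minimization assumed."""
    n, dim = pop.shape
    new_pop = pop.copy()
    new_fit = fitness.copy()
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        v = pop[r1] + f_scale * (pop[r2] - pop[r3])     # mutation
        mask = rng.random(dim) < cr
        mask[rng.integers(dim)] = True                  # keep at least one mutated gene
        u = np.where(mask, v, pop[i])                   # binomial crossover
        fu = objective(u)
        if fu < fitness[i]:                             # greedy selection
            new_pop[i], new_fit[i] = u, fu
    return new_pop, new_fit
```

Because the selection step is greedy, the fitness of every individual is nonincreasing from one pass to the next.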

##### 3.4. Improved Chicken Swarm Optimization Steps and Flow Chart

The steps of the improved chicken swarm optimization are described as follows:

Step 1: set the relevant parameters of the algorithm: population size; cock, hen, and chick scale factors; update interval; number of iterations; etc.

Step 2: initialize the population by combining the chaos strategy and the reverse learning strategy.

Step 3: determine the fitness value of each individual; determine the roles of cocks, hens, and chicks, the grouping, and the various relationships; record the current global optimal fitness value, that is, the optimal individual position.

Step 4: enter the iterative update and determine whether the regrouping condition is met; update the hierarchy of the flock, the mother-child relationships, and the partnerships; update the positions of the cocks, hens, and chicks; and process the boundaries.

Step 5: select the optimal individual through the three steps of mutation, crossover, and selection.

Step 6: if the number of iterations is less than the maximum number of iterations, go to Step 4 and continue; otherwise, go to Step 7.

Step 7: the individual with the best position is the optimal solution.

The flow chart of the improved chicken swarm optimization is shown in Figure 1.
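The overall loop of Steps 1–7 can be sketched as follows. This is a heavily simplified, self-contained illustration: the chaos initialization, nonlinear weight, learning factors, and DE step are abbreviated to simple moves with greedy acceptance, and all names, constants, and bounds are assumptions:

```python
import numpy as np

def icso_sketch(objective, dim=5, n=30, t_max=200, g=10, seed=0):
    """Minimal sketch of the improved CSO loop (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    pop = rng.uniform(lo, hi, (n, dim))               # Step 2: initialization (simplified)
    fit = np.apply_along_axis(objective, 1, pop)
    best, best_f = pop[fit.argmin()].copy(), fit.min()
    for t in range(t_max):
        if t % g == 0:                                # Steps 3-4: regroup every G generations
            order = np.argsort(fit)
            roosters = order[:n // 5]
            hens = order[n // 5:-n // 5]
            chicks = order[-n // 5:]
        for i in range(n):                            # Step 4: role-dependent moves (simplified)
            if i in roosters:
                cand = pop[i] * (1 + rng.normal(0, 0.1, dim))
            elif i in hens:
                r = rng.choice(roosters)
                cand = pop[i] + rng.random(dim) * (pop[r] - pop[i])
            else:
                m = rng.choice(hens)
                cand = pop[i] + 0.5 * (pop[m] - pop[i])
            cand = np.clip(cand, lo, hi)              # boundary processing
            cf = objective(cand)
            if cf < fit[i]:                           # greedy acceptance (stands in for Step 5)
                pop[i], fit[i] = cand, cf
        if fit.min() < best_f:                        # Step 7: track the best position
            best, best_f = pop[fit.argmin()].copy(), fit.min()
    return best, best_f
```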