Research Article | Open Access
Feng Qian, Mohammad Reza Mahmoudi, Hamid Parvin, Kim-Hung Pho, Bui Anh Tuan, "An Adaptive Particle Swarm Optimization Algorithm for Unconstrained Optimization", Complexity, vol. 2020, Article ID 2010545, 18 pages, 2020. https://doi.org/10.1155/2020/2010545
An Adaptive Particle Swarm Optimization Algorithm for Unconstrained Optimization
Abstract
Conventional optimization methods are not efficient enough to solve many naturally complicated optimization problems. Nature-inspired metaheuristic algorithms can therefore be used as a new kind of problem solver for these types of optimization problems. In this paper, an optimization algorithm is proposed that estimates the expected quality of different regions of the problem space and tunes its exploration-exploitation trade-off to the location of each individual. A novel particle swarm optimization algorithm is presented that implements conditioning learning, so that the particles are led to perform a naturally conditioned behavior on an unconditioned motive. The particles are classified into several categories over the problem space: a particle that lies in a low-diversity category tends to move towards its best personal experience, whereas a particle in a high-diversity category tends to move towards the global optimum of that category. The idea of birds' sensitivity to their flying space is also used to increase particle speed in undesirable regions, so that the particles leave those regions as soon as possible; in desirable regions, particle velocity is reduced so that the particles have more time to explore their surroundings. In the proposed algorithm, the birds' instinctive behavior is used to construct the initial population randomly or chaotically. Experiments comparing the proposed algorithm with state-of-the-art methods show that it is among the most efficient and appropriate algorithms for solving static optimization problems.
1. Introduction
Optimization is a subfield of artificial intelligence [1–10] in which a given problem is transformed into an optimization function, and a random initial solution is improved through a series of designed steps [1, 2]. Since a problem usually has more than one solution, mathematical global optimization tools can be used to find the best one [11]. In local optimization, determining the best answer requires taking several factors into account, including the solution itself, the permissible error, and problems [12–14] that lack a crisp objective, such as finding the best work of art, the most beautiful landscape, or the most pleasant piece of music. Popular stochastic algorithms, such as nature-inspired and population-based ones, emulate problem-solving strategies found in nature, including the competition of creatures whose main goal is survival, which drives them to evolve into different forms. Swarm intelligence (SI) is an artificial intelligence technique based on the collective behavior of decentralized, self-organized systems, usually consisting of a number of simple agents that interact locally with each other and with their environment. Although such algorithms have no centralized control over agent behavior, a collective behavior often emerges from the agents' local interactions.
Pavlov, a Russian physiologist, observed that dogs salivated whenever they saw someone who had fed them before, even when that person carried no food. Based on this observation, he proposed conditioning learning, in which an animal learns to associate a reward or punishment with a natural stimulus. In this paper, that theory is used to move swarms with high spatial diversity towards the global optimum and swarms with low diversity towards local optima in the problem space. In addition, to turn the proposal into an improved version of the PSO algorithm, further ideas related to the instinctive behavior of birds and their speed are incorporated.
The original PSO algorithm has several defects: first, it is weak in creating the random initial population; second, the quality of different regions of the problem space is not considered, so particle speed is not adjusted to that quality; third, PSO wastes time reaching an optimal solution by choosing a point between the local and global optima. All of these drawbacks are resolved in this work using the instinctive conditioning behavior of birds. The rest of this paper is organized as follows. Section 2 outlines the related previous work. Section 3 presents the proposed algorithm. Section 4 provides the simulation results and their consequences, and Section 5 deals with conclusions and future work.
2. Literature
Nature-inspired metaheuristic optimization algorithms fall into two groups. The first consists of single-solution-based algorithms, which gradually improve a single random solution. The second group, which is more popular in the literature and the one we focus on, consists of multi-solution-based algorithms, which offer multiple solutions that are gradually enhanced as a whole.
One of the best-known evolutionary algorithms, proposed by Holland [15], is the genetic algorithm (GA), a stochastic evolutionary optimization algorithm inspired by the evolution of creatures in nature. Its extraordinary optimization performance makes it a general-purpose optimizer. The algorithm starts with an initial population of chromosomes, potential solutions usually encoded as binary vectors; a fitness function then measures the merit of each chromosome, and a new solution set is produced [16] by selecting the best chromosomes and entering them into a mating stage where they are crossed over and mutated.
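The selection, crossover, and mutation steps described above can be sketched as a minimal binary GA. This is an illustrative sketch, not the paper's implementation; the OneMax fitness (count of 1-bits), the tournament selection scheme, and all parameter values are assumptions chosen for brevity.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      crossover_rate=0.9, mutation_rate=0.01):
    """Minimal binary GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            # Tournament selection: the fitter of two random chromosomes wins.
            p1 = max(random.sample(pop, 2), key=fitness)
            p2 = max(random.sample(pop, 2), key=fitness)
            c1, c2 = p1[:], p2[:]
            if random.random() < crossover_rate:
                cut = random.randint(1, n_bits - 1)  # one-point crossover
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for c in (c1, c2):
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        c[i] ^= 1  # bit-flip mutation
                children.append(c)
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)  # keep the best-so-far (elitism)
    return best

# OneMax: maximize the number of 1-bits; the optimum is the all-ones string.
best = genetic_algorithm(sum)
```

With these settings the GA reliably drives the population towards the all-ones chromosome, which is the role the mating stage plays in the description above.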
As in related articles, the optimization problems in this work are cast as cost function minimizations. Using the instinctive and collective behavior of flying birds, a nature-inspired optimization algorithm is provided whose mechanism is similar to that of particle swarm optimization. The conditioning learning behavior of animals, which is the basis of this article and is a type of associative learning, is expected to yield a mechanism capable of solving complex problems. The proposed algorithm offers some useful advantages: it exploits a beneficial collective behavior inspired by nature, and, in comparison with other optimization algorithms, many of its specific parameters can be set automatically without significant loss of performance. A drawback, however, is that a problem unsuitable for the original PSO is very likely also unsuitable for the proposed algorithm.
The social behavior of birds searching for food motivated a population-based general-purpose optimization algorithm proposed in 1995 [17], namely PSO, a computation-oriented stochastic intelligence optimization algorithm. Advantages such as computational efficiency, its search mechanism, a simple concept, and ease of implementation have made SI-based algorithms helpful tools in optimization applications. Each particle in PSO represents a population member with small mass, whose speed and acceleration drive it towards better behavior. A particle is thus a solution in a multidimensional space that sets its position in the search space based on the best position it has reached itself (pbest), the best position reached by the swarm (gbest) during the search, and its own speed. SI-based methods have been used in applications of the PSO algorithm [18–20], for example, to compute trajectories in a binary search space for solving the knapsack problem (KP). Some problems have also been solved by combining PSO with guided local search and concepts derived from the GA [21]. A binary PSO algorithm with a mutation operator has been proposed to solve the KP.
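The pbest/gbest mechanism described above can be sketched as the canonical PSO update. This is a generic textbook sketch, not the adaptive variant proposed in this paper; the inertia weight `w` and coefficients `c1`, `c2` are common default assumptions.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Canonical PSO: each particle tracks its pbest; the swarm tracks gbest."""
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]
    pbest_f = [f(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity pulls towards the particle's own memory and the swarm's best.
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))  # clamp to bounds
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = x[i][:], fx
    return gbest, gbest_f

# Sphere function: global minimum 0 at the origin.
best, cost = pso(lambda p: sum(t * t for t in p), dim=5, bounds=(-10, 10))
```

The adaptive algorithm of Section 3 modifies this update by scaling the velocity with a subspace-quality coefficient and by adapting the coefficients per subpopulation.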
Optimization algorithms motivated by the behavior of bees give rise to SI algorithms different from PSO; the best-known one is the artificial bee colony (ABC) optimization algorithm, a prevalent method developed from the foraging behavior of bee swarms [22]. The artificial colony in ABC contains three types of bees: workers, onlookers, and scouts. Onlooker bees wait in a dance area and choose a food source, worker bees go to food sources that have already been recognized, and scout bees perform a random search for new food sources. The location of a food source represents a possible solution to the optimization problem, and the quality of that solution corresponds to the quantity of nectar at the source. A virtual swarm of bees is assumed to move randomly in the two-dimensional search space, and the bees interact with each other whenever a target solution is found. Several other methods are derived from the original ABC method: the job-shop scheduling problem (JSSP) has been addressed by changing the onlooker behavior and using SI to convert continuous values to binary ones [23], and the combinatorial ABC (CABC) algorithm with discrete coding has been presented for the traveling salesman problem (TSP) [24].
Another algorithm, the bees algorithm (BA), introduced on a mathematical function [25], represents a candidate solution as a D-dimensional bee vector corresponding to the problem variables; the vector represents a visit to a site (food source) with a certain fitness value. In BA, the exploration-exploitation balance is maintained by scout bees, which randomly search for new sites, and worker bees, which repeatedly search neighborhoods for higher fitness values. The best bees are those with the best fitness values, and elite sites are the sites they have visited. The algorithm assigns the majority of the bees to search the neighborhoods of the best selected sites. Adjoined optimization functions, task scheduling [26], and binary data clustering [27] are examples of BA applications.
The harmony search algorithm (HSA) [28] is a metaheuristic that mimics the search for pleasing harmonies during the improvisation of jazz music. Improvisation searches for a harmony judged by aesthetic standards (a proper state), which is equivalent to an optimization process in which a specified objective function is used to search for a global solution (a proper state).
Motivated by the behavior of imperialists competing with each other to take over colonies, an evolutionary optimization technique called the imperialist competitive algorithm (ICA) has been introduced. This algorithm also begins with an initial population whose members are classified into two groups: colonies and imperialists. Colonies are assigned to imperialists according to the imperialists' power, and the power of each state is inversely related to its cost, so that more powerful imperialists dominate more colonies [29]. The interaction between imperialist powers and their colonies gradually changes the culture of the colonies until they resemble their ruling imperialist; this is called the assimilation policy, in which colonies move towards their imperialists, as practiced by imperialist countries after the nineteenth century. In recent years, many other algorithms aimed at improving well-known optimizers have been presented, such as the particle bee optimization algorithm (PBOA) [30], the novel particle swarm optimization algorithm (NPSOA) [31], the cuckoo search optimization algorithm (CSOA or CS) [32], the differential search optimization algorithm (DSOA or DSA) [33], and the bird mating optimization algorithm (BMOA or BMO) [34]. Many algorithms have also been presented to improve PSO [35] using a dynamic multipopulation approach, including the sinusoidal differential evolution optimization algorithm (SDEOA) [36], the joint operations optimization algorithm (JOOA) [37], and the dynamic multiswarm particle swarm optimizer with a cooperative algorithm (DMS-PSO-CA) [35].
Parameters and ideas from other methods can make different algorithms more efficient. For instance, CSOA has been enhanced by employing chaos parameters [38]. In 2015, the PSO algorithm was enhanced with cognitive learning mechanisms for solving an optimization problem [39]. In 2016, the ant colony optimization algorithm (ACOA) was integrated with the GA to introduce a new optimization algorithm [40]. One study used the GA to find the solution closest to the best one in nonlinear multimodal optimization problems [41]. Other recent comparable optimizers also exist [31, 42–45].
3. Adaptive PSO
The proposed method is presented in this section. First, the population is initialized and randomly partitioned into a set of subpopulations. The problem space is also divided into several virtual subspaces, each a hypercube: every dimension is partitioned into equal-size slices, and the grid of slices across all dimensions defines the subspaces. Particles then move with lower speed inside the more valuable subspaces. In addition, each subpopulation uses its own set of movement coefficients (the cognitive and social acceleration coefficients), and this set adaptively changes during optimization. Finally, the best solution produced during optimization is taken as the solution found by the proposed method, called the adaptive particle swarm optimization algorithm (APSOA).
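The hypercube partition described above amounts to cutting each dimension into equal slices and encoding a particle's cell coordinates as a single subspace index. The following sketch shows one natural way to do this; the function name and the mixed-radix encoding are our own illustration, not the paper's pseudocode.

```python
def subspace_index(position, low, high, slices):
    """Map a position to the index of its hypercube subspace.
    Each dimension is cut into `slices` equal parts, giving slices**dim cells."""
    idx = 0
    for p, lo, hi in zip(position, low, high):
        # Which slice along this dimension? Clamp so p == hi falls in the last slice.
        cell = min(int((p - lo) / (hi - lo) * slices), slices - 1)
        idx = idx * slices + cell  # mixed-radix encoding of the cell coordinates
    return idx

# A 2-D space in [0,10]^2 cut into 4 slices per dimension gives 16 subspaces.
i = subspace_index([1.0, 9.0], [0, 0], [10, 10], 4)  # cell (0, 3)
```

Every particle can thus be assigned to exactly one subspace in constant time per dimension, which is what the rating and speed-adjustment steps below rely on.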
The pseudocode of the APSOA is depicted in Algorithm 1. Its inputs are the upper-bound and lower-bound vectors of the problem space, the number of dimensions, the population size, the number of subpopulations, the number of iterations, and the objective function. First, the algorithm calls the initialization function, whose pseudocode is depicted in Algorithm 2. It takes the same inputs and returns a population, the best particle, and a sparse subspace rating array. The rating array is an associative array whose key is a string: called with a numeric string as its key, it returns the rating of the corresponding subspace, and since the array is sparse, any subspace without a stored value defaults to a rating of 1. Called with "sum" as its key, it returns the summation of the ratings of all subspaces. Each time the best particle of a subpopulation is located in a subspace, that subspace's rating is increased by one unit; likewise, each time the best particle of the whole population is located in a subspace, its rating is increased by one unit. Each individual in the population is an object containing the following fields: its position, its velocity, the position of the best place it has visited, the index of the subspace in which it is located, its fitness, and the fitness of its best memory.
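The sparse rating array with a default of 1 and a "sum" key can be sketched as a small class. The class name and the `reward` helper are our own names for illustration; the behavior (default 1, incremented when a best particle lands in a subspace, total obtainable via "sum") follows the description above.

```python
class SubspaceRating:
    """Sparse subspace rating: unset subspaces default to 1, and the key
    "sum" returns the total rating over all subspaces."""
    def __init__(self, n_subspaces):
        self.n = n_subspaces
        self.extra = {}  # only increments above the default of 1 are stored

    def __getitem__(self, key):
        if key == "sum":
            # Every subspace contributes its default of 1, plus stored increments.
            return self.n + sum(self.extra.values())
        return 1 + self.extra.get(key, 0)

    def reward(self, key):
        """Add one unit each time a (sub)population best lands in this subspace."""
        self.extra[key] = self.extra.get(key, 0) + 1

r = SubspaceRating(16)
r.reward(3)
r.reward(3)
```

Here `r[3]` returns 3 (default 1 plus two rewards), an untouched subspace such as `r[0]` returns 1, and `r["sum"]` returns 18, so the array never needs storage proportional to the number of subspaces.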


After the initialization function is called, the proposed method (Algorithm 1) iterates a loop for the given number of iterations (statement 2 in Algorithm 1). In each iteration, all population individuals are updated using the individual update function (depicted in Algorithm 3 and explained below; statement 2.1 in Algorithm 1). After all individuals are updated, the global best is updated, and the rating of the subspace where the global best is located is incremented (statements 2.2 to 2.6 in Algorithm 1). Next, the ratings of the subspaces where the local bests of the subpopulations are located are updated (statement 2.8 in Algorithm 1). Finally, the coefficients of each subpopulation are updated using the coefficient update function (depicted in Algorithm 4 and explained below; statement 2.9 in Algorithm 1).


In the individual update function, depicted in Algorithm 3, the only section that differs from the original PSO is statement 2.2, where a coefficient is computed and multiplied into the movement equation of the following statement. This coefficient speeds up the individuals in the useless subspaces and slows them down in the useful ones. The coefficient update function, depicted in Algorithm 4, adaptively changes the movement coefficients of each subpopulation.
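The qualitative effect of the coefficient in statement 2.2 can be illustrated with a simple stand-in. The paper's exact formula is not reproduced here; the two-level rule below, its threshold, and the `boost`/`damp` values are assumptions that merely reproduce the stated behavior: faster in low-rated subspaces, slower in high-rated ones.

```python
def speed_coefficient(rating, mean_rating, boost=2.0, damp=0.5):
    """Illustrative stand-in for the APSOA velocity coefficient:
    speed particles up in low-rated (undesirable) subspaces and
    slow them down in high-rated (desirable) ones."""
    if rating < mean_rating:
        return boost  # leave poor regions as quickly as possible
    return damp       # linger in good regions to exploit them

# The coefficient multiplies the velocity before the position update.
v = [1.0, -2.0]
scaled = [speed_coefficient(rating=1, mean_rating=1.5) * vi for vi in v]
```

With a below-average rating the velocity `[1.0, -2.0]` is doubled to `[2.0, -4.0]`, so the particle crosses the poor subspace sooner; an above-average rating would instead halve it.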
4. Implementation of Experiments
This section provides the simulation results of the proposed method in several parts and compares them with similar methods. The results are given in four parts; in each part, different modern methods are compared on different problems, including a real-world industrial application.
In the first three parts, the average value and standard deviation of the error f_i − f_i* are calculated to evaluate the performance of each algorithm, where f_i is the minimum value of the i-th objective function found by an algorithm in a run and f_i* is its actual optimal value.
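Computing this error statistic over independent runs is straightforward; the helper below is our own illustration, with the run minima and optimum chosen arbitrarily.

```python
import statistics

def error_stats(run_minima, f_star):
    """Mean and standard deviation of the error f_i - f_i* over independent
    runs, where f_i is the best cost found in run i and f_star the known
    optimal value of the objective."""
    errors = [f - f_star for f in run_minima]
    return statistics.mean(errors), statistics.stdev(errors)

# Three hypothetical runs of some algorithm on a function with optimum 0.
mean_err, std_err = error_stats([0.01, 0.03, 0.02], f_star=0.0)
```

A zero mean error together with a zero standard deviation, as reported for several benchmark functions later, means every run reached the optimum exactly.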
4.1. Experimental Results: CEC 2009
Here, the problems defined by the CEC 2009 benchmark [46] are listed in Table 1. Comparing the performance of the proposed method with a chaotic number generator (CNG) against its performance with a random number generator (RNG) on the objective functions of the CEC 2009 benchmark, we found that slightly better results are obtained with the CNG. We used only the logistic map [47] (as CNG) and the uniform distribution (as RNG). Therefore, although both can be used throughout this paper, all experiments here employ the logistic map [47] as the number generator.
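A chaotic number generator based on the logistic map can be sketched as follows. The map parameter is an assumption here: r = 4 is the classical fully chaotic setting, but the value used by the paper is not specified in this text, and the seed must avoid the map's fixed points (0, 0.25, 0.5, 0.75).

```python
def logistic_map(x0=0.3, r=4.0):
    """Chaotic number generator from the logistic map x_{n+1} = r*x*(1-x),
    yielding a deterministic but chaotic sequence in [0, 1]."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

gen = logistic_map()
sample = [next(gen) for _ in range(5)]  # first value: 4*0.3*0.7 = 0.84
```

Wherever the algorithm needs a uniform random draw (e.g., for initial positions), the next value of this sequence can be substituted, which is what "chaotic initialization" amounts to.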

The proposed method is compared in this part with the following algorithms: GA [15], differential evolution (DE) [48], PSO [17], BA [25], PBOA [30], NPSO [31], moderate-random-search strategy PSO (MRPSO) [49], DE with an ensemble of mutation strategies and control parameters (EPSDE) [50], cooperative coevolution inspired ABC (CCABC) [51], and the firefly algorithm (FA) [52]. Each method is run with the same parameters used in its original publication, but some shared parameters are fixed across all methods, including the population size and the number of fitness evaluations, both set in proportion to the problem size, which is always 50 here. The target objective cost value is defined relative to the optimal value of each objective function. We use the mean and std. dev. as the two reporting criteria, defined respectively as the average and the standard deviation of the best costs of the particles over different runs on all of the 26 functions shown in Table 1. Table 2 shows the detailed results of all methods on the 26 functions of Table 1 and validates them using the Friedman test with a p value of 3.12E−03. The results in Table 2 are summarized in Table 3.
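The Friedman test used to validate the tables ranks the methods on each function and checks whether the rank sums differ more than chance would allow. The pure-Python sketch below computes the Friedman chi-square statistic for a small hypothetical result table (ties are ignored for brevity; a library routine would handle them).

```python
def friedman_statistic(results):
    """Friedman chi-square statistic for k methods measured on n problems.
    results[i][j] is the cost of method j on problem i (lower is better)."""
    n, k = len(results), len(results[0])
    rank_sums = [0.0] * k
    for row in results:
        # Rank the methods on this problem: best cost gets rank 1.
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank  # ties ignored for brevity
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)

# Three hypothetical methods on five functions (rows = functions).
table = [[0.01, 0.02, 0.50],
         [0.20, 0.25, 0.90],
         [1.30, 1.10, 2.00],
         [0.05, 0.07, 0.40],
         [0.90, 1.20, 1.80]]
stat = friedman_statistic(table)  # compared against a chi-square(k-1) threshold
```

Comparing `stat` against the chi-square distribution with k − 1 degrees of freedom yields the p values (e.g., 3.12E−03 above) reported with each results table.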
 
Each cell of Table 2 reports the average of the best values found by a method over 30 independent runs on a function, together with the standard deviation and the method's rank among all methods on that function. The sign column indicates whether the proposed method's performance is superior (+), inferior (−), or equal (×) to that of each method.

The results show that, for almost all functions in the CEC 2009 benchmark, the proposed method converges to the optimum in terms of the particles' average cost. Since the proposed algorithm fully converges to the optimal point, the particles have zero standard deviation at the end of the runs, which means the proposed method matches the best performance achieved by any method. The results also show that on some functions, such as F8, F12, and F18, only the proposed method reaches the optimal point, which demonstrates its superiority. The Friedman test confirms that these results are statistically significant and not obtained by chance.
4.2. Experimental Results: CEC 2005
In this part, the proposed method is compared with the following algorithms: the modified bat algorithm hybridized with differential evolution (MBADE) [10], the rain optimization algorithm (ROA) [42], CS [32], the teaching-learning-based optimization (TLBO) algorithm [53], DSA [33], and the BMO algorithm [34]. The comparison is based on the 25 problems defined by the CEC 2005 benchmark [54], summarized in Table 4. Each method is run with the same parameters used in its original publication, but some shared parameters are fixed across all methods, including the population size and the number of fitness evaluations, both set in proportion to the problem size, which is always 50 here. The target objective cost value is defined relative to the optimal value of each objective function. We use the mean and std. dev. as the two reporting criteria, defined respectively as the average and the standard deviation of the best costs of the particles over different runs on all of the 25 functions shown in Table 4. Table 5 shows the detailed results of all methods on the 25 functions of Table 4 and validates them using the Friedman test with a p value of 7.92E−05. The results in Table 5 are summarized in Table 6.
 
U means unimodal function, M means multimodal function, E means extended function, and HC means hybrid composition function. 
 
Each cell of Table 5 reports the average of the best values found by a method over 30 independent runs on a function, together with the standard deviation and the method's rank among all methods on that function. The sign column indicates whether the proposed method's performance is superior (+), inferior (−), or equal (×) to that of each method.

According to Table 5, the proposed algorithm achieves better quality than the other methods on almost all functions; it is always among the top three. Table 5 also shows that the proposed method has the best performance on 16 functions.
The proposed approach reaches the zero-error global optimum, tying with the CS algorithm on F1 and with DSA on F9. There are only 9 functions on which the proposed method does not achieve the best performance. We can also conclude that the proposed method maintains a satisfactory diversity in the problem space. The statistical test shows that the results of the proposed method are the best among the compared methods.
Finally, the complexity of the proposed algorithm, computed according to the guidelines specified by CEC 2005 [54], is shown in Table 7.

All the mentioned methods are tested with different numbers of fitness evaluations, and the results are shown in Figure 1, with the problem dimension fixed at 30. Figure 1 shows that, regardless of the number of fitness evaluations, the proposed algorithm converges to a better solution, and it also needs fewer fitness evaluations than the other methods to reach a solution of the same quality on most of the fitness functions. This test confirms that the proposed algorithm is among the best methods for all evaluation budgets and achieves the best cost among them. Moreover, increasing the number of fitness evaluations yields better results.
In Figure 2, the results of the proposed method are presented in terms of several criteria for different dimension sizes on the first six objective functions of the CEC 2005 benchmark; the population size in this experiment is 20. Figure 2 includes the best and mean costs of the population members, their standard deviation, and the execution time when applying APSOA to benchmark functions F1–F6 for different dimension sizes. The results in Figure 2 show that increasing the dimension size makes the problem more complicated, which in turn raises the best cost value.
4.3. Experimental Results: CEC 2010 and RealWorld Problems
In this part, we compare the proposed method with other recently proposed metaheuristic methods. Here, we use the benchmark functions of the CEC 2010 test series [10] with a population size of 40. The 20 well-known objective functions of the CEC 2010 benchmark [10] are used as F1–F20 in this section (F1–F3 are separable, F4–F8 are single-group nonseparable, F9–F13 and F14–F18 are multi-group nonseparable, and finally F19–F20 are fully nonseparable). The problem dimension is 1,000 throughout this section, and the number of variables in each nonseparable subcomponent is 50. In addition, a set of 4 real-world problems is used as F21–F24: the first two, F21 and F22, are problem number 1 and problem number 7 of CEC 2011 [6], respectively, while F23 and F24 are the linear equation system problem [7] and the polynomial fitting problem [8]. Note that 51 independent runs are performed and the results are averaged over them. The comparison includes the following methods: SDEOA [36], JOOA [37], diversity neighborhood search-enhanced particle swarm optimization (DNSPSO) [9], and DMS-PSO-CA [35].
According to the results presented in Table 8, APSOA is always among the top three methods, except on function F13. For half of the problems, it achieves the best result at the end of the specified number of fitness function evaluations. A summary of the results of Table 8 is shown in Table 9, according to which the proposed APSOA exhibits the best performance among all methods. The results in Table 8 are validated by the Friedman test with a p value of 1.92E−02.
 
Each cell of Table 8 reports the average of the best values found by a method over 30 independent runs on a function, together with the standard deviation and the method's rank among all methods on that function. The sign column indicates whether the proposed method's performance is superior (+), inferior (−), or equal (×) to that of each method.

4.4. Experimental Results: A RealWorld Problem
Artificial intelligence is an appropriate candidate for solving many real-world electrical problems [55–71]. In this part, we solve a combined heat and power economic dispatch problem [6, 71] using the proposed method and other optimization algorithms. A particle is a vector of size 9 containing the power outputs of four power-only units, the power and heat outputs of two cogeneration units, and the heat output of one heat-only unit. The cost function of the problem is the total production cost of these units, subject to the power and heat balance constraints, where each unit has its own cost function.
Two further conditions for the two cogeneration power-heat units must also be met: as depicted in Figure 3, the power and heat outputs of cogeneration units 1 and 2 must lie within their valid operating regions.
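To make the dispatch formulation concrete, the sketch below evaluates a penalized objective for a 9-element particle. The quadratic cost coefficients, the demand values, and the penalty scheme are all hypothetical illustrations; the paper's actual unit cost functions and the valid-region constraints of Figure 3 are not reproduced here.

```python
def chped_cost(p, power_demand, heat_demand, penalty=1e6):
    """Sketch of a penalized combined heat and power economic dispatch
    objective. p = [4 power-only outputs, 2 cogeneration power outputs,
    2 cogeneration heat outputs, 1 heat-only output]. All coefficients
    below are assumed, not taken from the paper."""
    pow_only, cpow, cheat, heat_only = p[:4], p[4:6], p[6:8], p[8]
    # Hypothetical quadratic fuel costs per unit type.
    cost = sum(0.01 * x * x + 2.0 * x + 25.0 for x in pow_only)
    cost += sum(0.03 * x * x + 1.5 * x + 50.0 for x in cpow)
    cost += sum(0.02 * x * x + 0.5 * x for x in cheat)
    cost += 0.05 * heat_only * heat_only + 2.0 * heat_only
    # Balance constraints handled as quadratic penalties.
    cost += penalty * (sum(pow_only) + sum(cpow) - power_demand) ** 2
    cost += penalty * (sum(cheat) + heat_only - heat_demand) ** 2
    return cost

candidate = [50, 50, 50, 50, 25, 25, 20, 20, 10]
balanced = chped_cost(candidate, power_demand=250, heat_demand=50)
unbalanced = chped_cost(candidate, power_demand=300, heat_demand=50)
```

A candidate that meets both demands incurs only its fuel cost, while any imbalance is penalized heavily, so a swarm optimizer such as APSOA is steered towards feasible dispatches.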
The population size of each algorithm is set according to the parameters defined in its corresponding paper. For fairness, each method is compared through the best solution in its population after the same number of fitness evaluations, together with the corresponding cost value. The solutions provided by the different optimization algorithms for this problem are shown in Table 10, from which we can see that the proposed method has the best performance.

5. Conclusions and Future Works
Nature-inspired social and solitary behaviors have motivated numerous algorithms in different scientific studies, and these algorithms are usually successful and efficient. In this paper, the instinctive behaviors of birds are used to provide a more accurate, more target-oriented, and more controlled algorithm than the basic swarm algorithms. Based on classical conditioning learning, a model is presented in which a normal task driven by a natural stimulus is implemented for each particle in the search space. In this model, a particle in a low-diversity category moves towards the local optimal point, while a particle in a high-diversity category moves towards the global optimum of its category.
An initial population is also generated from elite particles, based on the assumption that birds without sufficient energy will encounter flight problems. Another goal of the proposed algorithm is to give particles more exploitation time in valuable spaces, which motivated us to reduce their velocity there by modifying the velocity equation, and vice versa. The simulation results, provided in four parts, show that the proposed method is an efficient and reliable algorithm on static functions compared with other algorithms. We conclude that our method finds more accurate solutions in a simpler and faster way and also performs better in industrial applications than prior methods.
For future work, building on what this paper accomplished, we propose the following ideas: applying chaos theory to the initial population, studying quantum particles in the above algorithm, and applying the algorithm of this paper to dynamic optimization problems.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Copyright
Copyright © 2020 Feng Qian et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.