Modelling and Simulation in Engineering
Volume 2018, Article ID 1851275, 19 pages
https://doi.org/10.1155/2018/1851275
Research Article

Structured Clanning-Based Ensemble Optimization Algorithm: A Novel Approach for Solving Complex Numerical Problems

1Malaviya National Institute of Technology, Jaipur, India
2Swami Keshvanand Institute of Technology, Management & Gramothan, Jaipur, India
3Indian Institute of Technology Delhi, New Delhi, India

Correspondence should be addressed to Avinash Sharma; avinashmnit30@gmail.com

Received 18 August 2018; Revised 10 October 2018; Accepted 4 November 2018; Published 9 December 2018

Academic Editor: Azah Mohamed

Copyright © 2018 Avinash Sharma et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In this paper, a novel swarm intelligence-based ensemble metaheuristic optimization algorithm, called Structured Clanning-based Ensemble Optimization, is proposed for solving complex numerical optimization problems. The proposed algorithm is inspired by the complex and diversified behaviour present within the fission-fusion-based social structure of elephant society. An elephant population can consist of various groups, with relationships between individuals ranging from mother-child bonds and bond groups to independent males and strangers. The algorithm models this individualistic behaviour to formulate an ensemble-based optimization algorithm. To test the efficiency and utility of the proposed algorithm, various benchmark functions with different geometric properties are used, and the algorithm's performance on these benchmarks is compared with various state-of-the-art optimization algorithms. The experiments clearly showcase the success of the proposed algorithm in optimizing the benchmark functions to better values.

1. Introduction

With the ever increasing demand for new and better utility-driven technologies, the optimization challenges surrounding them are becoming more and more complex. These optimization problems were traditionally solved using classical deterministic methods [1], which were often quite efficient in finding solutions. But the ever increasing complexity of these problems has made the classical techniques unreliable for many real-world engineering problems. Here, various metaheuristic optimization algorithms [1] prove to be quite successful. Often inspired by nature, these algorithms make use of various stochastic [2] techniques for finding the solution. Metaheuristics are higher-level heuristics that try to find a partial solution, called a heuristic, representing a sufficiently good solution to the optimization problem [3–6]. The presence of randomness drives the algorithms towards promising regions of the search space. Simplicity of use, high reliability, and high flexibility have made these algorithms quite popular in the last few years [7].

Among these algorithms, many are based on multiagent paradigms. These include various evolutionary algorithms [8] inspired by the evolutionary mechanisms found in nature. These algorithms [9–15] use mechanisms like crossover, mutation, and selection for solving optimization problems. Among the most popular metaheuristics, the genetic algorithm (GA) [9] was proposed by Holland in the 1970s. It encodes information in a genetic representation as bits and performs evolutionary processes like selection, mutation, and crossover on it. Other population-based algorithms utilize swarm intelligence-based mechanisms [16–20] to solve the optimization problem. These algorithms make use of information distributed among the individual participants of the swarm. The lack of any central control structure makes it vital for the distributed information to be shared amongst the individuals of the swarm. The interaction between these individuals leads to the global intelligent behaviour that plays an important role in the performance of these algorithms. Thus, swarm intelligence can be seen as the accumulation of various agents, generally modelled on social creatures like birds, ants, bees, and fish, whose intelligent behaviour at the individual level replicates a complicated and collaborative behaviour at the social level. Researchers around the world have analyzed such behaviours and developed algorithms replicating them that can be used to solve optimization problems in scientific and engineering domains.

Most of these algorithms focus on one specific strategy to explore the search space for the solution. This makes them less effective on problems they were not designed to solve. Due to the presence of multiple solution update strategies, ensemble-based approaches are generally more robust when searching a complex and unknown search space. The proposed algorithm incorporates an ensemble of solution update strategies to efficiently evaluate the problem search space.

Formulating any swarm intelligence-based optimization algorithm can be a tricky task. For any stochastic optimization method to be successful, it must maintain a proper balance between its exploratory and exploitative nature. It also requires understanding whether a group exhibits features of self-organization. Self-organization is quite common in decentralized societies like fish schools and insect colonies. These systems utilize local information individually to achieve a common complex task. Here, self-organization takes the form of local interactions between the lower-level entities of the system and enables coordination in the working of the group. The process starts with some initial randomness and is not controlled by any external or central authority; it takes place in a decentralized manner such that individual entities work on the basis of local information.

This paper proposes a novel swarm intelligence-based metaheuristic optimization algorithm, “Structured Clanning-based Ensemble Optimization (SCEO),” inspired by the social organization of the elephant clan. The algorithm utilizes the individual diversity found within the complex social structure of the elephant population to formulate an ensemble-based optimization algorithm. So far, only a small amount of work has been done on developing elephant-social-structure-inspired algorithms. In 2016, Wang et al. [21] proposed the elephant herding optimization (EHO) algorithm inspired by the herding behaviour of elephant groups. In EHO, only one type of male elephant (apart from the bounded clan elephants) is simulated; these males live in isolation from the parent clan but stay in touch with it through low-frequency vibrations. In comparison, the proposed SCEO algorithm simulates a clan whose elephants possess multiple types of behaviour, with varying relational strength to the clan. Apart from the various types of bounded elephants (child, female, bounded, and semibounded), some male elephants are unbounded and can move independently of any particular clan. This enables the proposed algorithm to use an ensemble of solution search strategies, making it more robust to different types of optimization problems.

The rest of the paper is organized as follows: Section 2 gives an overview of the social structure of the elephant population, followed by the explanation of the proposed algorithm in Section 3. Section 4 contains an analysis of the impact of various parameters on the performance of the algorithm. Section 5 compares the proposed algorithm with various state-of-the-art optimization algorithms. Finally, the conclusion of this study is given in Section 6.

2. Social Structure of the Elephants

Belonging to the family Elephantidae and the order Proboscidea, elephants are among the largest land animals on Earth. Studies have shown them to be among the most intelligent organisms, and they are well known to have a very complex social structure. Their social order is highly fission-fusion based, with relationships between individuals ranging from mother-child bonds and bond groups to independent males and strangers [22]. Moreover, the degree of bonding can vary considerably, and the behaviour of individuals and their relationships with others vary widely over time. As with many sexually dimorphic mammals, the roles and behavioural characteristics of male and female elephants are quite different.

An elephant society is organized as groups of families bound together in the form of a clan. Each clan is composed of the families that share the same foraging area; a clan is usually made up of several bond groups and numerous families, so that several hundred elephants may make up a clan. Each family is led by a female elephant called the matriarch. These female leaders are generally the largest and oldest females in the group and largely shape the structure and fortune of their group. The family unit can vary widely in size, from 2 to 50 elephants. Apart from the matriarch, each family consists of a mix of other individuals, including child elephants, younger female elephants, and younger dependent male elephants.

In contrast to female elephants, male elephants can show different kinds of behaviour. Young male elephants are bonded to their native family, and at 9 to 18 years of age they leave the group as independent elephants. These independent males can form small groups with other males, but such groups are generally loosely bound. During the mating period, these independent elephants join different family groups, moving from one group to another. They have no preference for specific family units and will move randomly between groupings in search of reproductively receptive females.

This diversity in the individual behaviour and fission-fusion-based social structure is the main inspiration for the proposed algorithm Structured Clanning-based Ensemble Optimization (SCEO).

3. Elephant Social Order-Based Ensemble Optimization Algorithm

To properly mimic the fission-fusion-based social structure of the elephant clan, the population is divided into 2 or more groups. Further, the population as a whole consists of different types of agents, each with its own individualistic properties. These agents are based on the behaviour of female leaders, child elephants, other female elephants, bounded male elephants (young males), semibounded male elephants, and unbounded male elephants (fully independent males) (Figure 1).

Figure 1: Structured clanning-based ensemble optimization algorithm.

The agents have different behaviours associated with them, modelled in the form of the properties and position update strategies described below.

3.1. Global Female Leader (GFL)

The global female leader is the local female leader that occupies the best position found by the population so far. This leader retains its position in the next iteration and is used to guide the other local female leaders towards possibly better positions in the search space.

3.2. Local Female Leader (LFL)

The local female leader occupies the best position found by her family so far. All the other agents of the family move relative to her; thus, this female acts as a guide to all other members of the family. The position of the local leader is updated relative to the global female leader (GFL), which helps maintain interaction between the different families of the swarm. The position of the local female leader is updated as per the following equation:

The local female leader belongs to a particular family unit, and j denotes the dimension of the agent's position being updated. The step is scaled by a randomly generated number between 0 and 1 and by a weight coefficient, taken between 0 and 2, that controls the influence of the global female leader on the local female leader. Taking the coefficient to be 2 allows the local female leader (LFL) to explore the hypervolume extending up to twice the per-dimension distance between the LFL and the global female leader (GFL). This limits the net movement of the LFL to between 0 and 2 times its distance from the GFL in each direction, as can also be seen in Figure 2. In vector form, the position update equation for the local female leader can be formulated as given in Equation (2), mapping the old position of the agent to its new position. The matrix operation is a Boolean comparison of two vectors, resulting in 1 at indexes where the condition is met and 0 where it is not:

Figure 2: Movement range for the local female leader.

The numerical example in Equation (3) shows the step-by-step calculation of the update equation for one trial on a 5-dimensional sphere function.


Not all dimensions of the position are changed in an update. Whether a dimension is modified is decided by the perturbation rate, whose value is taken between 0 and 1, with larger values causing more dimensions to be affected. The perturbation rate in the SCEO algorithm introduces a mutation factor into the position update of the LFL agents, similar in principle to the perturbation factors used in evolutionary algorithms like differential evolution. For each dimension, a random number between 0 and 1 is generated and compared with the perturbation rate; if the random number exceeds it, that dimension of the LFL is left unchanged. Thus, the perturbation rate can be interpreted as the average fraction of dimensions that are mutated in the direction of the GFL.
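Since the typeset form of Equations (1)–(3) did not survive in this version, the following Python sketch gives one plausible reading of the LFL update as described in the prose. The function name and the symbols `w_g` (leader-influence weight) and `pr` (perturbation rate) are illustrative assumptions, not the authors' published notation.

```python
import random

def update_lfl(lfl, gfl, w_g=2.0, pr=0.7, rng=random):
    """Move a local female leader (LFL) towards the global female leader (GFL).

    For each dimension j, with probability pr (the perturbation rate) the
    LFL takes a step of r * w_g * (gfl[j] - lfl[j]) with r ~ U(0, 1); with
    w_g <= 2 the move is bounded by twice the per-dimension LFL-GFL distance.
    """
    new = list(lfl)
    for j in range(len(lfl)):
        if rng.random() < pr:                  # dimension selected for mutation
            r = rng.random()                   # U(0, 1) step scale
            new[j] = lfl[j] + r * w_g * (gfl[j] - lfl[j])
    return new
```

Under this reading, each updated coordinate stays within the interval from lfl[j] to lfl[j] + 2(gfl[j] − lfl[j]), matching the movement range described for Figure 2.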

3.3. Other Permanently Bounded Agents

These include agents inspired by the behaviour of child elephants, other female elephants, and young male elephants, which are permanently bounded to their native family unit. These agents move in a fixed proximity of their group's local female leader. The child agents move closest to the local female leader, forming the innermost family circle. They are followed by the female agents, which still stay in closer proximity to the local leader than the young male agents. These agents change their position as per the following equation:

In vector form, the position update equation for bounded agents can be formulated as given in Equation (5). Equation (6) shows a numerical calculation example of the solution update for a bounded (child) agent. The α value in the example is taken to be 1, representing the very first iteration of the search:

Here, the updated agent is a member of the swarm bounded to its group. The move is scaled by the upper and lower bounds of the search space, by a randomly generated number between −1 and 1, and by a weight coefficient defining the proximity region around the local leader within which the agent can move. This weight is taken to be lowest for child agents and highest for young male agents; a low value aids exploitation of the search region. The factor α controls the balance between exploration and exploitation: its value is linearly decreased from near 1 at the start to near 0 at the end of the run. A high value at the start helps exploration, since agents can cover a larger area around the local female leader, while a low value at the end shrinks the movement region and thus aids exploitation of the search space near the female leaders (Algorithm 1).
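As Equations (4)–(6) are likewise missing from this version, the sketch below only illustrates the described behaviour: a bounded agent lands in a proximity region around its local leader whose size scales with the search range, an agent-type weight `w`, and the linearly decaying α. The names and the precise step form are assumptions based on the prose.

```python
import random

def update_bounded(lfl, lb, ub, w, alpha, rng=random):
    """Place a bounded agent (child/female/young male) near its local leader.

    Each coordinate is offset from the leader by r * alpha * w * (ub - lb)
    with r ~ U(-1, 1); a small w (child agents) keeps the agent close to the
    leader, a large w (young males) lets it roam further. The result is
    clipped back into the search bounds.
    """
    new = []
    for j in range(len(lfl)):
        r = rng.uniform(-1.0, 1.0)
        x = lfl[j] + r * alpha * w * (ub[j] - lb[j])
        new.append(min(max(x, lb[j]), ub[j]))  # keep inside the search space
    return new

def alpha_schedule(t, t_max):
    """Linearly decay alpha from 1 at the first iteration towards 0 at the last."""
    return 1.0 - t / t_max
```

With α = 0 (end of the run) the agent collapses onto its local leader, which is exactly the exploitative behaviour the text describes.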

Algorithm 1
3.4. Semibounded Male Agents

These male agents are not bounded to a single group and instead move from one group to another randomly. The movement of these agents in the search space is implemented using the following equation:

The position update equation in vector form for semibounded agents can be formulated as given in Equation (8). Equation (9) shows a numerical calculation example of the solution update for a semibounded agent; in the example, both weight coefficients are taken to be 1.33:

Here, the updated agent is a member of the swarm, moving under the influence of the local female leader of a randomly selected family and of the best position the agent itself has found so far. Two random numbers between 0 and 1 scale these two influences, and two weight coefficients control the pull of the female leader and of the agent's best position on its movement.
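The description above resembles a PSO-style velocity-free step towards two attractors. The following sketch is one plausible reading, with the weight names `c1`/`c2` (set to 1.33 as in the paper's numerical example) assumed rather than taken from the lost equation.

```python
import random

def update_semibounded(x, pbest, leaders, c1=1.33, c2=1.33, rng=random):
    """Move a semibounded male agent towards a randomly chosen family's
    leader and towards its own best-so-far position.

    `leaders` holds the local female leader positions, one per family; a
    family is picked uniformly at random on every update, mimicking the
    agent wandering between groups.
    """
    lfl = rng.choice(leaders)                  # random family to visit
    new = []
    for j in range(len(x)):
        r1, r2 = rng.random(), rng.random()
        step = r1 * c1 * (lfl[j] - x[j]) + r2 * c2 * (pbest[j] - x[j])
        new.append(x[j] + step)
    return new
```

If the agent already sits on both attractors, the step vanishes and the position is unchanged, as one would expect of such an update.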

3.5. Unbounded Male Agents

These are independent agents that are not bound to any particular family and move randomly in the search space. This is formulated by randomly selecting three agents from the swarm and using their positions to change the position of the unbounded agent. The position update equation is as follows:

The position update equation in vector form for unbounded agents can be formulated as given in Equation (11). Equation (12) shows a numerical calculation example of the solution update for an unbounded agent; in the example, the weight is taken as 0.5 and the perturbation rate as 0.6:

Three randomly selected agents from the swarm provide the base and difference vectors for the update. Whether a given dimension of the position is updated is decided by the perturbation rate, with higher values causing more dimensions to be modified.
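This three-agent, perturbation-gated update reads like a DE/rand/1 mutation. The sketch below assumes that form, with the scale factor `f` set to 0.5 and the perturbation rate to 0.6 as in the paper's numerical example; since the typeset equation is lost, this is a reconstruction, not the authors' exact formula.

```python
import random

def update_unbounded(x, swarm, f=0.5, pr=0.6, rng=random):
    """DE/rand/1-style move for an unbounded male agent.

    Three distinct agents a, b, c are drawn from the swarm; each dimension
    of x is replaced by a[j] + f * (b[j] - c[j]) with probability pr.
    """
    a, b, c = rng.sample(swarm, 3)             # three distinct random agents
    new = list(x)
    for j in range(len(x)):
        if rng.random() < pr:                  # perturbation-rate gate
            new[j] = a[j] + f * (b[j] - c[j])
    return new
```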

The overall SCEO algorithm can be implemented as in Algorithm 1. Figure 3 shows the overall flowchart of the proposed SCEO algorithm.

Figure 3: Flowchart of the SCEO algorithm.
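Algorithm 1 itself is not reproduced in this version, so the following is a deliberately simplified, self-contained sketch of the overall loop: it keeps only the LFL and bounded-agent phases (the semibounded and unbounded phases are omitted for brevity), uses greedy acceptance, and tracks an elitist global best. All constants and names are illustrative assumptions.

```python
import random

def sceo_minimize(f, dim, lb, ub, n_groups=2, n_per_group=10,
                  iters=200, pr=0.7, seed=0):
    """Minimal SCEO-style loop sketched from the paper's description.

    Each group keeps a local female leader (its best member); the best
    leader overall is the global female leader. All non-leader members are
    treated here uniformly as bounded agents.
    """
    rng = random.Random(seed)
    groups = [[[rng.uniform(lb, ub) for _ in range(dim)]
               for _ in range(n_per_group)] for _ in range(n_groups)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        alpha = 1.0 - t / iters                # exploration -> exploitation
        leaders = [min(g, key=f) for g in groups]
        gfl = min(leaders, key=f)
        if f(gfl) < best_f:                    # elitist global best
            best_x, best_f = list(gfl), f(gfl)
        for g, lfl in zip(groups, leaders):
            for i, x in enumerate(g):
                if x is lfl:                   # LFL: step towards the GFL
                    cand = [xj + rng.random() * 2.0 * (gj - xj)
                            if rng.random() < pr else xj
                            for xj, gj in zip(x, gfl)]
                else:                          # bounded agent: orbit the LFL
                    cand = [min(max(lj + rng.uniform(-1, 1) * alpha *
                                    0.1 * (ub - lb), lb), ub)
                            for lj in lfl]
                if f(cand) < f(x):             # greedy acceptance
                    g[i] = cand
    return best_x, best_f
```

Because acceptance is greedy and the global best is elitist, the returned fitness can never be worse than the best initial sample, which makes the skeleton easy to sanity-check on the sphere function.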

4. Impact of Parameters

The performance of every optimization algorithm is largely affected by the choice of its hyperparameter values. The proposed algorithm has 8 such hyperparameters: the number of groups, the number of child agents per group, the number of female agents per group, the number of bounded agents per group, the number of semibounded agents, the number of unbounded agents, and the two perturbation rates.

To find a proper configuration of the hyperparameters, a tuning strategy called F-Race [23] is used. F-Race is a type of racing algorithm that utilizes the Friedman test [24] to evaluate a set of configurations and eliminate the poor ones based on their performance. This leads to efficient allocation of computational resources, as the promising configurations are tested more. The proposed algorithm was tested with 4374 different configurations covering the possible parameter values shown in Table 1.

Table 1: SCEO checked configuration for hyperparameters.

For F-Race, the number of initial races r was taken as 5 and the significance level α as 0.05. The maximum number of executions for the tuning was set to 60,000, and the maximum number of function evaluations per run to 300,000. For evaluation, 8 different functions (as listed in Table 2) were taken. The configurations were evaluated using a vertical approach based on the mean fitness with a precision of up to 8 decimals. For the first 40,000 executions, one subset of four functions was used for evaluation; after that, the remaining four functions were used. In this way, both the learning and testing sets contained functions with similar characteristics with regard to modality. The resulting tuned values are shown in Table 1.

Table 2: Set of functions for F-Race-based SCEO tuning.
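F-Race interleaves evaluation of candidate configurations with statistical elimination. The sketch below is a deliberately simplified variant: instead of the full Friedman test with post-hoc comparisons used in [23], it ranks the surviving configurations on every instance seen so far and drops any whose mean rank trails the best by more than a fixed margin. The margin, the minimum-instance rule, and all names are illustrative, not part of the original procedure.

```python
def simple_race(configs, evaluate, instances, margin=1.5, min_instances=5):
    """Race `configs`: after each instance, rank survivors by cost and
    eliminate those whose mean rank trails the best by more than `margin`.

    `evaluate(cfg, inst)` returns a cost (lower is better). Elimination
    only starts after `min_instances` instances, mirroring F-Race's
    initial no-elimination phase. Ties are broken arbitrarily.
    """
    alive = list(configs)
    costs = {id(c): [] for c in configs}       # per-config cost history
    for k, inst in enumerate(instances, start=1):
        for c in alive:
            costs[id(c)].append(evaluate(c, inst))
        if k < min_instances:
            continue
        mean_rank = {id(c): 0.0 for c in alive}
        for i in range(k):                     # Friedman-style per-instance ranks
            ordered = sorted(alive, key=lambda c: costs[id(c)][i])
            for r, c in enumerate(ordered, start=1):
                mean_rank[id(c)] += r / k
        best = min(mean_rank.values())
        alive = [c for c in alive if mean_rank[id(c)] - best <= margin]
    return alive
```

The payoff is the same as in F-Race: clearly poor configurations stop consuming evaluations early, so the budget concentrates on the promising ones.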

5. Experimental Setup

This section benchmarks the performance of the proposed SCEO algorithm against other state-of-the-art algorithms on various benchmark functions taken from [25–28]. The benchmark functions in Tables 3–5 provide a wide range of optimization benchmarks with varying degrees of complexity in terms of shift, rotation, and modality. The benchmark function tables show the number of dimensions, range, optimal value, and modality of the functions along with their mathematical expressions. Further, to increase the benchmark difficulty, the proposed algorithm is tested on the full CEC 2014 [28] test set and compared against some relatively new algorithms.

Table 3: Benchmark functions.
Table 4: Benchmark functions.
Table 5: Benchmark functions.
5.1. Evaluation on Benchmark Functions

For each function, the optimization algorithm is run for 10,000 function evaluations per dimension of the benchmark function. Each function is tested 51 times, and the mean, standard deviation, and best and worst of these runs are used to evaluate the performance of the compared algorithms. Any error smaller than the chosen tolerance is treated as zero error. The proposed algorithm (SCEO) has been compared with other state-of-the-art optimization algorithms: FEP [15], ABC [29], GWO [7], and GSA [30]. The comparison results are tabulated in Tables 6–9. The parameters of these algorithms were tuned using sensitivity analysis [31]. The hyperparameter settings for the compared algorithms are shown in Table 10.

Table 6: Comparison with FEP [15], ABC [29], GWO [7], and GSA [30].
Table 7: Comparison with FEP [15], ABC [29], GWO [7], and GSA [30].
Table 8: Comparison with PSO [17], FEP [15], DE [14], ABC [29], GWO [7], and GSA [30].
Table 9: Comparison with FEP [15], ABC [29], GWO [7], and GSA [30].
Table 10: Hyperparameter setting for compared algorithms.
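A minimal harness for the evaluation protocol above might look as follows. The tolerance constant `TOL = 1e-8` is an assumption (the exact value was lost in this version, though it is consistent with the 8-decimal precision used during tuning).

```python
import statistics

TOL = 1e-8  # assumed error tolerance; the exact printed value is missing

def summarize_runs(errors, tol=TOL):
    """Summarize the independent runs (51 in the paper) of an optimizer.

    Errors below `tol` are clamped to zero before computing the statistics,
    matching the convention of treating sub-tolerance errors as exact optima.
    """
    clamped = [0.0 if e < tol else e for e in errors]
    return {
        "mean": statistics.mean(clamped),
        "std": statistics.stdev(clamped) if len(clamped) > 1 else 0.0,
        "best": min(clamped),
        "worst": max(clamped),
    }
```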

Further comparison of these algorithms has been done using parametric and nonparametric tests. The t-value and h-value of each algorithm pair are calculated using a two-tailed t-test (a parametric test with 5% significance level and 100 degrees of freedom) and tabulated in Tables 11 and 12. In Tables 11 and 12, a negative t-value indicates that the SCEO algorithm is better than the compared algorithm, with a higher magnitude indicating a greater statistical difference from the compared algorithm. Further, the Shapiro–Wilk test [32, 33] was conducted and tabulated in Table 13. This tests the null hypothesis that the result data come from a normal distribution. An h-value of 1 indicates that the benchmark performance data do not come from a normal distribution, in which case the parametric test is not sufficient to compare the performance of the algorithms. Thus, to properly compare the algorithms, a nonparametric test in the form of the Wilcoxon rank sum test [34, 35] is conducted and tabulated in Table 14.

Table 11: t-value and h-value comparison with FEP [15], ABC [29], GWO [7], and GSA [30].
Table 12: t-test against FEP [15], ABC [29], GWO [7], and GSA [30].
Table 13: Shapiro–Wilk test for SCEO, FEP [15], ABC [29], GWO [7], and GSA [30] samples.
Table 14: Wilcoxon test against FEP [15], ABC [29], GWO [7], and GSA [30].
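For reference, the Wilcoxon rank-sum statistic used above can be computed from scratch as follows. This is a generic textbook implementation (normal approximation, average ranks for ties, no tie correction to the variance), not the authors' code.

```python
import math

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation.

    Returns (z, p); a large |z| (small p) indicates that the two samples
    differ in location, making this a nonparametric alternative to the
    t-test when the Shapiro-Wilk test rejects normality.
    """
    pooled = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1                             # extend over a tie group
        avg = (i + j) / 2 + 1                  # average 1-based rank
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg
        i = j + 1
    n1, n2 = len(a), len(b)
    w = sum(ranks[:n1])                        # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2                # mean of w under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    return z, math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
```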

It is clearly seen from Tables 6–9, 11, and 12 that SCEO easily outperformed the other state-of-the-art algorithms on most of the tested benchmark functions. This is indicated by the negative t-values with h-values of 1, signifying better performance of SCEO in comparison to these algorithms. As evident from the t-values and h-values in Tables 11 and 12, of the 30 tested benchmark functions, SCEO outperformed FEP on 25, ABC on 14 (ABC outperformed SCEO on 1 function), GWO on 16, and GSA on 21. On the other functions, the performance of the algorithms was comparable, with no significant statistical difference in results. The Shapiro–Wilk test (Table 13) shows that the parametric test is sufficient for performance comparison of the tested algorithms, as most of the h-values in the table are 0. Similar results can be seen with the Wilcoxon test (Table 14), which shows that SCEO outperformed FEP on 24, ABC on 13, GWO on 15, and GSA on 21 functions.

5.2. Evaluation on CEC 2014 Benchmark Functions

In order to test the efficiency and reliability of the algorithm, it has been evaluated on the CEC 2014 [28] test set against some newly proposed algorithms: L-SHADE [36], MVMO [37, 38], jDE [39], CETMS [40], FWA-DM [41], TSC-PSO [42], and HCA-SA [43]. L-SHADE [36], proposed by Tanabe et al., is a modification of the SHADE [44] algorithm incorporating a linear population size reduction (LPSR) strategy. MVMO [37], proposed by Erlich et al., makes use of a mechanism that adopts a single parent-offspring pair approach along a normalized search space. The jDE [39] algorithm introduces self-adaptivity into the vanilla DE [14] algorithm. The CETMS [40] algorithm uses various features of tissue membrane systems, and FWA-DM [41] applies a differential mutation strategy to the vanilla fireworks algorithm [45]. TSC-PSO [42] applies a spatial correlation-based movement strategy to the PSO [17] algorithm, and HCA-SA [43] introduces self-adaptivity into the vanilla cuckoo search algorithm (CSA) [46].

Tables 15 and 16 compare the proposed algorithm with the L-SHADE [36], MVMO [37, 38], jDE [39], CETMS [40], FWA-DM [41], TSC-PSO [42], and HCA-SA [43] algorithms on the CEC 2014 [28] test set. Comparison is done in terms of the mean and standard deviation of 51 runs of each algorithm, as per the CEC 2014 [28] guidelines. Clearly, the SCEO algorithm offers a comparable performance against all the newly proposed algorithms. To analyze the results, Shapiro–Wilk and Wilcoxon tests were performed, as shown in Tables 17 and 18, respectively. As most of the h-values returned by the Shapiro–Wilk test are 1, it is possible that the samples do not represent a normal distribution, making parametric tests like the t-test unsuitable for comparing the algorithms. Thus, the Wilcoxon test was performed and tabulated in Table 18. It shows that SCEO outperformed L-SHADE on 7 functions (L-SHADE outperformed SCEO on 12), MVMO on 11 (MVMO on 8), jDE on 20 (jDE on 2), CETMS on 18 (CETMS on 7), FWA-DM on 21 (FWA-DM on 1), TSC-PSO on 8 (TSC-PSO on 14), and HCA-SA on 15 (HCA-SA on 8). On the other functions, the performance of the compared algorithms was similar.

Table 15: Comparison with L-SHADE [36], MVMO [37], jDE [39], CETMS [40], FWA-DM [41], TSC-PSO [42], and HCA-SA [43] algorithms on CEC 2014 functions.
Table 16: Comparison with L-SHADE [36], MVMO [37], jDE [39], CETMS [40], FWA-DM [41], TSC-PSO [42], and HCA-SA [43] algorithms on CEC 2014 functions.
Table 17: Shapiro–Wilk test for SCEO, L-SHADE [36], MVMO [37], jDE [39], CETMS [40], FWA-DM [41], TSC-PSO [42], and HCA-SA [43] algorithms on CEC 2014 functions.
Table 18: Wilcoxon test against L-SHADE [36], MVMO [37], jDE [39], CETMS [40], FWA-DM [41], TSC-PSO [42], and HCA-SA [43] algorithms.

This clearly demonstrates the reliability of the proposed algorithm in complex optimization scenarios. This was even more evident in the case of the composition functions, where the SCEO algorithm performed much better than all the compared algorithms. Further, the high convergence rate (Figures 4 and 5) shows the applicability of the proposed algorithm to real-world optimization problems.

Figure 4: CEC14 Fun3 convergence plot.
Figure 5: CEC14 Fun7 convergence plot.

6. Conclusion

Elephant populations in nature generally exhibit a highly complex fission-fusion-based social structure. The individuals in an elephant society can show diverse types of behaviour, and the population can consist of various groups of individuals behaving differently towards other members of the society, with relationships ranging from mother-child bonds and bond groups to independent males and strangers. This paper proposes a novel swarm intelligence-based metaheuristic optimization algorithm, called Structured Clanning-based Ensemble Optimization, inspired by this complex social order of the elephant society and modelling its individualistic behaviour. The algorithm is tested against other algorithms (FEP, ABC, GWO, and GSA) on 60 different benchmark functions. The experiments clearly show the superiority of the SCEO algorithm over the other tested algorithms, proving it to be reliable and efficient for solving complex real-world numerical optimization problems.

The proposed algorithm still has scope for improvement in its performance. Future research can be directed towards studying the various hyperparameters of the algorithm and incorporating adaptability within them. Further, constrained and multiobjective variants of the proposed algorithm can be formulated to extend its utility in real-world optimization applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  1. M.-H. Lin, J.-F. Tsai, and C.-S. Yu, “A review of deterministic optimization methods in engineering and management,” Mathematical Problems in Engineering, vol. 2012, Article ID 756023, 15 pages, 2012. View at Publisher · View at Google Scholar · View at Scopus
  2. D. B. Shmoys and C. Swamy, “Stochastic Optimization is (almost) as Easy as deterministic optimization,” in Proceedings of 45th Annual IEEE Symposium on Foundations of Computer Science, pp. 228–237, Rome, Italy, October 2004. View at Publisher · View at Google Scholar
  3. I. Boussaїd, J. Lepagnot, and P. Siarry, “A survey on optimization metaheuristics,” Information Sciences, vol. 237, pp. 82–117, 2013. View at Publisher · View at Google Scholar · View at Scopus
  4. L. Bianchi, M. Dorigo, L. M. Gambardella, and W. J. Gutjahr, “A survey on metaheuristics for stochastic combinatorial optimization,” Natural Computing: An International Journal, vol. 8, no. 2, pp. 239–287, 2009. View at Publisher · View at Google Scholar · View at Scopus
  5. C. Blum and A. Roli, “Metaheuristics in combinatorial optimization: overview and conceptual comparison,” ACM Computing Surveys, vol. 35, no. 3, pp. 268–308, 2003. View at Publisher · View at Google Scholar · View at Scopus
  6. E.-G. Talbi, Metaheuristics: from Design to Implementation, Wiley, Hoboken, NJ, USA, 2009.
  7. S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Grey wolf optimizer,” Advances in Engineering Software, vol. 69, pp. 46–61, 2014. View at Publisher · View at Google Scholar · View at Scopus
  8. A. Trivedi, D. Srinivasan, K. Sanyal, and A. Ghosh, “A survey of multiobjective evolutionary algorithms based on decomposition,” IEEE Transactions on Evolutionary Computation, vol. 21, no. 3, pp. 440–462, 2017. View at Publisher · View at Google Scholar · View at Scopus
  9. J. H. Holland, “Genetic algorithms: computer programs that “evolve” in ways that resemble natural selection can solve complex problems ever their creators do not fully understand,” in Scientific American, Nature Publishing Group, London, UK, 1992. View at Google Scholar
  10. L. Poli, G. Oliveri, and A. Massa, “An integer genetic algorithm for optimal clustering in phased array antenna,” in Proceedings of 2017 International IEEE Applied Computational Electromagnetics Society Symposium-Italy (ACES), pp. 1-2, Firenze, Italy, March 2017. View at Publisher · View at Google Scholar · View at Scopus
  11. M. H. Manser, I. Musirin, and M. M. Othman, “Immune Log-Normal Evolutionary Programming (ILNEP) for solving economic dispatch problem with prohibited operating zones,” in Proceedings of 2017 4th International Conference on Industrial Engineering and Applications (ICIEA), pp. 163–167, IEEE, Nagoya, Japan, April 2017.
  12. I. Rechenberg, “Evolution strategy,” in Computational Intelligence: Imitating Life 1, Institute of Electrical & Electronics Engineers, Piscataway, NJ, USA, 1994. View at Google Scholar
  13. J. R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection, vol. 1, MIT Press, Cambridge, MA, USA, 1992.
  14. R. Storn and K. Price, “Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  15. X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
  16. M. Dorigo, M. Birattari, and T. Stutzle, “Ant colony optimization,” IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.
  17. R. C. Eberhart and J. Kennedy, “A new optimizer using particle swarm theory,” in Proceedings of the Sixth International Symposium on Micro Machine and Human Science, vol. 1, Nagoya, Japan, October 1995.
  18. K. M. Passino, “Biomimicry of bacterial foraging for distributed optimization and control,” IEEE Control Systems, vol. 22, no. 3, pp. 52–67, 2002.
  19. J. C. Bansal, H. Sharma, S. S. Jadon, and M. Clerc, “Spider monkey optimization algorithm for numerical optimization,” Memetic Computing, vol. 6, no. 1, pp. 31–47, 2014.
  20. A. Sharma, A. Sharma, B. K. Panigrahi, D. Kiran, and R. Kumar, “Ageist spider monkey optimization algorithm,” Swarm and Evolutionary Computation, vol. 28, pp. 58–77, 2016.
  21. G.-G. Wang, S. Deb, X.-Z. Gao, and L. dos Santos Coelho, “A new metaheuristic optimization algorithm motivated by elephant herding behavior,” International Journal of Bio-Inspired Computation, vol. 8, no. 6, p. 394, 2016.
  22. R. Sukumar, The Asian Elephant: Ecology and Management, Cambridge University Press, Cambridge, UK, 1992.
  23. M. Birattari, T. Stützle, L. Paquete, and K. Varrentrapp, “A racing algorithm for configuring metaheuristics,” in Proceedings of the 4th Annual Conference on Genetic and Evolutionary Computation, pp. 11–18, Morgan Kaufmann Publishers Inc., Honolulu, Hawaii, May 2002.
  24. E. Theodorsson-Norheim, “Friedman and Quade tests: BASIC computer program to perform nonparametric two-way analysis of variance and multiple comparisons on ranks of several related samples,” Computers in Biology and Medicine, vol. 17, no. 2, pp. 85–99, 1987.
  25. J. G. Digalakis and K. G. Margaritis, “On benchmarking functions for genetic algorithms,” International Journal of Computer Mathematics, vol. 77, no. 4, pp. 481–506, 2001.
  26. P. N. Suganthan, N. Hansen, J. J. Liang et al., “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” KanGAL Report, Rep. No. 2005005, 2005.
  27. X. S. Yang, “Appendix A: test problems in optimization,” in Engineering Optimization, pp. 261–266, Taylor & Francis, UK, 2010.
  28. J. J. Liang, B. Y. Qu, and P. N. Suganthan, “Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization,” Technical Report, Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China, and Nanyang Technological University, Singapore, 2013.
  29. B. Basturk and D. Karaboga, “An artificial bee colony (ABC) algorithm for numeric function optimization,” in Proceedings of IEEE Swarm Intelligence Symposium, vol. 8, no. 1, pp. 687–697, Indianapolis, IN, USA, May 2006.
  30. E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “GSA: a gravitational search algorithm,” Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
  31. A. E. Eiben and S. K. Smit, “Parameter tuning for configuring and analyzing evolutionary algorithms,” Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 19–31, 2011.
  32. P. Royston, “Approximating the Shapiro-Wilk W-test for non-normality,” Statistics and Computing, vol. 2, no. 3, pp. 117–119, 1992.
  33. N. M. Razali and Y. B. Wah, “Power comparisons of Shapiro-Wilk, Kolmogorov-Smirnov, Lilliefors and Anderson-Darling tests,” Journal of Statistical Modeling and Analytics, vol. 2, no. 1, pp. 21–33, 2011.
  34. F. Wilcoxon, S. K. Katti, and R. A. Wilcox, “Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test,” Selected Tables in Mathematical Statistics, vol. 1, pp. 171–259, 1970.
  35. J. Derrac, S. García, D. Molina, and F. Herrera, “A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms,” Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
  36. R. Tanabe and A. S. Fukunaga, “Improving the search performance of SHADE using linear population size reduction,” in Proceedings of 2014 IEEE Congress on Evolutionary Computation (CEC), pp. 1658–1665, IEEE, Beijing, China, July 2014.
  37. I. Erlich, J. L. Rueda, S. Wildenhues, and F. Shewarega, “Solving the IEEE-CEC 2014 expensive optimization test problems by using single-particle MVMO,” in Proceedings of 2014 IEEE Congress on Evolutionary Computation (CEC), pp. 1084–1091, IEEE, Beijing, China, July 2014.
  38. I. Erlich, J. L. Rueda, S. Wildenhues, and F. Shewarega, “Evaluating the mean-variance mapping optimization on the IEEE-CEC 2014 test suite,” in Proceedings of 2014 IEEE Congress on Evolutionary Computation (CEC), pp. 1625–1632, IEEE, Beijing, China, July 2014.
  39. J. Brest and M. S. Maučec, “Self-adaptive differential evolution algorithm using population size reduction and three strategies,” Soft Computing, vol. 15, no. 11, pp. 2157–2174, 2011.
  40. C. Liu and L. Fan, “A hybrid evolutionary algorithm based on tissue membrane systems and CMA-ES for solving numerical optimization problems,” Knowledge-Based Systems, vol. 105, pp. 38–47, 2016.
  41. C. Yu, L. Kelley, S. Zheng, and Y. Tan, “Fireworks algorithm with differential mutation for solving the CEC 2014 competition problems,” in Proceedings of 2014 IEEE Congress on Evolutionary Computation (CEC), pp. 3238–3245, IEEE, Beijing, China, July 2014.
  42. A. Sharma, R. Kumar, B. K. Panigrahi, and S. Das, “Termite spatial correlation based particle swarm optimization for unconstrained optimization,” Swarm and Evolutionary Computation, vol. 30, pp. 93–107, 2017.
  43. U. Mlakar and I. Fister, “Hybrid self-adaptive cuckoo search for global optimization,” Swarm and Evolutionary Computation, vol. 29, pp. 47–72, 2016.
  44. R. Tanabe and A. Fukunaga, “Evaluating the performance of SHADE on CEC 2013 benchmark problems,” in Proceedings of 2013 IEEE Congress on Evolutionary Computation (CEC), pp. 1952–1959, IEEE, Cancún, Mexico, June 2013.
  45. Y. Tan and Y. Zhu, “Fireworks algorithm for optimization,” in Proceedings of International Conference in Swarm Intelligence, pp. 355–364, Springer Berlin Heidelberg, Beijing, China, June 2010.
  46. A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems,” Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.