Modelling and Simulation in Engineering
Volume 2018, Article ID 4945157, 14 pages
https://doi.org/10.1155/2018/4945157
Research Article

Application and Development of Enhanced Chaotic Grasshopper Optimization Algorithms

1Swami Keshvanand Institute of Technology, Jaipur 302017, India
2Malaviya National Institute of Technology, Jaipur 302017, India

Correspondence should be addressed to Akash Saxena; aakash.saxena@hotmail.com

Received 30 December 2017; Revised 23 March 2018; Accepted 8 April 2018; Published 23 May 2018

Academic Editor: Gaetano Sequenzia

Copyright © 2018 Akash Saxena et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In recent years, metaheuristic algorithms have transformed problem solving with their strong optimization capability. Any metaheuristic algorithm has two phases: exploration and exploitation. The ability of an algorithm to solve a difficult optimization problem depends upon the efficacy of these two phases, which are tied together by a bridging mechanism that plays an important role. This paper presents an application of chaotic maps to improve the bridging mechanism of the Grasshopper Optimisation Algorithm (GOA) by embedding 10 different maps. This experiment yields 10 chaotic variants of GOA, named Enhanced Chaotic Grasshopper Optimization Algorithms (ECGOAs). The performance of these variants is tested on ten shifted and biased unimodal and multimodal benchmark functions. Further, the variants are evaluated on the three-bar truss design problem and the frequency-modulated sound synthesis parameter estimation problem. Results reveal that the chaotic mechanism enhances the performance of GOA. The results of the Wilcoxon rank sum test also establish the efficacy of the proposed variants.

1. Introduction

Optimization refers to the selection of the best option from a given set of alternatives. Examples of optimization processes are everywhere, such as in business, human resource management, challenging engineering design problems, transportation, profit-making propositions, and industrial applications. Optimization can aim at either maximization or minimization. In engineering problems in particular, maximization is used for efficiency, classification accuracy, and revenue or profit, while minimization can be performed for the cost, loss, risk, or execution time of an engineering process. Apart from this classification, optimization problems can also be classified on the basis of constraints: an optimization problem without any constraints is called unconstrained optimization, while constrained optimization involves linear or nonlinear constraints. A further classification is based on the objective: when an optimization problem aims at a single objective, it is called a single objective optimization problem, and when it aims at multiple objectives, it is called a multiobjective optimization problem [1]. A recent trend is to employ metaheuristic optimization algorithms to solve challenging real-world problems; the term refers to a problem-independent higher level heuristic mechanism [2]. In recent years, applications of metaheuristic algorithms to engineering problems have been widely reported, and their successful and effective implementation on real applications has attracted the attention of researchers.

The metaheuristic optimization approaches can be subdivided into three categories:
(1) Evolutionary computing-based algorithms [3–5]
(2) Physics law-based algorithms [6–8]
(3) Swarm intelligence-based algorithms [9–16]

Evolutionary algorithms are based on natural evolution under environmental pressure [2]. These algorithms employ selection or mixing criteria to generate an optimal solution set possessing higher fitness values. They are stochastic in nature, incorporating crossover and mutation operators for hybridizing solutions and enhancing fitness values. A few examples of these algorithms are the Genetic Algorithm [4] and Evolution Strategy and Evolutionary Programming [5]. Another class comprises algorithms inspired by the laws of fundamental physics. A few examples are the Gravitational Search Algorithm [6], the Big Bang-Big Crunch Algorithm [7], and the Black Hole Algorithm [8].

The third category is based on swarm intelligence methods, where the cognitive and social behavior of natural swarms, such as flocks of birds and schools of fish, is mimicked in simulation. The most famous algorithm in this category is Particle Swarm Optimization (PSO), which works on the philosophy "Follow the Leader" [9]. Other examples are the Bat Algorithm [10], the Firefly Algorithm [11], and the Cuckoo Search Algorithm [12]. A recently published swarm algorithm that has become popular is the Grey Wolf Optimizer (GWO) [13], which mimics the hunting behavior of grey wolves and is a fine example of compliance with the social hierarchy of the wolf pack during the searching, attacking, and hunting phases. A novel algorithm based on crow behavior, named the Crow Search Algorithm (CSA), has also been proposed [14]. CSA mimics the behavior of crows, which store excess food in hiding places and retrieve it when needed. Similarly, the Ant Lion Optimizer [15] and the Grasshopper Optimisation Algorithm [16] are good examples of social mimicry of natural swarms.

The applications of swarm algorithms are very well reported in the literature and in many design problems, namely, Automatic Generation Control [17], Unit Commitment [18, 19], Feature Selection [20], and Ambient Air Quality Classification [21].

Many optimization algorithms have employed chaotic sequences in place of the random walk (random number generation) because the random walk does not always implement the global search well. Thus, in some cases, algorithm development is based on chaotic variables instead of random variables, and these algorithms are called chaotic algorithms [22–26]. A chaotic Firefly Algorithm was proposed in [22], in which the attractive movement of fireflies was simulated with ten chaotic maps. Chaotic sequences were used for parameter tuning in the Chaotic GWO approach [27]. Chaos-enhanced Accelerated Particle Swarm Optimization (CAPSO) was proposed by Gandomi et al. [25]; an attraction parameter was tuned with normalized chaotic maps in that work. A Chaotic Bat Algorithm was proposed in [23], where the tuning of the crucial parameter of the algorithm was done with the help of chaotic maps. Ten different chaotic maps were employed in the gravitational search algorithm in [28].

Two mechanisms, diversification and intensification, are essential parts of any swarm algorithm. The initial phase of any swarm algorithm starts with random search; this process is usually swift and is responsible for searching every possible direction of the search space, and thus, the process is random in nature. On the other hand, the intensification process is strategic: its outcome is specific and is treated as the solution of the problem. It is fair to say that the speeds of these two processes differ. Every algorithm employs a bridging mechanism to maintain a good trade-off between them. Some algorithms use different operators and different models of random walks in different phases; in short, these operators/mechanisms help the algorithm maintain a fair balance between the two processes. This paper investigates the impact of different chaotic sequences on the bridging mechanism of GOA by evaluating the performance of the proposed variants on standard benchmark functions and real applications. Ten different chaotic sequences are embedded with the parameter c, and careful observations are presented. The following research objectives are framed for this work:
(1) To employ 10 different chaotic maps through a normalized function to propose chaotic variants of GOA. These variants are developed on the basis of an adaptive, chaotic, and monotonically decreasing parameter c.
(2) To conduct a nonparametric Wilcoxon rank sum test for observing the efficacy of the chaotic variants against GOA by observing p values.
(3) To apply these variants to the three-bar truss design problem and the parameter estimation of frequency-modulated sound waves and compare their performance with other contemporary algorithms.

The remaining part of the paper is organized as follows: Section 2 gives brief details of the different chaotic maps. Section 3 presents an overview of GOA. The development of the chaotic variants is explained in Section 4. Simulation results on benchmark problems and engineering optimization problems are presented in Section 5. Finally, the major conclusions of this study are presented in Section 6.

2. Chaotic Map

In this section, the definitions of the different chaotic maps are presented. Table 1 shows the names and definition ranges of the chaotic maps. These maps have also been studied in earlier approaches [22, 28]. The shapes of these maps for the chosen starting point are shown in Figure 1.

Table 1: Definition of chaotic maps [28].
Figure 1: Chaotic maps.
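The chaotic sequences used by the variants are generated simply by iterating the map definitions of Table 1. As an illustrative sketch (the control parameters below, such as a = 4 for the logistic map, the 0.7 breakpoint for the tent map, and k = 4 for the Chebyshev map, are common choices in the chaotic-optimization literature and are assumptions here, not values taken from Table 1), a few of the maps can be written as:

```python
import math

def logistic(x):
    # Logistic map with control parameter a = 4 (fully chaotic regime)
    return 4.0 * x * (1.0 - x)

def tent(x):
    # Tent map as commonly used in chaotic optimization studies
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * (1.0 - x)

def chebyshev(x, k=4):
    # Chebyshev map; its output lies in [-1, 1]
    return math.cos(k * math.acos(x))

def sequence(chaos_map, x0, n):
    """Iterate a map n times from the starting point x0."""
    seq, x = [], x0
    for _ in range(n):
        x = chaos_map(x)
        seq.append(x)
    return seq
```

For example, `sequence(logistic, 0.7, 500)` produces the 500-iteration chaotic sequence that a variant would consume, one value per iteration.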

3. Grasshopper Optimisation Algorithm: An Overview

Grasshopper Optimisation Algorithm (GOA) [16] is a recently proposed nature-inspired algorithm based on one of the largest swarms among all creatures. As grasshoppers are herbivores, they cause severe damage to crops. Swarming behavior is observed in both nymphs and adults. The nymph moves by rolling on the ground and feeds on succulent and soft plants, while an adult grasshopper can jump high in search of food and therefore has a larger area to explore. As a result, two types of movement are observed: the slow movement of nymphs and the abrupt, long-range movement of adults, representing exploitation and exploration, respectively. The mathematical framework of [16] is reproduced here. The swarming behavior of the grasshoppers is represented mathematically as

X_i = S_i + G_i + A_i,  (1)

where X_i is the position of the ith grasshopper, S_i is the social interaction, G_i is the gravity force on the ith grasshopper, and A_i is the wind advection.

The social interaction is given as

S_i = Σ_{j=1, j≠i}^{N} s(d_ij) d̂_ij,  (2)

where d_ij = |x_j − x_i| is the distance between the ith and jth grasshoppers and d̂_ij = (x_j − x_i)/d_ij is a unit vector from the ith grasshopper to the jth grasshopper. The function s defines the social forces and is given mathematically as

s(r) = f·exp(−r/l) − exp(−r),  (3)

where f is the intensity of attraction and l is the attractive length scale. In the search for food, grasshoppers create three types of regions in terms of social interaction, known as the comfort zone, the repulsion region, and the attraction region. When the distance between grasshoppers is large, the function s cannot apply strong forces. To resolve this, the component G_i in (1) is given as

G_i = −g·ê_g,  (4)

where g is the gravitational constant and ê_g is a unit vector towards the center of the Earth. The component A_i is calculated as

A_i = u·ê_w,  (5)

where u is a constant drift and ê_w is a unit vector in the direction of the wind. Substituting S_i, G_i, and A_i into (1), we get

X_i = Σ_{j=1, j≠i}^{N} s(|x_j − x_i|)·(x_j − x_i)/d_ij − g·ê_g + u·ê_w,  (6)

where s is given by (3) and N is the number of grasshoppers. A revised form of this formula is used to solve optimization problems:

X_i^d = c·( Σ_{j=1, j≠i}^{N} c·((ub_d − lb_d)/2)·s(|x_j^d − x_i^d|)·(x_j − x_i)/d_ij ) + T̂_d,  (7)

where ub_d is the upper bound of the dth dimension, lb_d is the lower bound of the dth dimension, T̂_d is the value of the dth dimension of the target (the best solution found so far), and c is the decreasing coefficient that shrinks the comfort, repulsion, and attraction zones. It is assumed that the wind direction is always towards the target. In the process of searching for food, nymphs roll on the ground and adults jump in the air, producing both exploitation and exploration. These two can be balanced by decreasing the parameter c in (7) in proportion to the number of iterations:

c = c_max − l·(c_max − c_min)/L,  (8)

where c_max is the maximum value, c_min is the minimum value, l indicates the current iteration, and L is the maximum number of iterations. The application of GOA has been observed in many engineering optimization problems [29, 30].
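The position update of Eq. (7) can be sketched in code. This is a simplified illustration, not the reference implementation: the values of f and l, the clamping of tiny distances, and the omission of the distance normalization used in the original MATLAB code of [16] are all assumptions of this sketch.

```python
import numpy as np

def s(r, f=0.5, l=1.5):
    # Social force of Eq. (3): attraction term minus repulsion term
    return f * np.exp(-r / l) - np.exp(-r)

def goa_step(X, target, c, lb, ub):
    """One position update per Eq. (7); X has shape (N, dim)."""
    N, dim = X.shape
    X_new = np.empty_like(X)
    for i in range(N):
        social = np.zeros(dim)
        for j in range(N):
            if j == i:
                continue
            diff = X[j] - X[i]
            d = max(np.linalg.norm(diff), 1e-12)  # guard against d = 0
            # inner c shrinks the attraction/repulsion zones; (ub - lb)/2
            # scales the social term to the width of the search space
            social += c * (ub - lb) / 2.0 * s(d) * (diff / d)
        # outer c shrinks the whole step taken around the target
        X_new[i] = c * social + target
    return np.clip(X_new, lb, ub)
```

Calling `goa_step` in a loop while shrinking `c` per Eq. (8) reproduces the exploration-to-exploitation transition described above.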

4. Development of Enhanced Chaotic Grasshopper Optimization Algorithms

This section presents the philosophy and chronological development of the ECGOAs. In this work, we obey the philosophy of GOA and decrease the parameter c over the course of iterations. However, with the inculcation of different chaotic sequences into the comfort zone reduction parameter c, the diversification virtue of GOA is retained until the last iteration. To develop the variants, a normalization function is employed to distribute the sequences between the maximum and minimum values before they are combined with the parameter c. The mathematical expression of this function at iteration l is

E(l) = c_max − l·(c_max − c_min)/L,  (9)

where L denotes the maximum iteration count. The normalized chaotic sequence is then given as

C_norm(l) = (C(l) − a)/(b − a),  (10)

where C(l) is the value of the chaotic sequence computed as per Table 1 and [a, b] is the output range of the corresponding map.

The instantaneous value of the chaotic-sequence-embedded parameter c for the ECGOAs is then given as

c(l) = C_norm(l)·E(l).  (11)

By using (8)–(10), one can easily get the value of the chaotic sequence for any of the chaotic maps given in Table 1. As an example, (12)–(15) below are given for the piecewise map (ECGOA6). In this paper, 10 different chaotic sequences are embedded, through a normalized function, into the parameter c of GOA. In classical GOA, this parameter acts as a bridging mechanism between the exploration and exploitation phases over the whole course of iterations. In the initial phase, the search agents take large steps to explore the search space effectively, and later, these steps are reduced by the linear decrement of the parameter c. In this work, the focus is on replacing this linear variation with different chaotic sequences embedded through a normalized function. The major motivation for this experiment is to seek the possibility of better exploration and exploitation by introducing chaotic sequences in each iteration. In GOA, this parameter decreases linearly, which means that the algorithm either performs diversification (exploration) or intensification (exploitation). In this work, the authors change the parameter chaotically so that the exploration virtue can be kept alive in the final iterations.

For justification, an implementation of the logistic chaotic map with the abovementioned procedure is shown in Figure 2. On the basis of this mathematical procedure, 10 different chaotic variants are proposed here, named Enhanced Chaotic Grasshopper Optimization Algorithms (ECGOAs): ECGOA1 with the Chebyshev map, ECGOA2 with the Circle map, ECGOA3 with the Gauss map, ECGOA4 with the Iterative map, ECGOA5 with the Logistic map, ECGOA6 with the Piecewise map, ECGOA7 with the Sine map, ECGOA8 with the Singer map, ECGOA9 with the Sinusoidal map, and ECGOA10 with the Tent map. For the piecewise map (with control parameter P), the four branches (12)–(15) are:
Case 1: C(l+1) = C(l)/P, for 0 ≤ C(l) < P
Case 2: C(l+1) = (C(l) − P)/(0.5 − P), for P ≤ C(l) < 0.5
Case 3: C(l+1) = (1 − P − C(l))/(0.5 − P), for 0.5 ≤ C(l) < 1 − P
Case 4: C(l+1) = (1 − C(l))/P, for 1 − P ≤ C(l) < 1

Figure 2: Development of chaotic bridging mechanism.

Parameter "c" is an important parameter of GOA and is used twice in (7). The inner "c" contributes to shrinking the attraction and repulsion zones between grasshoppers; this effect is analogous to the exploitation mechanism. With the increment of the iteration counter, the outer "c" reduces the search step and helps the algorithm converge. To balance the intensification and diversification processes in GOA, parameter c decreases linearly with every passing iteration; the comfort zone of the grasshoppers is reduced in every iteration by varying the parameter c from 1 to zero linearly. In the proposed ECGOAs, however, the chaotic sequence changes the boundary of the comfort zone randomly within a monotonically decreasing trend. This mechanism assists the search agents in releasing themselves from local minima traps. The transition from the diversification phase to the intensification phase is achieved slowly through the employment of the different chaotic sequence-enabled adaptive approaches. This change makes parameter "c" adaptive and random concurrently. The values of c_max and c_min are considered as 1 and 0.2, respectively. In the following section, benchmarking of these variants and their application to two engineering problems are investigated.
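One plausible reading of the chaotic bridging mechanism described above can be sketched as follows; the exact way the normalized chaotic value is combined with the decreasing envelope is an assumption of this sketch, not a formula confirmed by the source.

```python
def linear_c(l, L, c_max=1.0, c_min=0.2):
    # Eq. (8): linearly decreasing comfort-zone coefficient of classical GOA
    return c_max - l * (c_max - c_min) / L

def chaotic_c(l, L, c_norm, c_max=1.0, c_min=0.2):
    # Hypothetical embedding: a normalized chaotic value c_norm in [0, 1]
    # modulates the decreasing envelope, so c varies randomly at each
    # iteration while its upper bound still shrinks monotonically
    return c_norm * linear_c(l, L, c_max, c_min)
```

With this scheme, early iterations can still take near-maximal steps (keeping exploration alive), while the shrinking envelope guarantees eventual convergence towards exploitation.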

5. Simulation and Results

Testing an optimization algorithm on known functions is the best way to showcase its efficacy. Some essential characteristics of these functions are that they should be multimodal or unimodal in nature, nonseparable, and, moreover, should lack an exploitable global structure. Keeping these virtues in consideration, benchmarking of the variants is done on five unimodal and five multimodal shifted and biased benchmark functions [31–33]. In the standard benchmark functions, the minimum lies at zero; in multimodal functions, however, multiple (local) optima can exist. To make the problem harder, a shift and a bias have been applied to the functions so that the robustness of the variants can be tested. Figure 3 shows the 2D versions of these functions; their definitions and other relevant details are given in Table 2.

Figure 3: Shifted and biased benchmark functions. (a) Unimodal and (b) multimodal test functions.
Table 2: Benchmark functions [28–31].

The results of the proposed variants on the unimodal functions are shown in Tables 3 and 4 for 30 and 50 dimensions; similarly, the results on the multimodal benchmark problems are shown in Tables 5 and 6 for 30 and 50 dimensions, respectively. To make the analysis meaningful, four different statistical parameters are calculated, namely, the standard deviation (SD), maximum value (Max), minimum value (Min), and mean value (Mean). The stopping criterion for these variants, as well as for GOA, is the maximum iteration count, which is set to 500. Each variant is tested on all ten benchmark functions, and the results are averaged over 20 independent runs. The following subsection presents the results for the unimodal benchmark functions.
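The statistical reporting described above (Min/Max/Mean/SD over 20 independent runs) can be sketched with a small harness; `random_search` below is a hypothetical stand-in optimizer used only so the harness runs end to end, not one of the variants from the paper.

```python
import random
import statistics

def benchmark(optimizer, objective, runs=20):
    """Collect Min/Max/Mean/SD of the best value over independent runs."""
    best = [optimizer(objective) for _ in range(runs)]
    return {
        "Min": min(best),
        "Max": max(best),
        "Mean": statistics.mean(best),
        "SD": statistics.stdev(best),
    }

def random_search(objective, iters=500, dim=30):
    # Stand-in optimizer: pure random sampling within [-100, 100]^dim
    best = float("inf")
    for _ in range(iters):
        x = [random.uniform(-100, 100) for _ in range(dim)]
        best = min(best, objective(x))
    return best

sphere = lambda x: sum(v * v for v in x)
stats = benchmark(random_search, sphere)
```

Replacing `random_search` with GOA or any ECGOA variant yields exactly the four columns reported in Tables 3–6.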

Table 3: Results and comparison of ECGOAs with GOA (30-D) on unimodal functions.
Table 4: Results and comparison of ECGOAs with GOA (50-D) on unimodal functions.
Table 5: Results and comparison of ECGOAs with GOA (30-D) for multimodal functions.
Table 6: Results and comparison of ECGOAs with GOA (50-D) for multimodal functions.
5.1. Qualitative Results and Discussions of Unimodal Benchmark Problems

Unimodal functions have no local minima; in other words, they possess only one minimum. These functions are suitable for benchmarking the exploitation quality and convergence speed of an algorithm. This section presents the results on unimodal benchmark problems for 30 and 50 dimensions.
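A shifted and biased unimodal function of the kind used here can be built from the sphere function; the shift and bias values below are illustrative assumptions, not the ones used in Table 2.

```python
def shifted_biased_sphere(x, shift=30.0, bias=200.0):
    # Sphere with its optimum moved to x* = (shift, ..., shift) and its
    # minimum value raised to bias, so an algorithm cannot exploit
    # symmetry about the origin or assume a zero optimum value
    return sum((v - shift) ** 2 for v in x) + bias
```

Any algorithm that merely biases its search towards the origin will score poorly here, which is exactly why the shift and bias make the benchmark harder.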

5.1.1. Simulation Results of 30-D Unimodal Benchmark Problems

The chaotic variants are benchmarked for exploitation quality and convergence properties on the unimodal benchmark functions. The results for 30 and 50 dimensions are shown in Tables 3 and 4, respectively. For the unimodal functions F1, F2, and F3, it is observed from the maximum values obtained for each variant that the lowest maximum value on function 1 is attained by ECGOA8 (1.68E + 03), and the mean and standard deviation values for this variant are also the lowest, that is, 3.60E + 02 and 480.17. The convergence behavior of this variant on function 1 is shown in Figure 4; its convergence properties are superior to those of the others. Similarly, for functions 2 and 3, ECGOA7 and ECGOA4 have the minimum standard deviation values, and the differences in the minimum, maximum, standard deviation, and mean values relative to the other variants are very marginal. For function 4, ECGOA2 performs better than the other variants, as three of the four statistical parameters are minimal. For function 5, ECGOA8 again provides better results as per the obtained Max, SD, and Mean values. The solution of function 1 by ECGOA8 is shown in Figure 5.

Figure 4: Convergence curve for function 1 (30-D).
Figure 5: Results of ECGOA8 for unimodal function 1.
5.1.2. Simulation Results of 50-D Unimodal Benchmark Problems

Further, the analysis is carried out on the 50-D unimodal functions. The results of all the developed variants together with GOA are shown in Table 4. From careful inspection of the results, it is observed that for function 1, ECGOA8 possesses the minimum value of the statistical parameter Min; however, the parameters Max and Mean are lowest for ECGOA3. For function 2, the values of SD and Mean are optimal for ECGOA8; hence, it can be concluded that this variant outperforms the others on this particular function. For functions 3, 4, and 5, ECGOA9, ECGOA1, and ECGOA2, respectively, possess the optimal mean values. From this analysis, it can be concluded that the exploitation capability of GOA has been substantially improved by the chaotic comfort zone function adaptation.

5.2. Qualitative Results and Discussions on Multimodal Benchmark Problems

In this section, experiments are carried out on the multimodal functions. The results for both the 30-D and 50-D problems are shown in Tables 5 and 6. Multimodal functions possess one global minimum and can have several local minima. The nature of these functions is used to benchmark the exploration quality of the proposed variants: this benchmarking exhibits the ability of the variants to search for the global optimum in a challenging environment where the probability of getting trapped in a local optimum is high. The bias and shift applied to the conventional multimodal benchmark problems make the functions more complex and suitable for benchmarking the variants for real-world engineering problems.
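A shifted and biased multimodal function of this kind can be built from the Rastrigin function; as before, the shift and bias values are illustrative assumptions, not the ones used in Table 2.

```python
import math

def shifted_biased_rastrigin(x, shift=2.0, bias=300.0):
    # Rastrigin has many regularly spaced local minima; the shift moves
    # the global optimum off the origin and the bias raises its value
    z = [v - shift for v in x]
    return (10 * len(z)
            + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in z)
            + bias)
```

An algorithm that converges prematurely will settle in one of the many local wells and report a value well above the bias, which is what makes this family suitable for testing exploration.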

5.2.1. Simulation Results of 30-D Multimodal Benchmark Problems

Inspecting the results for the multimodal functions in Table 5, it is observed that ECGOA8 performs better than the other variants on function 6, as its SD value is low. The values of these statistical parameters are a meaningful indicator for judging the performance of the variants. For function 6, the standard deviation of ECGOA8 is 972.50, while that of GOA is 1292.87. For function 7, this variant also performs better than the others, as its standard deviation is the lowest, that is, 39.69, against 50.35 for GOA. From these results, it can be concluded that ECGOA8 (the Singer map-enabled chaotic mechanism) provides better results on the first three multimodal functions. The convergence characteristics for function 7 are plotted in Figure 6. For function 8, this variant again shows promising results, as three of the four parameters are lower than those of the other variants. For function 9, ECGOA3 performs better, as the parameters associated with the judgement attain lower values. For function 10, ECGOA6 has the lowest mean value. Hence, it can be concluded that for most of the functions these variants outperform GOA.

Figure 6: Convergence curve for multimodal function 7 (30-D).
5.2.2. Simulation Results of 50-D Multimodal Benchmark Problems

The results of this experiment are shown in Table 6. For function 6, the optimal value of the parameter SD is attained by ECGOA4, and the optimal Mean and Min by ECGOA7; for function 9, the parameters Max, SD, and Mean attain optimal values for the ECGOA7 variant. The optimal values of the statistical parameters are shown in boldface. Inspecting the results of this experiment on the multimodal functions, it is clearly evident that the exploration capability of the variants is enhanced substantially by employing the chaotic comfort zone functions. To judge the significance of the results, the authors performed the Wilcoxon rank sum test [34] at the 5% significance level. The test results (p values) are shown in Tables 7 and 8.

Table 7: Results of the Wilcoxon rank sum test on unimodal benchmark functions.
Table 8: Results of the Wilcoxon rank sum test on multimodal benchmark functions.
5.2.3. Discussion

Parameter c is an important parameter and acts as a bridging mechanism between the exploration and exploitation phases. This parameter ensures the swift movement of the grasshoppers from the exploration phase to the exploitation phase by reducing their comfort zone. The chaotic mechanism not only enhances the exploitation phase while keeping the virtue of exploration alive until the last iteration but also adds random behavior in each iteration on the basis of the different adaptive chaotic comfort zone function-enabled mechanisms. The following section presents the statistical analysis of the performance of these variants.

5.3. Wilcoxon Rank Sum Test

To judge the significance of the results, the Wilcoxon rank sum test [34] is performed at the 5% significance level, and the p values are obtained. The test results are shown in Tables 7 and 8. The term N/A indicates that the variant has outperformed the others and cannot be compared with itself. From the results for function 1 (Table 7), it is observed that there is a significant difference between ECGOA1 and ECGOA7 on the one hand and ECGOA8 on the other, as the p values obtained for these variants are less than 0.05. Similarly, for function 3, ECGOA4 is the best algorithm; however, in this case, all the algorithms are statistically indistinguishable from each other, as the p values are greater than 0.05. It is worth observing that the lowest p value other than that of ECGOA4 is for ECGOA9. For function 2, it is observed that ECGOA7 is the best performer, and a significant difference exists with respect to the ECGOA2, 3, 4, 8, and 10 variants, as the p values are less than 0.05. The p values less than 0.05 are highlighted in boldface and underlined.
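The rank sum comparison used here can be reproduced with SciPy; the two samples below are synthetic stand-ins for the 20-run fitness records of two variants, not the actual data from Tables 3–6.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Hypothetical best-fitness samples from 20 independent runs of two variants
ecgoa8 = rng.normal(360.0, 480.0, size=20)
goa = rng.normal(900.0, 1300.0, size=20)

# Two-sided Wilcoxon rank-sum test; p < 0.05 means the two result
# distributions differ significantly at the 5% level
stat, p = ranksums(ecgoa8, goa)
significant = p < 0.05
```

Repeating this pairwise comparison for every variant on every function yields precisely the p-value tables reported as Tables 7 and 8.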

Inspecting the results in Table 8 for the multimodal functions, ECGOA8 is the best performer, and for functions 4 and 5, variants ECGOA2 and ECGOA3 are the second best performers as per the p values. For function 6, it is observed that a significant difference exists between variants ECGOA3, ECGOA4, and ECGOA5. For most of the functions, the chaotic variants outperform GOA, and the variants are significantly different from each other. In the following section, the application of these variants to real-world problems and their comparative performance against other contemporary algorithms are presented.

5.4. Application of the ECGOAs on Real Applications

The application of these variants to a structural design-constrained optimization problem and a parameter estimation problem is investigated in this section. These applications evaluate the impact of the chaotic bridging mechanism in reducing the comfort zone of the grasshoppers, which results in better exploration and exploitation properties of GOA.

5.4.1. Three-Bar Truss Design Problem

The three-bar truss design problem is a well-known engineering design problem and has been used for benchmarking many algorithms [14–16]. A schematic diagram of this problem is shown in Figure 7. The objective is to minimize the volume V by adjusting the cross-sectional areas (x, y) as per (16), subject to the constraints (17)–(19). The objective function is nonlinear and possesses three nonlinear constraints that contain the stress parameter. For solving this optimization problem, the number of search agents (30) and the maximum iteration count (500) are kept constant for all the variants. Each algorithm is run 20 times, and the results shown in Table 9 are averaged over these runs. The convergence curve of the problem is shown in Figure 8. The expression for the volume is given as

V = (2·√2·x + y) × l.  (16)

Figure 7: Three-bar truss design problem.
Table 9: Results of the three-bar truss design problem.
Figure 8: Convergence curve for the three-bar truss design problem.

The parameters of this optimization problem are considered as l = 100 cm, P = 2 kN/cm², and σ = 2 kN/cm², with variable range 0 ≤ x, y ≤ 1.
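Under these parameter values, the objective and constraints of the three-bar truss problem can be sketched as follows (the constraint expressions are the standard ones used for this benchmark in the literature, assumed here since the source's equations (17)–(19) are not reproduced):

```python
import math

l, P, sigma = 100.0, 2.0, 2.0  # cm, kN/cm^2: standard values for this problem

def volume(x, y):
    # Eq. (16): V = (2*sqrt(2)*x + y) * l
    return (2.0 * math.sqrt(2.0) * x + y) * l

def constraints(x, y):
    # Stress constraints; each g_i <= 0 must hold for a feasible design
    g1 = (math.sqrt(2.0) * x + y) / (math.sqrt(2.0) * x * x + 2.0 * x * y) * P - sigma
    g2 = y / (math.sqrt(2.0) * x * x + 2.0 * x * y) * P - sigma
    g3 = 1.0 / (math.sqrt(2.0) * y + x) * P - sigma
    return (g1, g2, g3)
```

At the widely reported optimum (x ≈ 0.7887, y ≈ 0.4082), the volume evaluates to about 263.9 and the first stress constraint is active (g1 ≈ 0), which is a useful sanity check for any solver applied to this problem.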

The results of this problem are shown in Table 9, and it is observed that, for this design problem as well, the ECGOA8 variant outperforms the others, as its standard deviation and other statistical parameters are optimal compared with those of its opponents. This variant also exhibits better convergence properties; for clarity, the convergence curve is shown in Figure 8.

5.4.2. Parameter Estimation for Frequency-Modulated Sound Waves

Parameter estimation of a frequency-modulated synthesizer is a six-dimensional optimization problem and a part of FM sound wave synthesis. The problem is formulated as estimating the parameters that generate a sound matching a target sound. The problem is complex and multimodal in nature, and the minimum of the objective function is zero. The parameter vector is estimated through the optimization process [35] and has six parameters:

X = {a1, ω1, a2, ω2, a3, ω3}.

The expressions for the estimated and target sound waves are as follows:

y(t) = a1·sin(ω1·t·θ + a2·sin(ω2·t·θ + a3·sin(ω3·t·θ))),
y0(t) = 1.0·sin(5.0·t·θ − 1.5·sin(4.8·t·θ + 2.0·sin(4.9·t·θ))),

where θ = 2π/100. The objective is to minimize the sum of squared errors between y(t) and y0(t) over t = 0, 1, ..., 100.
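This objective, in the standard form used for the FM sound-wave benchmark [35], can be written directly in code:

```python
import math

THETA = 2.0 * math.pi / 100.0

def wave(t, a1, w1, a2, w2, a3, w3):
    # Estimated sound wave y(t) with three nested levels of FM
    return a1 * math.sin(w1 * t * THETA
                         + a2 * math.sin(w2 * t * THETA
                                         + a3 * math.sin(w3 * t * THETA)))

def target(t):
    # Target sound wave y0(t) with known parameters (1.0, 5.0, -1.5, 4.8, 2.0, 4.9)
    return wave(t, 1.0, 5.0, -1.5, 4.8, 2.0, 4.9)

def fitness(params):
    # Sum of squared errors over t = 0..100; zero at the true parameters
    return sum((wave(t, *params) - target(t)) ** 2 for t in range(101))
```

Any of the ECGOA variants can minimize `fitness` directly over the six-dimensional parameter vector; the known global minimum of zero makes the quality of a reported solution easy to verify.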

The results of this problem are shown in Table 10, and the convergence of the variants along with GOA is shown in Figure 9. To solve this optimization problem, the maximum number of function evaluations and the number of search agents are set to 30,000 and 30, respectively. The optimization results are averaged over 30 independent runs. It is observed that the performance of these variants is competitive with some recently published approaches; ECGOA1 possesses the minimum SD value compared with the others. From these applications, it can be concluded that the variants show competitive performance not only on shifted and biased benchmark functions but also on real applications. The conclusions drawn from this study are presented in the following section.

Table 10: Error in objective function values for frequency-modulated sound waves synthesis [35].
Figure 9: Convergence curve for frequency synthesis parameter estimation problem.

6. Conclusion

The exploration and exploitation phases of a metaheuristic algorithm are connected by a bridging mechanism, whose efficacy is important for achieving better convergence characteristics, solution quality, and optimization performance. This paper focuses on this mechanism, and 10 chaotic bridging mechanisms have been proposed for GOA. The major highlights of this work are as follows:
(1) Ten different chaotic maps have been embedded into the conventional GOA parameter "c", and a chaotic bridging mechanism has been proposed. The benefit of this mechanism is that it keeps the exploration phase alive until the last iteration through its chaotic properties.
(2) Ten shifted and biased benchmark functions have been considered to benchmark the variants, which have been evaluated on 30-dimensional and 50-dimensional problems. It has been observed that the mechanism enabled with the Singer chaotic map, that is, ECGOA8, is well suited for unimodal and multimodal optimization problems.
(3) Further, the application of these variants to the three-bar truss design problem and the parameter estimation of the frequency-modulated sound wave synthesis problem has also been investigated. The performance of the developed variants is competitive with other contemporary algorithms; in some cases, the variants outperform them.
(4) A nonparametric Wilcoxon rank sum test has been conducted, and p values have been obtained for all ten functions. It is concluded that variant ECGOA8 exhibits better results compared with the other variants.

For further studies, it would be interesting to explore the application of different comfort zone reduction functions to improve the bridging mechanism between exploration and exploitation phases of GOA.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  1. I. Fister Jr., X.-S. Yang, I. Fister, J. Brest, and D. Fister, A Brief Review of Nature-Inspired Algorithms for Optimization, 2013.
  2. A. E. Eiben and J. E. Smith, Introduction to Evolutionary Computing, vol. 53, Springer, Berlin, Germany, 2003.
  3. I. Rechenberg, “Evolution strategy: nature's way of optimization,” in Optimization: Methods and Applications, Possibilities and Limitations, pp. 106–126, Springer, Berlin, Germany, 1989.
  4. J. H. Holland, “Genetic algorithms,” Scientific American, vol. 267, no. 1, pp. 66–73, 1992.
  5. X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
  6. E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “GSA: a gravitational search algorithm,” Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
  7. O. K. Erol and I. Eksin, “A new optimization method: big bang–big crunch,” Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.
  8. A. Hatamlou, “Black hole: a new heuristic optimization approach for data clustering,” Information Sciences, vol. 222, pp. 175–184, 2013.
  9. J. Kennedy, “Particle swarm optimization,” in Encyclopedia of Machine Learning, Springer, 2010.
  10. X.-S. Yang, “A new metaheuristic bat-inspired algorithm,” in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), Springer, Berlin, Germany, 2010.
  11. X.-S. Yang, “Firefly algorithm, stochastic test functions and design optimisation,” International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.
  12. X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC 2009), pp. 210–214, December 2009.
  13. S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Grey wolf optimizer,” Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
  14. A. Askarzadeh, “A novel metaheuristic method for solving constrained engineering optimization problems: crow search algorithm,” Computers & Structures, vol. 169, pp. 1–12, 2016.
  15. S. Mirjalili, “The ant lion optimizer,” Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
  16. S. Saremi, S. Mirjalili, and A. Lewis, “Grasshopper optimisation algorithm: theory and application,” Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
  17. E. Gupta and A. Saxena, “Performance evaluation of antlion optimizer based regulator in automatic generation control of interconnected power system,” Journal of Engineering, vol. 2016, Article ID 4570617, 14 pages, 2016.
  18. L. K. Panwar, S. Reddy, and R. Kumar, “Binary fireworks algorithm based thermal unit commitment,” International Journal of Swarm Intelligence Research, vol. 6, no. 2, pp. 87–101, 2015.
  19. L. K. Panwar, S. Reddy, A. Verma, B. Panigrahi, and R. Kumar, “Binary grey wolf optimizer for large scale unit commitment problem,” Swarm and Evolutionary Computation, vol. 38, pp. 251–266, 2018.
  20. M. Mafarja and S. Mirjalili, “Whale optimization approaches for wrapper feature selection,” Applied Soft Computing, vol. 62, pp. 441–453, 2018.
  21. A. Saxena and S. Shekhawat, “Ambient air quality classification by grey wolf optimizer based support vector machine,” Journal of Environmental and Public Health, vol. 2017, Article ID 3131083, 12 pages, 2017.
  22. A. H. Gandomi, X.-S. Yang, S. Talatahari, and A. H. Alavi, “Firefly algorithm with chaos,” Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.
  23. A. H. Gandomi and X.-S. Yang, “Chaotic bat algorithm,” Journal of Computational Science, vol. 5, no. 2, pp. 224–232, 2014.
  24. G.-G. Wang, L. Guo, A. H. Gandomi, G.-S. Hao, and H. Wang, “Chaotic krill herd algorithm,” Information Sciences, vol. 274, pp. 17–34, 2014.
  25. A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, “Chaos-enhanced accelerated particle swarm optimization,” Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 2, pp. 327–340, 2013.
  26. G.-G. Wang, S. Deb, A. H. Gandomi, Z. Zhang, and A. H. Alavi, “Chaotic cuckoo search,” Soft Computing, vol. 20, no. 9, pp. 3349–3362, 2016.
  27. M. Kohli and S. Arora, “Chaotic grey wolf optimization algorithm for constrained optimization problems,” Journal of Computational Design and Engineering, 2017.
  28. S. Mirjalili and A. H. Gandomi, “Chaotic gravitational constants for the gravitational search algorithm,” Applied Soft Computing, vol. 53, pp. 407–419, 2017.
  29. A. Tharwat, E. H. Houssein, M. M. Ahmed, A. E. Hassanien, and T. Gabel, “MOGOA algorithm for constrained and unconstrained multi-objective optimization problems,” Applied Intelligence, pp. 1–16, 2017.
  30. J. Wu, H. Wang, N. Li et al., “Distributed trajectory optimization for multiple solar-powered UAVs target tracking in urban environment by adaptive grasshopper optimization algorithm,” Aerospace Science and Technology, vol. 70, pp. 497–510, 2017.
  31. X.-S. Yang, “Test problems in optimization,” in Engineering Optimization: An Introduction with Metaheuristic Applications, X.-S. Yang, Ed., John Wiley & Sons, Hoboken, NJ, USA, 2010, arXiv:1008.0549v1.
  32. J. G. Digalakis and K. G. Margaritis, “On benchmarking functions for genetic algorithms,” International Journal of Computer Mathematics, vol. 77, no. 4, pp. 481–506, 2001.
  33. M. Molga and C. Smutnicki, Test Functions for Optimization Needs, 2005.
  34. F. Wilcoxon, “Individual comparisons by ranking methods,” Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.
  35. S. Das and P. N. Suganthan, “Problem definitions and evaluation criteria for CEC 2011 competition on testing evolutionary algorithms on real world optimization problems,” Technical Report, Jadavpur University, Kolkata, India, 2011.
  36. J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, “Comprehensive learning particle swarm optimizer for global optimization of multimodal functions,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
  37. F. Van den Bergh and A. P. Engelbrecht, “A cooperative approach to particle swarm optimization,” IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
  38. S. Gupta and K. Deep, “A novel random walk grey wolf optimizer,” Swarm and Evolutionary Computation, 2018.
  39. A. Auger and N. Hansen, “A restart CMA evolution strategy with increasing population size,” in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2, pp. 1769–1776, Edinburgh, UK, September 2005.
  40. J. Kumpiene, A. Lagerkvist, and C. Maurice, “Stabilization of Pb- and Cu-contaminated soil using coal fly ash and peat,” Environmental Pollution, vol. 145, no. 1, pp. 365–373, 2007.