Evolutionary Algorithms Applied to Antennas and Propagation: Emerging Trends and Applications
Research Article | Open Access
Sotirios K. Goudos, John N. Sahalos, "Design of Large Thinned Arrays Using Different Biogeography-Based Optimization Migration Models", International Journal of Antennas and Propagation, vol. 2016, Article ID 5359298, 11 pages, 2016. https://doi.org/10.1155/2016/5359298
Design of Large Thinned Arrays Using Different Biogeography-Based Optimization Migration Models
Abstract
Array thinning is a common discrete-valued combinatorial optimization problem, and evolutionary algorithms are suitable techniques for solving it. Biogeography-Based Optimization (BBO), which is inspired by the science of biogeography, is a stochastic population-based evolutionary algorithm (EA). The original BBO uses a linear migration model to describe how species migrate from one island to another; other, nonlinear migration models have been proposed in the literature. In this paper, we apply BBO with four different migration models to five different large array design cases and compare the results with those of five other popular algorithms. The problem dimensions range from 150 to 300. The results show that BBO with the sinusoidal migration model generally performs better than the other algorithms. However, these results are indicative and do not generally apply to all optimization problems in antenna design.
1. Introduction
Array thinning is a common discrete-valued combinatorial optimization problem. By array thinning we mean the removal (turning "off") of some radiating elements from a periodic array antenna in order to create an array with a lower sidelobe level than that obtained with uniform excitation. The elements connected to the feed network are turned "on," while the turned "off" elements are connected to a matched load. Array thinning results in a reduction in cost and weight. An exhaustive search over all possible combinations would find the best array design; however, the computational cost increases exponentially with the array size. Therefore, array thinning can be categorized as a discrete, combinatorial NP-complete optimization problem [1]. EAs are suitable techniques for solving the array thinning problem, which has been addressed in the literature using different EAs like genetic algorithms (GAs) [2], Particle Swarm Optimization (PSO) [3], and Ant Colony Optimization (ACO) [4]. The original PSO is inherently real-valued and operates only in continuous spaces. In order to solve binary-coded combinatorial optimization problems with PSO, several binary versions of this algorithm have been proposed. Binary PSO (BPSO), proposed by Kennedy and Eberhart, extends the original PSO algorithm using a sigmoid function (or S-shaped transfer function) to map real numbers to bits [5]. In [6] a new BPSO that uses a V-shaped transfer function is proposed (VBPSO). The results in [6] show that VBPSO outperforms the original BPSO and other BPSO variants with other transfer functions. Differential evolution (DE) [7, 8] is a population-based stochastic global optimization algorithm, which has been used in several real-world engineering problems. Several DE variants or strategies exist. Oppositional differential evolution (ODE) [9] is a DE variant based on opposition-based learning (OBL) [10] concepts.
Harmony Search (HS) [11] is an evolutionary algorithm inspired by the way that musicians experiment and change the pitches of their instruments to improvise better harmonies. HS has been applied successfully to antenna array synthesis problems [12, 13]. Ant Colony Optimization (ACO) is a population-based metaheuristic introduced by Dorigo et al. [14] and inspired by the behavior of real ants. The authors in [4, 15] have used ACO for thinned array design.
Biogeography-Based Optimization (BBO) [16] is an EA based on mathematical models that describe how species migrate from one island to another, how new species arise, and how species become extinct. The way the problem solution is found is analogous to nature's way of distributing species. Ma [17] showed that sinusoidal migration models generally outperform linear migration models like the one in the original BBO algorithm. The authors in [18] extended the migration model performance analysis and proposed two nonlinear migration models, which they call model 7 and model 8. BBO has been applied to design problems in electromagnetics, including Yagi-Uda synthesis [19], microstrip antenna design [20, 21], and antenna array synthesis [22–24].
The purpose of this paper is to use BBO with different migration models in order to design large thinned arrays. To the best of our knowledge this is the first time that BBO with a migration model other than the linear one is applied to an antenna design problem. This paper contributes a comparison of the performance of several state-of-the-art EAs on high-dimensional thinned array design problems; moreover, the design cases presented here can be used as a framework of benchmark functions for testing evolutionary algorithms on large array design problems. More specifically, we compare the original BBO linear migration model with the sinusoidal model [17] and models 7 and 8 [18], as well as with two BPSO variants, binary HS, ODE, and ACO. We apply the algorithms to five different design cases of linear and planar arrays. A comparative study of the BBO variants' performance on benchmark functions is also given, along with a study of the influence of the boundary constraint handling method on BBO performance. To the best of our knowledge this is the first time that such a study is carried out on BBO. Numerical results show that BBO with the sinusoidal migration model generally performs as well as or better than the other migration models and outperforms the original BBO algorithm in terms of solution accuracy. The results also show that VBPSO outperforms the initial BPSO algorithm; however, its performance is inferior to that of the BBO algorithms.
This paper is organized as follows. We describe the problem formulation in Section 2. A brief description of the EAs used in this paper is given in Section 3. In Section 4 we present the numerical results. Finally, the conclusion is given in Section 5.
2. Formulation
We consider an $N$-element linear array of isotropic elements. The array factor is expressed as
$$AF(u) = \sum_{n=1}^{N} a_n e^{jk(n-1)d(u-u_0)}, \qquad k = \frac{2\pi}{\lambda} \tag{1}$$
where $\lambda$ is the wavelength, $a_n$ is the complex excitation of the $n$th element, $\mathbf{a} = (a_1, a_2, \ldots, a_N)$ is the corresponding vector of element amplitudes, $d$ is the distance between two adjacent elements, $u = \sin\theta$ is the direction cosine, and $\theta_0$ (with $u_0 = \sin\theta_0$) is the steering angle measured from broadside of the array. In the thinned array case $a_n$ can only be 1 (turned "on") or 0 (turned "off").
For a symmetrically excited array (1) becomes
$$AF(u) = \begin{cases} 2\sum_{n=1}^{M} a_n \cos\left[k\left(n-\tfrac{1}{2}\right)d\,u\right], & N \text{ even} \\[4pt] a_0 + 2\sum_{n=1}^{M} a_n \cos(knd\,u), & N \text{ odd} \end{cases} \tag{2}$$
where $M = \lfloor N/2 \rfloor$ is the largest integer less than or equal to $N/2$. For an even number of elements $N = 2M$, and for an odd number of elements we set $N = 2M+1$ with $a_0$ the excitation of the center element. We assume that $d = \lambda/2$ for all cases. The fill factor percentage is the percentage of the ratio of the turned "on" elements to the total number of elements. The optimization goal is peak sidelobe level (SLL) suppression by finding the optimum element amplitudes. This design problem is therefore defined by the minimization of the objective function
$$F(\mathbf{a}) = \max_{u \in S} 20\log_{10}\frac{|AF(u)|}{\max_u |AF(u)|} \tag{3}$$
where $S$ is the set of direction cosines that are outside the angular range of the main lobe.
Additionally, we study the effect of maintaining the same aperture length as the original uniform array for this case. We therefore force the first and the last element to be one (turned on).
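To make the array factor and peak-SLL objective defined above concrete, they can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' code: the function names, the sampling grid, and the crude main-lobe exclusion rule are our own assumptions.

```python
import numpy as np

def array_factor(amps, u, d=0.5):
    """Array factor of an N-element linear array steered to broadside.

    amps : 0/1 element excitations (the thinning vector)
    u    : direction cosines sin(theta) at which to evaluate
    d    : element spacing in wavelengths (lambda/2 here, as in the paper)
    """
    n = np.arange(len(amps))
    # AF(u) = sum_n a_n * exp(j * 2*pi * d * n * u)
    return np.exp(1j * 2.0 * np.pi * d * np.outer(u, n)) @ amps

def peak_sll_db(amps, d=0.5, n_pts=2048):
    """Peak sidelobe level in dB relative to the main-lobe peak."""
    u = np.linspace(-1.0, 1.0, n_pts)
    af = np.abs(array_factor(amps, u, d))
    # crude main-lobe exclusion: first null of the full array at |u| = 1/(N*d)
    side = af[np.abs(u) > 1.0 / (len(amps) * d)]
    return 20.0 * np.log10(side.max() / af.max())

# a uniform 20-element array has a first sidelobe of roughly -13 dB
print(round(peak_sll_db(np.ones(20)), 1))
```

In the paper's design cases this same objective is evaluated for arrays of up to 300 elements, and the 0/1 thinning vector is what the optimizers search over.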
To test the algorithms' ability to design planar thinned arrays we consider a planar array, which lies on the x-y plane. The array factor of such an array is given by
$$AF(u,v) = \sum_{m=1}^{N_x}\sum_{n=1}^{N_y} a_{mn}\, e^{jk[(m-1)d_x u + (n-1)d_y v]} \tag{4}$$
where $d_x$, $d_y$ are the distances between two adjacent elements in the $x$ and $y$ directions, respectively, and $u = \sin\theta\cos\varphi$, $v = \sin\theta\sin\varphi$ are the direction cosines.
The optimization goal here is sidelobe level (SLL) suppression at two different $\varphi$-planes ($\varphi = 0°$, $\varphi = 90°$) for a desired fill factor percentage. This design problem is expressed by the minimization of the objective function
$$F(\mathbf{a}) = \max\left(SLL_{\varphi=0°},\, SLL_{\varphi=90°}\right) \tag{5}$$
where $SLL_{\varphi=0°}$, $SLL_{\varphi=90°}$ are the calculated peak sidelobe levels at the $\varphi=0°$ and $\varphi=90°$ planes, respectively.
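The planar array factor has a separable form that makes it cheap to evaluate along the two principal cuts. The sketch below is again illustrative; the helper name `planar_af` and the matrix layout are our own assumptions.

```python
import numpy as np

def planar_af(amps, u, v, dx=0.5, dy=0.5):
    """Array factor of a planar array lying on the x-y plane.

    amps   : (Nx, Ny) matrix of 0/1 excitations
    u, v   : direction cosines u = sin(theta)cos(phi), v = sin(theta)sin(phi)
    dx, dy : element spacings in wavelengths
    """
    nx, ny = amps.shape
    # the double sum over (m, n) factors into two phase vectors
    px = np.exp(1j * 2.0 * np.pi * dx * np.arange(nx) * u)
    py = np.exp(1j * 2.0 * np.pi * dy * np.arange(ny) * v)
    return px @ amps @ py

# the phi = 0 cut is AF(u, 0) and the phi = 90 cut is AF(0, v);
# at broadside the magnitude equals the number of "on" elements
print(abs(planar_af(np.ones((4, 4)), 0.0, 0.0)))
```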
3. Optimization Algorithms
3.1. Binary PSO
The binary PSO (BPSO) model was presented by Kennedy and Eberhart and is based on a very simple modification of the real-valued PSO [5]. In binary PSO, the particle positions belong to a $D$-dimensional binary space (bit strings of length $D$), while the particle velocities remain real-valued.
The $d$th coordinate of each particle's position is a bit, whose state is given by
$$x_{id} = \begin{cases} 1, & r_{id} < S(v_{id}) \\ 0, & \text{otherwise} \end{cases} \tag{6}$$
where $r_{id}$ is a uniformly distributed random number in $[0,1]$ and $S(\cdot)$ is a sigmoid limiting transformation that maps real numbers to the interval $(0,1)$. Such a function is defined by
$$S(v) = \frac{1}{1+e^{-v}} \tag{7}$$
which defines an S-shaped transfer function.
In [6] new BPSO variants that use V-shaped transfer functions are proposed (VBPSO). The BPSO variant that produces the best results according to [6] uses a transfer function of the form
$$T(v) = \left|\frac{2}{\pi}\arctan\left(\frac{\pi}{2}v\right)\right| \tag{8}$$
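The two transfer-function families lead to different bit-update rules: the S-shaped rule sets the new bit directly from a probability, while the V-shaped rule decides whether to flip the current bit. The sketch below illustrates both; the function names are our own, and the arctan form of the V-shaped function is an assumption.

```python
import numpy as np

def s_shaped(v):
    """Sigmoid (S-shaped) transfer function of the original BPSO [5]."""
    return 1.0 / (1.0 + np.exp(-v))

def v_shaped(v):
    """A V-shaped transfer function; the arctan variant is assumed here."""
    return np.abs((2.0 / np.pi) * np.arctan((np.pi / 2.0) * v))

def update_bits_s(v, rng):
    """S-shaped rule: the new bit is 1 with probability S(v), regardless of the old bit."""
    return (rng.random(v.shape) < s_shaped(v)).astype(int)

def update_bits_v(x, v, rng):
    """V-shaped rule: a large |v| flips the current bit, a small |v| keeps it."""
    flip = rng.random(v.shape) < v_shaped(v)
    return np.where(flip, 1 - x, x)
```

The V-shaped rule preserves a particle's current bits when the velocity is near zero, which is one intuition for the better exploitation reported in [6].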
3.2. Binary Harmony Search Algorithm
Harmony Search (HS) [11] is an evolutionary algorithm inspired by the way that musicians experiment and change the pitches of their instruments to improvise better harmonies. The main control parameter in HS is the Harmony Memory Size (HMS), which determines the number of solutions (harmonies) inside the HM and is equivalent to the population size in other algorithms. Another HS parameter is the Harmony Memory Consideration Rate (HMCR), which determines whether pitches (decision variables) should be selected from the HM or randomly from the predefined range. HMCR is a number between 0 and 1. An additional control parameter of HS is the Pitch Adjusting Rate (PAR), which determines the probability of adjusting the original value of the selected pitches from the HM.
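A single binary HS improvisation using these three parameters can be sketched as follows. This is an illustrative sketch, not the BHS implementation used in the paper; interpreting pitch adjustment as a bit flip in the binary case is our assumption.

```python
import numpy as np

def improvise(harmony_memory, hmcr=0.99, par=0.4, rng=None):
    """One improvisation step of a binary Harmony Search (illustrative sketch).

    Each bit is taken from a random harmony in memory with probability HMCR
    and drawn at random otherwise; memory-taken bits are pitch-adjusted
    (here: flipped) with probability PAR.
    """
    rng = rng if rng is not None else np.random.default_rng()
    hms, dim = harmony_memory.shape
    new = np.empty(dim, dtype=int)
    for j in range(dim):
        if rng.random() < hmcr:
            new[j] = harmony_memory[rng.integers(hms), j]   # memory consideration
            if rng.random() < par:                           # pitch adjustment
                new[j] = 1 - new[j]
        else:
            new[j] = rng.integers(2)                         # random selection
    return new
```

In a full run the new harmony replaces the worst member of the memory whenever it has a better objective value.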
3.3. Oppositional Differential Evolution
Differential evolution (DE) [7, 8] is a population-based stochastic global optimization algorithm, which has been used in several real-world engineering problems. Several DE variants or strategies exist. One of the advantages of DE is that very few control parameters have to be adjusted in each algorithm run. In [9] a new DE algorithm based on opposition-based learning (OBL) [10], oppositional differential evolution (ODE), was introduced. The basic idea of OBL is to calculate not only the fitness of the current individual but also the fitness of the opposite individual; the algorithm then selects the individual with the better fitness value. The benefits of using such a technique are that convergence may be faster and a better approximation of the global optimum can be found. In addition to the standard DE parameters, ODE uses a control parameter called the jumping rate $J_r$, which controls, in each generation, whether the opposite population is created or not.
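The opposite of a point in a bounded search space is its reflection through the midpoint of the bounds, and the OBL step keeps the fitter of each pair. A minimal sketch (function names are our own; minimization is assumed):

```python
import numpy as np

def opposite(x, lo, hi):
    """OBL opposite point: for each variable, x_opp = lo + hi - x."""
    return lo + hi - x

def opposition_step(pop, lo, hi, fitness):
    """Core OBL selection used by ODE at jumping-rate-controlled generations:
    evaluate the population together with its opposite population and keep
    the fitter half (minimization assumed)."""
    both = np.concatenate([pop, opposite(pop, lo, hi)])
    scores = np.array([fitness(ind) for ind in both])
    keep = np.argsort(scores)[: len(pop)]
    return both[keep]
```

In ODE this step is executed with probability $J_r$ per generation; otherwise a standard DE generation (mutation, crossover, selection) is performed.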
3.4. Ant Colony Optimization
Ant Colony Optimization (ACO) [14, 25, 26] is a metaheuristic inspired by the foraging behavior of ants. At the core of this behavior is the indirect communication between the ants by means of chemical pheromone trails, which enables them to find short paths between their nest and food sources. Ants can sense pheromone, and when deciding which path to follow they tend to choose the paths with stronger pheromone intensities on the way back to the nest or to the food source. Therefore, shorter paths accumulate more pheromone than longer ones. This feature of real ant colonies is exploited in ACO algorithms in order to solve combinatorial optimization problems considered to be NP-hard.
3.5. BiogeographyBased Optimization
The mathematical models of biogeography are based on the work of Robert MacArthur and Edward Wilson in the early 1960s. Using this model, it was possible to predict the number of species in a habitat. The habitat is an area that is geographically isolated from other habitats. The geographical areas that are well suited as residences for biological species are said to have a high habitat suitability index (HSI). Therefore, every habitat is characterized by its HSI, which depends on factors like rainfall, diversity of vegetation, diversity of topographic features, land area, and temperature. Each of the features that characterize habitability is known as a suitability index variable (SIV). The SIVs are independent variables while the HSI is the dependent variable.
Therefore, a solution to a $D$-dimensional problem can be represented as a vector of SIV variables $\mathbf{x} = (SIV_1, SIV_2, \ldots, SIV_D)$, which is a habitat or island. The HSI value of a habitat is the value of the objective function that corresponds to that solution and it is found by
$$HSI = f(\mathbf{x}) = f(SIV_1, SIV_2, \ldots, SIV_D) \tag{9}$$
Habitats with a high HSI are good solutions of the objective function, while poor solutions are those habitats with a low HSI. The immigration rate $\lambda_k$ and the emigration rate $\mu_k$ are functions of the rank $k$ of the given candidate solution, which represents the number of species in a habitat. These are given by the following:

(1) Linear migration model (original BBO):
$$\lambda_k = I\left(1-\frac{k}{n}\right), \qquad \mu_k = E\,\frac{k}{n} \tag{10}$$

(2) Sinusoidal migration model [17]:
$$\lambda_k = \frac{I}{2}\left(\cos\frac{k\pi}{n}+1\right), \qquad \mu_k = \frac{E}{2}\left(-\cos\frac{k\pi}{n}+1\right) \tag{11}$$

(3) Model 7 [18] and (4) Model 8 [18]: the two nonlinear migration models ((12) and (13)) proposed in [18]; see [18] for their closed-form expressions.

$I$ is the maximum possible immigration rate, $E$ is the maximum possible emigration rate, $k$ is the rank of the given candidate solution, and $n$ is the maximum number of species (e.g., the population size). The rank of the given candidate solution, or the number of species, is obtained by sorting the solutions from most fit to least fit according to the HSI value (e.g., fitness). BBO uses both mutation and migration operators. The application of these operators to each SIV in each solution is decided probabilistically. The mutation rate of a possible solution is defined to be inversely proportional to the solution probability and is given by
$$m_k = m_{\max}\left(1-\frac{P_k}{P_{\max}}\right) \tag{14}$$
where $P_k$ is the probability that a habitat contains $k$ species, $P_{\max} = \max_k P_k$, and $m_{\max}$ is a user-defined parameter. As with other evolutionary algorithms, BBO also incorporates elitism. This is implemented with a user-selected elitism parameter $p$: the best $p$ habitats remain from one generation to the next. The BBO algorithm is outlined below:
(1) Initialize the BBO control parameters. Map the problem solutions to habitats (vectors). Set the habitat modification probability $P_{\mathrm{mod}}$, the maximum immigration rate $I$, the maximum emigration rate $E$, the maximum mutation rate $m_{\max}$, and the elitism parameter $p$ (if elitism is desired).
(2) Initialize a random population of habitats from a uniform distribution. Set the number of generations to one.
(3) Evaluate the objective function value for each antenna array of the population.
(4) Map the objective function value to the number of species $k$, the immigration rate $\lambda_k$, and the emigration rate $\mu_k$ for each solution (antenna array) of the population.
(5) Apply the migration operator to each nonelite habitat based on the immigration and emigration rates of the linear, sinusoidal, or nonlinear migration model.
(6) Update the species count probability using
$$\frac{dP_k}{dt} = -(\lambda_k+\mu_k)P_k + \lambda_{k-1}P_{k-1} + \mu_{k+1}P_{k+1} \tag{15}$$
(7) Apply the mutation operator according to (14).
(8) Sort the population according to the objective function value from best to worst.
(9) Apply elitism by replacing the worst habitats of the current generation with the best habitats of the previous one.
(10) Repeat from step (3) until the maximum number of generations is reached.
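The linear and sinusoidal migration rate models described above, together with one migration sweep, can be sketched as follows. This is an illustrative sketch: the rank convention (rank 1 = least fit, rank $n$ = most fit) and the roulette-wheel choice of the emigrating habitat are our own assumptions, and models 7 and 8 of [18] are deliberately not reproduced.

```python
import numpy as np

def migration_rates(n, model="sinusoidal", I=1.0, E=1.0):
    """Immigration (lam) and emigration (mu) rates for ranks k = 1..n.

    Rank 1 is the least fit habitat here and rank n the most fit, so the best
    solution has the lowest immigration and the highest emigration rate.
    """
    k = np.arange(1, n + 1)
    if model == "linear":                # original BBO
        lam = I * (1.0 - k / n)
        mu = E * k / n
    elif model == "sinusoidal":          # Ma's model [17]
        lam = (I / 2.0) * (np.cos(k * np.pi / n) + 1.0)
        mu = (E / 2.0) * (-np.cos(k * np.pi / n) + 1.0)
    else:
        raise ValueError("models 7 and 8 are defined in [18] and not reproduced here")
    return lam, mu

def migrate(pop, lam, mu, rng):
    """One BBO migration sweep: each SIV of each habitat immigrates with
    probability lam[i]; the source habitat is drawn in proportion to mu."""
    new = pop.copy()
    probs = mu / mu.sum()
    for i in range(len(pop)):
        for j in range(pop.shape[1]):
            if rng.random() < lam[i]:
                src = rng.choice(len(pop), p=probs)
                new[i, j] = pop[src, j]
    return new
```

Because migration copies SIVs directly between habitats, a binary population stays binary, which is the property discussed in Section 4 that makes BBO attractive for thinning problems.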
4. Numerical Results
4.1. Comparison with Test Functions
In order to evaluate the different BBO migration models, we first use a set of test problems from the literature. In this paper, the test problems consist of eight well-known benchmark functions: two unimodal and six multimodal. These are expressed as follows [27, 28].
(A) Unimodal Functions
(1) Sphere Function
$$f_1(\mathbf{x}) = \sum_{i=1}^{D} x_i^2 \tag{16}$$
(2) Rosenbrock's Function
$$f_2(\mathbf{x}) = \sum_{i=1}^{D-1}\left[100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right] \tag{17}$$
(B) Multimodal Functions
(3) Ackley's Function
$$f_3(\mathbf{x}) = -20\exp\left(-0.2\sqrt{\frac{1}{D}\sum_{i=1}^{D}x_i^2}\right)-\exp\left(\frac{1}{D}\sum_{i=1}^{D}\cos(2\pi x_i)\right)+20+e \tag{18}$$
(4) Griewank's Function
$$f_4(\mathbf{x}) = \frac{1}{4000}\sum_{i=1}^{D}x_i^2-\prod_{i=1}^{D}\cos\left(\frac{x_i}{\sqrt{i}}\right)+1 \tag{19}$$
(5) Weierstrass Function
$$f_5(\mathbf{x}) = \sum_{i=1}^{D}\left(\sum_{k=0}^{k_{\max}} a^k\cos\left(2\pi b^k\left(x_i+0.5\right)\right)\right)-D\sum_{k=0}^{k_{\max}} a^k\cos\left(\pi b^k\right), \quad a=0.5,\ b=3,\ k_{\max}=20 \tag{20}$$
(6) Rastrigin's Function
$$f_6(\mathbf{x}) = \sum_{i=1}^{D}\left[x_i^2-10\cos(2\pi x_i)+10\right] \tag{21}$$
(7) Noncontinuous Rastrigin's Function
$$f_7(\mathbf{x}) = \sum_{i=1}^{D}\left[y_i^2-10\cos(2\pi y_i)+10\right], \qquad y_i = \begin{cases} x_i, & |x_i| < 0.5 \\ \operatorname{round}(2x_i)/2, & |x_i| \ge 0.5 \end{cases} \tag{22}$$
(8) Schwefel's Function
$$f_8(\mathbf{x}) = 418.9829\,D-\sum_{i=1}^{D}x_i\sin\left(\sqrt{|x_i|}\right) \tag{23}$$
All algorithms are executed 50 times and the results are compared. The population size is set to 100 and the maximum number of generations is set to 2000 iterations. The problem dimension is set to . For all migration models the habitat modification probability $P_{\mathrm{mod}}$ is set to 1, and the maximum mutation rate $m_{\max}$ is set equal to 0.005. The maximum immigration rate $I$ and the maximum emigration rate $E$ are both set to one. The elitism parameter is set to two. Table 1 reports the comparative results in terms of mean and standard deviation values. We notice that BBO with the sinusoidal migration model performs better in 5 out of the 8 test functions. Moreover, in order to compare the algorithms' performance on all problems, we have conducted the Friedman test [29]. Table 2 shows the average ranking of all algorithms, with the highest ranking shown in bold. The best average ranking was obtained by BBO with the sinusoidal migration model, which outperforms the other models.
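For reference, a few of the benchmark functions above can be written directly in NumPy (standard textbook forms; variable bounds are omitted here):

```python
import numpy as np

def sphere(x):
    return float(np.sum(x**2))                      # global minimum 0 at x = 0

def rosenbrock(x):
    return float(np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1.0)**2))

def rastrigin(x):
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):
    d = len(x)
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d) + 20.0 + np.e)

def schwefel(x):
    # minimum near x_i = 420.9687, where the value is approximately 0
    return float(418.9829 * len(x) - np.sum(x * np.sin(np.sqrt(np.abs(x)))))
```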


4.2. Comparison with Array Thinning Problems
We compare the four BBO migration models' performance with that of the two BPSO variants, BHS, ODE, and ACO on different thinned array design cases. All algorithms are executed 20 times and the results are compared. The population size is set to 200 and the maximum number of generations is set to 1000 iterations. For all migration models the habitat modification probability $P_{\mathrm{mod}}$ is set to 1, and the maximum mutation rate $m_{\max}$ is set equal to 0.005. The maximum immigration rate $I$ and the maximum emigration rate $E$ are both set to one. The elitism parameter is set to two. In both BPSO algorithms the learning factors $c_1$, $c_2$ are both set equal to two, and the inertia weight is linearly decreased from 0.9 to 0.4 as in [6]. For BHS the HMCR is set to 0.99 and the PAR is set to 0.4. For ODE the jumping rate is set to 0.3. For ACO the pheromone update constant is set to 20, the exploration constant to 1, the global pheromone decay rate to 0.9, the local pheromone decay rate to 0.5, the pheromone sensitivity to 1, and the visibility sensitivity to 5.
The first case is that of a symmetrically excited 300-element linear array. The total number of unknowns for this case is 150. Table 3 reports the comparative results. We notice that the sinusoidal model and model 7 outperform the other algorithms; both have obtained the same best value. Both BPSO variants perform worse than the BBO algorithms, while ACO and VBPSO perform similarly. The convergence rate graph for this case is depicted in Figure 1. All BBO models converge at similar speed, faster than the other algorithms, and VBPSO converges faster than BPSO. It must be pointed out that the BBO algorithms converge to their final values in fewer than 300 iterations. The radiation pattern of the best obtained result is shown in Figure 2. The best array found is 72% filled and has a peak SLL of −24.67 dB. Additionally, we study the effect of maintaining the same aperture length as the original uniform array by forcing the first and the last elements to be one (turned on). The comparative results for this case are shown in Table 4. The sinusoidal model presents the best performance except for the best value found, where model 7 outperforms the others. VBPSO is competitive with the BBO algorithms for this case and achieves better performance than BBO with model 8, while ODE performs better than ACO and BHS. Figure 3 shows the convergence rate for this case. All BBO algorithms converge at similar speeds, while the PSO algorithms converge more slowly. The best obtained radiation pattern is shown in Figure 4. The obtained peak SLL value of −24.70 dB is slightly lower than that of the previous case, and the array filling percentage is again 72%. We notice that for this case there is no significant difference between the unconstrained array and the array with the same aperture size.


Next, we consider asymmetric array designs in which the problem dimension is 300, thereby evaluating the algorithms' ability to solve high-dimensional problems. It must be pointed out that although the number of unknowns has doubled compared with the symmetric case, the population size and number of iterations remain the same as before; the level of difficulty therefore increases for all algorithms. Table 5 reports the comparative results for this case. The performance differences between the algorithms are clearer here. The sinusoidal model clearly outperforms the others, while the results obtained by model 7 and the linear model are quite similar. The VBPSO results are again better than those of BPSO, BHS, ODE, and ACO but worse than those of the BBO algorithms. Figure 5 shows the convergence rate graph for this case. Model 7 seems to converge slightly faster than the other algorithms. The radiation pattern of the best obtained array is shown in Figure 6. The obtained array SLL is −26.11 dB and the fill percentage is about 72%.

In order to further evaluate the algorithms' performance we again choose an asymmetric array case with the same aperture length. Table 6 holds the comparative results, which show that the sinusoidal model clearly outperforms the other algorithms. We notice that the VBPSO and BPSO results are worse than those of the BBO algorithms, and BHS is competitive with the VBPSO algorithm for this case. The convergence rate graph of Figure 7 shows that models 7 and 8 converge slightly faster than the linear and sinusoidal models for this case. Again it is obvious that the BBO algorithms require fewer iterations than the other algorithms to reach their final values. Figure 8 presents the radiation pattern of the best obtained array. The peak SLL is −26.08 dB and the filling percentage is 73%. The results are quite similar to those of the previous case, with the peak SLL slightly higher.

The final example is that of a planar thinned array. All elements are equally spaced along the x- and y-axes at a half-wavelength distance. Again, a population of 200 vectors is selected for all algorithms, and the total number of generations is set to 500. Table 7 holds the comparative results for the planar array case. The sinusoidal model obtains the best value; however, the model 7 results are better than those of the other algorithms in terms of mean value. The convergence rate graph for this case is shown in Figure 9. The BBO algorithms seem to converge at similar speed, faster than the other algorithms. The 3D radiation pattern of the best obtained array is shown in Figure 10(a), and Figure 10(b) shows the array pattern at the two φ-planes. For φ = 0° the PSLL is −33.96 dB, while for φ = 90° the PSLL is −33.85 dB. The filling percentage is 47.6%.

Overall, BBO with the sinusoidal model ranked first regarding mean and standard deviation values in 3 out of the 5 array design cases presented, and it produced the best result in 4 out of the 5 cases. The results obtained by BBO with model 7 outperformed the linear model in 2 out of the 5 cases. In all cases BBO with migration model 8 seems to perform worse than the other models. VBPSO outperformed the original BPSO in all cases and BBO with migration model 8 in 2 out of the 5 cases. In order to compare the algorithms' performance on all problems, we have conducted the Friedman test [29]. Table 8 shows the average ranking of all algorithms, with the highest ranking shown in bold. The best average ranking was obtained by BBO with the sinusoidal migration model, which outperforms the other algorithms. VBPSO and BHS perform similarly, while ODE outperforms ACO.

Similar to other evolutionary algorithms (EAs) such as differential evolution (DE), Harmony Search (HS), and Particle Swarm Optimization (PSO), BBO provides a way of sharing information between solutions [16]. This feature makes BBO suitable for the same types of problems that the other algorithms are used for, namely, high-dimensional problems. Additionally, BBO has some features that are unique among evolutionary algorithms. For example, quite differently from DE and PSO, from one generation to the next the set of BBO's solutions is maintained and improved using the migration model, where the emigration and immigration rates are determined by the fitness of each solution. BBO also differs from PSO in that PSO solutions do not change directly; their velocities change, whereas BBO solutions share their attributes directly through the migration models. These differences can make BBO outperform other algorithms [16, 17, 30]. It must be pointed out that if PSO or DE is constrained to a discrete space, then the next generation will not necessarily be discrete [30]. This is not the case for BBO: if BBO is constrained to a discrete space, then the next generation will also be discrete in the same space. As the authors in [30] suggest, this indicates that BBO could perform better than other EAs on combinatorial optimization problems, which makes it suitable for application to antenna array thinning problems.
4.3. Study of Boundary Constraint Handling Methods
In this subsection, we apply different boundary constraint handling methods to BBO with the sinusoidal model in order to determine whether the settings used in the previous sections could be improved. The methods that we test include the following [31, 32].
(1) Reflection Method
$$x'_i = \begin{cases} 2l_i - x_i, & x_i < l_i \\ 2u_i - x_i, & x_i > u_i \end{cases} \tag{24}$$
where $x'_i$ is a valid value, $x_i$ is the value which violates the bound constraint, and $l_i$ and $u_i$ are the lower and upper bounds for the $i$th variable, respectively.
(2) Projection Method
$$x'_i = \min\left(\max\left(x_i, l_i\right), u_i\right) \tag{25}$$
In this case, the variables that violate the bound constraints are trimmed to the lower or upper bound, respectively.
(3) Reinitialization by Position. In this case each variable that violates the constraints is randomly reinitialized with
$$x'_i = l_i + r\left(u_i - l_i\right) \tag{26}$$
where $r$ is a uniformly distributed random number between 0 and 1.
(4) Reinitialize All. In this case, if at least one of the solution variables violates the boundaries, a completely new vector is generated within the allowed boundaries using (26).
(5) Conservatism. This technique was proposed particularly for DE. In this case, the infeasible solution is rejected and replaced by the original feasible vector.
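The first three methods above can be sketched compactly with NumPy. This is an illustrative sketch with our own function names; scalar bounds are assumed for brevity, and conservatism is omitted since it only requires keeping the original feasible vector.

```python
import numpy as np

def reflect(x, lo, hi):
    """Reflection: fold a violating value back across the violated bound."""
    x = np.where(x < lo, 2.0 * lo - x, x)
    x = np.where(x > hi, 2.0 * hi - x, x)
    return np.clip(x, lo, hi)   # guard in case a value still overshoots after one fold

def project(x, lo, hi):
    """Projection: trim violating values to the nearest bound."""
    return np.clip(x, lo, hi)

def reinit_by_position(x, lo, hi, rng):
    """Reinitialization by position: redraw only the violating variables."""
    bad = (x < lo) | (x > hi)
    return np.where(bad, lo + rng.random(x.shape) * (hi - lo), x)
```

"Reinitialize all" is the same redraw applied to the whole vector whenever any variable violates its bounds.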
The test problem we use in all cases is a symmetrically thinned array with 100 elements. We run 50 independent trials for each boundary condition setting. The population size is set to 200 and the number of iterations to 1000. The best value, the worst value, the mean, and the standard deviation at the last generation are presented. Table 9 holds the comparative results for all methods. We notice that the projection method has obtained the best mean, worst, and standard deviation values, while the conservatism method has obtained the best objective function value. The mean value results for all methods are close, while the best obtained values differ.

5. Conclusion
In this paper, we have addressed the problem of designing thinned arrays using the BBO algorithm. We have compared the performance of four different BBO migration models with that of other popular EAs. The results showed that BBO with the sinusoidal model is highly competitive for the array thinning problem and in general outperformed the other migration models. All the BBO algorithms converge faster and produce better results than the other algorithms. However, these results are indicative and cannot be generalized to all array design problems; further tests should be carried out on other array design problems. The BBO algorithm is a powerful and efficient optimizer, especially for combinatorial optimization problems. In our future work, we will further study the performance of the BBO migration models on different antenna design problems.
Competing Interests
The authors declare that they have no competing interests.
References
[1] R. Jain and G. S. Mani, "Solving 'antenna array thinning problem' using genetic algorithm," Applied Computational Intelligence and Soft Computing, vol. 2012, Article ID 946398, 14 pages, 2012.
[2] R. L. Haupt, "Thinned arrays using genetic algorithms," IEEE Transactions on Antennas and Propagation, vol. 42, no. 7, pp. 993–999, 1994.
[3] N. Jin and Y. Rahmat-Samii, "Advances in particle swarm optimization for antenna designs: real-number, binary, single-objective and multiobjective implementations," IEEE Transactions on Antennas and Propagation, vol. 55, no. 3, pp. 556–567, 2007.
[4] Ó. Quevedo-Teruel and E. Rajo-Iglesias, "Ant colony optimization in thinned array synthesis with minimum sidelobe level," IEEE Antennas and Wireless Propagation Letters, vol. 5, no. 1, pp. 349–352, 2006.
[5] J. Kennedy and R. C. Eberhart, "A discrete binary version of the particle swarm algorithm," in Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics: Computational Cybernetics and Simulation, pp. 4104–4108, IEEE, Orlando, Fla, USA, 1997.
[6] S. Mirjalili and A. Lewis, "S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization," Swarm and Evolutionary Computation, vol. 9, pp. 1–14, 2013.
[7] R. Storn and K. Price, "Differential evolution—a simple and efficient adaptive scheme for global optimization over continuous spaces," 1995, http://citeseer.ist.psu.edu/article/storn95differential.html.
[8] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[9] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.
[10] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA '05) and the International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), vol. 1, pp. 695–701, Vienna, Austria, November 2005.
[11] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[12] S.-H. Yang and J.-F. Kiang, "Optimization of sparse linear arrays using harmony search algorithms," IEEE Transactions on Antennas and Propagation, vol. 63, no. 11, pp. 4732–4738, 2015.
[13] K. Guney and M. Onay, "Optimal synthesis of linear antenna arrays using a harmony search algorithm," Expert Systems with Applications, vol. 38, no. 12, pp. 15455–15462, 2011.
[14] M. Dorigo, V. Maniezzo, and A. Colorni, "Ant system: optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 26, no. 1, pp. 29–41, 1996.
[15] E. Rajo-Iglesias and Ó. Quevedo-Teruel, "Linear array synthesis using an ant-colony-optimization-based algorithm," IEEE Antennas and Propagation Magazine, vol. 49, no. 2, pp. 70–79, 2007.
[16] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[17] H. Ma, "An analysis of the equilibrium of migration models for biogeography-based optimization," Information Sciences, vol. 180, no. 18, pp. 3444–3464, 2010.
[18] W. Guo, L. Wang, and Q. Wu, "An analysis of the migration rates for biogeography-based optimization," Information Sciences, vol. 254, pp. 111–140, 2014.
[19] U. Singh, H. Kumar, and T. S. Kamal, "Design of Yagi-Uda antenna using biogeography based optimization," IEEE Transactions on Antennas and Propagation, vol. 58, no. 10, pp. 3375–3379, 2010.
[20] M. R. Lohokare, S. S. Pattnaik, S. Devi, K. M. Bakwad, and J. G. Joshi, "Parameter calculation of rectangular microstrip antenna using biogeography-based optimization," in Proceedings of the Applied Electromagnetics Conference (AEMC '09), pp. 1–4, Kolkata, India, December 2009.
[21] M. R. Lohokare, S. S. Pattnaik, S. Devi, B. K. Panigrahi, K. M. Bakwad, and J. G. Joshi, "Modified BBO and calculation of resonant frequency of circular microstrip antenna," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 487–492, IEEE, Coimbatore, India, December 2009.
[22] U. Singh, H. Kumar, and T. S. Kamal, "Linear array synthesis using biogeography based optimization," Progress in Electromagnetics Research M, vol. 11, pp. 25–36, 2010.
[23] A. Sharaqa and N. Dib, "Design of linear and circular antenna arrays using biogeography based optimization," in Proceedings of the 1st IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT '11), pp. 1–6, Amman, Jordan, December 2011.
[24] S. K. Goudos, K. B. Baltzis, K. Siakavara, T. Samaras, E. Vafiadis, and J. N. Sahalos, "Reducing the number of elements in linear arrays using biogeography-based optimization," in Proceedings of the 6th European Conference on Antennas and Propagation (EuCAP '12), pp. 1615–1618, Prague, Czech Republic, March 2012.
[25] M. Dorigo and L. M. Gambardella, "Ant colonies for the travelling salesman problem," BioSystems, vol. 43, no. 2, pp. 73–81, 1997.
[26] M. Dorigo and T. Stützle, Ant Colony Optimization, The MIT Press, Cambridge, Mass, USA, 2004.
 E. MezuraMontes, J. VelázquezReyes, and C. A. Coello Coello, “A comparative study of differential evolution variants for global optimization,” in Proceedings of the 8th Annual Genetic and Evolutionary Computation Conference (GECCO '06), pp. 485–492, Seattle, Wash, USA, July 2006. View at: Google Scholar
 J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, “Comprehensive learning particle swarm optimizer for global optimization of multimodal functions,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006. View at: Publisher Site  Google Scholar
 S. García, A. Fernández, J. Luengo, and F. Herrera, “Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: experimental analysis of power,” Information Sciences, vol. 180, no. 10, pp. 2044–2064, 2010. View at: Publisher Site  Google Scholar
 H. Ma, D. Simon, M. Fei, and Z. Chen, “On the equivalences and differences of evolutionary algorithms,” Engineering Applications of Artificial Intelligence, vol. 26, no. 10, pp. 2397–2407, 2013. View at: Publisher Site  Google Scholar
 J. Arabas, A. Szczepankiewicz, and T. Wroniak, “Experimental comparison of methods to handle boundary constraints in differential evolution,” in Parallel Problem Solving from Nature, PPSN XI: 11th International Conference, Kraków, Poland, September 11–15, 2010, Proceedings, Part II, R. Schaefer, C. Cotta, J. Kołodziej, and G. Rudolph, Eds., pp. 411–420, Springer, Berlin, Germany, 2010. View at: Google Scholar
 E. JuarezCastillo, N. PerezCastro, and E. MezuraMontes, “A novel boundary constrainthandling technique for constrained numerical optimization problems,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '15), pp. 2034–2041, Sendai, Japan, May 2015. View at: Publisher Site  Google Scholar
Copyright
Copyright © 2016 Sotirios K. Goudos and John N. Sahalos. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.