Bioinspired Computation and Its Applications in Operation Management
Peng Wang, Zhouquan Zhu, Shuai Huang, "SevenSpot Ladybird Optimization: A Novel and Efficient Metaheuristic Algorithm for Numerical Optimization", The Scientific World Journal, vol. 2013, Article ID 378515, 11 pages, 2013. https://doi.org/10.1155/2013/378515
SevenSpot Ladybird Optimization: A Novel and Efficient Metaheuristic Algorithm for Numerical Optimization
Abstract
This paper presents a novel biologically inspired metaheuristic algorithm called sevenspot ladybird optimization (SLO). The SLO is inspired by recent discoveries on the foraging behavior of the sevenspot ladybird. In this paper, the performance of the SLO is compared with those of the genetic algorithm, particle swarm optimization, and artificial bee colony algorithms by using five numerical benchmark functions with multimodality. The results show that SLO has the ability to find the best solution with a comparatively small population size and is suitable for solving optimization problems with lower dimensions.
1. Introduction
In recent years, heuristic algorithms have gained popularity because of their ability to find, within reasonable computation time, near-optimal solutions to problems that analytical methods cannot solve owing to the multimodality or high dimensionality of their objective functions [1]. Heuristic algorithms are usually developed to solve a specific problem. There is also a class of heuristic algorithms that can be applied to a large class of problems either directly or with minor modifications, hence the name metaheuristic algorithms [2].
Researchers continue to develop many metaheuristic algorithms. Some of the most successful metaheuristic algorithms include genetic algorithm (GA) [3], ant colony optimization [4], particle swarm optimization (PSO) [5], and artificial bee colony (ABC) [6]. Some of the recently proposed metaheuristic algorithms include cuckoo search [7], monkey search [8], firefly algorithm [9], grenade explosion method [10], cat swarm optimization [11], and the artificial chemical reaction optimization algorithm [12]. The majority of these algorithms are biologically inspired; that is, they mimic nature for problem solving.
Metaheuristic algorithms are widely used in different fields and problems, such as manufacturing [13], services [14], scheduling [15], transportation [16], health [17], sports [18], geology [19], and astronomy [20]. A single metaheuristic algorithm that can solve all optimization problems of different types and structures does not exist, and, thus, new metaheuristic optimization algorithms are continuously developed [21].
This paper introduces a novel biologically inspired metaheuristic algorithm called sevenspot ladybird optimization (SLO). SLO is inspired by the foraging behavior of the sevenspot ladybird. This paper presents the basic concepts and main steps of the SLO and demonstrates its efficiency. The performance of the SLO is compared with that of some popular metaheuristic algorithms, namely GA, PSO, and ABC, by using five classical benchmark functions at several dimensionalities, as given in [6, 22]. The simulation results show that the SLO is able to escape local minima and is efficient for some multivariable, multimodal function optimizations.
In general, all metaheuristic algorithms have something in common: they are population-based search methods. These methods move from one set of points (a population) to another in a single iteration, with likely improvement, using a combination of deterministic and probabilistic rules. The most remarkable difference among these metaheuristic algorithms lies in their updating rules. The GA is inspired by the principles of genetics and evolution and mimics the reproduction behavior observed in biological populations. The GA employs the principle of “survival of the fittest” in its search process to select and generate individuals that are adapted to their environment. In PSO, instead of using genetic operators, each particle adjusts its “flying” according to its own flying experience and its companions’ flying experience [23]. ABC uses a minimal model that mimics the foraging behavior of a bee colony comprising employed bees, onlooker bees, and scouts [24]. The bees aim to discover food sources with a high amount of nectar (good fitness values). In contrast, the SLO proposed in this paper attempts to simulate the foraging behavior of the sevenspot ladybird, which has rarely been studied in the field of metaheuristic algorithms. The SLO algorithm consists of three essential components: dividing patches, searching for food, and dispersal. Dividing patches increases the search efficiency, and dispersal progressively reduces the search space. The search strategy in our algorithm is classified into extensive search and intensive search. Extensive search overcomes the weakness of local search, and intensive search increases the possibility of finding the latent best solution. All these ideas are inspired by recent discoveries on the foraging behavior of the sevenspot ladybird and are quite different from those of other metaheuristic algorithms.
The rest of this paper is organized as follows. Section 2 presents the foraging behavior of the sevenspot ladybird. Section 3 describes the SLO and the steps in detail. Section 4 discusses the experiments and the results. Section 5 draws the conclusions.
2. SevenSpot Ladybird Foraging Behaviors
The sevenspot ladybird (Figure 1), Coccinella septempunctata, is a common, easily recognizable insect that has attracted a considerable amount of interest from professional entomologists.
Recent studies have shown that sevenspot ladybirds are more social than previously believed [25–28]. Sevenspot ladybirds use different kinds of pheromones at different stages of their life history, such as the egg, larval, pupal, and adult stages (Figure 2). The chemical ecology of sevenspot ladybirds, with special attention to the semiochemicals involved in social communication and foraging behaviors, has been reviewed in [29].
Sevenspot ladybirds are effective predators of aphids and other homopteran pests, and, thus, their foraging behaviors have been extensively studied [30–35]. Some scholars classified the environmental levels of sevenspot ladybirds into prey, patches, and habitats (Figure 3) [33–35], providing a framework for discussing the foraging behaviors of sevenspot ladybirds.
In Figure 3, movement between prey within aggregates of aphids is referred to as intensive search which is slow and sinuous. Movement between aggregates within a patch is referred to as extensive search which is relatively linear and fast. Movement between patches is called dispersal and movement from patches to hibernation is called migration.
Sevenspot ladybirds locate their prey via extensive search and then switch to intensive search after feeding. While searching for its prey, a sevenspot ladybird holds its antennae parallel to its searching substratum and its maxillary palpi perpendicular to the substratum. The ladybird vibrates its maxillary palpi and turns its head from side to side. The sideward vibration can increase the area wherein the prey may be located.
How sevenspot ladybirds decide when to leave a patch for another, also known as dispersal, remains unclear. Several authors suggested that beetles decide to leave when the capture rate falls below a critical value or when the time since the last aphid was captured exceeds a certain threshold [36–38].
3. Proposed SevenSpot Ladybird Optimization (SLO) Algorithm
This section describes the proposed sevenspot ladybird optimization (SLO) algorithm, which simulates the foraging behavior of sevenspot ladybirds to solve multidimensional and multimodal optimization problems. The main steps of the SLO are as follows.
Step 1 (dividing patches). Suppose that the search space (environment) is a D-dimensional space. Each dimension is divided into m subspaces, so the whole D-dimensional space is divided into m^D subspaces (patches).
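Step 1 can be sketched in a few lines of code; the helper below (whose function and variable names are ours, for illustration only) splits a box into m subintervals per dimension, giving m^D axis-aligned patches:

```python
from itertools import product

def divide_patches(lower, upper, m):
    """Split a D-dimensional box [lower, upper] into m subintervals
    per dimension, yielding m**D axis-aligned patches (Step 1)."""
    D = len(lower)
    # Per-dimension breakpoints: m + 1 edges give m subintervals.
    edges = [
        [lower[d] + (upper[d] - lower[d]) * k / m for k in range(m + 1)]
        for d in range(D)
    ]
    patches = []
    for idx in product(range(m), repeat=D):   # one subinterval index per dimension
        lo = [edges[d][idx[d]] for d in range(D)]
        hi = [edges[d][idx[d] + 1] for d in range(D)]
        patches.append((lo, hi))
    return patches

# A 3-dimensional space split in two per dimension gives 2**3 = 8 patches.
patches = divide_patches([-600.0] * 3, [600.0] * 3, m=2)
```

With m = 2, as used in the experiments of Section 4.2.4, this reproduces the 2^D patches described there.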
Step 2 (initializing population). Suppose that each sevenspot ladybird is treated as a point in a D-dimensional patch. The ith ladybird is represented as X_i = (x_{i1}, x_{i2}, ..., x_{iD}), where X_i is a latent solution to the optimization problem.
If n is the number of sevenspot ladybirds initialized with random positions in each patch, then the population size of the sevenspot ladybirds is N = n × m^D.
Step 3 (calculating fitness). For each ladybird, evaluate the optimization fitness within its D-dimensional patch.
Step 4 (choosing the best ladybird). The current fitness of each ladybird is compared with the fitness of its best historical position (pbest). If the current value is better than the previous one, then the pbest value is set to the current value and the pbest position to the current position.
The current best fitness among all the ladybirds in a patch is compared with the fitness of their previous best position in that patch (lbest). If the current value is better than the previous one, then the lbest value is set to the current value and the lbest position to the current position.
The current best fitness among all the ladybirds in the whole population is compared with the fitness of their previous best position (gbest). If the current value is better than the previous one, then the gbest value is set to the current value and the gbest position to the current position.
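This bookkeeping amounts to three comparisons per ladybird. A minimal sketch for a minimization problem, with each “best” held as a (position, fitness) pair (the names pbest, patch_best, and gbest are conventional, not fixed by the paper):

```python
def update_bests(positions, fitnesses, pbest, patch_best, gbest):
    """Step 4 bookkeeping for a minimization problem: keep each
    ladybird's personal best, the best within this patch, and the
    global best across all patches."""
    for i, (x, f) in enumerate(zip(positions, fitnesses)):
        if f < pbest[i][1]:                 # better personal position
            pbest[i] = (list(x), f)
        if f < patch_best[1]:               # better position in this patch
            patch_best = (list(x), f)
        if f < gbest[1]:                    # better position overall
            gbest = (list(x), f)
    return pbest, patch_best, gbest
```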
Step 5 (dispersal). In the SLO, if a position does not improve in a predetermined number of cycles, then a new position is produced in the patch where gbest lies, replacing the abandoned position. The new position is produced near gbest so as to share the information of the best ladybird in the whole population. The value of the predetermined number of cycles (limit) is an important control parameter in the SLO.
If the abandoned position is X_i and the global best position is gbest, then the sevenspot ladybird discovers a new position as follows:
$X_{\text{new}} = g_{\text{best}} + r \cdot \delta$, (1)
where δ is the neighborhood space of gbest and r is a random number between −1 and 1.
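One plausible reading of this dispersal rule, for illustration only (the paper’s exact formula may differ): draw each coordinate of the new position uniformly within a neighborhood of the global best and clip it to the patch bounds. The name `disperse` and the clipping step are our assumptions.

```python
import random

def disperse(gbest_pos, delta, lower, upper):
    """Step 5 sketch: replace an abandoned position with a new one
    drawn in the neighborhood of the global best. delta is the
    neighborhood half-width; a fresh random r in [-1, 1] is drawn
    per dimension."""
    new_pos = []
    for d, g in enumerate(gbest_pos):
        r = random.uniform(-1.0, 1.0)
        x = g + r * delta
        x = max(lower[d], min(upper[d], x))   # keep inside the patch
        new_pos.append(x)
    return new_pos
```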
Step 6 (updating positions). The position of a ladybird is updated according to its previous movement. If a ladybird has just performed an extensive search, then its position is changed as follows:
After an intensive search, a ladybird switches to extensive search, and its position is updated according to the following equations:
In (2) and (4), r1 and r2 are two random numbers uniformly distributed in [0, 1], and the positive constant c is used to adjust the search step and search direction in each iteration. In (3) and (5), the velocity of each ladybird in each dimension is limited to the maximum velocity Vmax, which determines the search precision of the ladybirds in the solution space. If Vmax is too high, the ladybirds may fly past the optimal solution; if Vmax is too low, the ladybirds may fall into a local search region and be unable to carry on the global search. Typically, Vmax is set as follows:
$V_{\max} = \frac{x_{ub} - x_{lb}}{2}$, (6)
where x_ub and x_lb are the upper and lower bounds of each patch, respectively. Equation (6) comes from [39]; we adopt it here to clamp the ladybirds’ velocities in each dimension.
From the equations above, we can see that the velocity updating rule is composed of three parts. The first part, intensive search, is inspired by the slow and sinuous movements of ladybirds. The second part, extensive search, is derived from the relatively linear and fast movement of ladybirds. The third part imitates the sideward vibration of ladybirds, which enlarges the search area where a potential solution may exist. The parameters of the sideward vibration term are usually set as relatively small random numbers.
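The per-dimension velocity clamping described above can be sketched as follows; the factor k = 0.5 (half the patch width) is a common choice in the PSO literature and stands in for the paper’s Eq. (6), whose exact factor we treat as an assumption:

```python
def clamp_velocity(v, lower, upper, k=0.5):
    """Clamp each velocity component to [-Vmax, Vmax], with
    Vmax = k * (upper - lower) per dimension.  k = 0.5 gives
    Vmax equal to half the patch width."""
    out = []
    for vd, lo, hi in zip(v, lower, upper):
        vmax = k * (hi - lo)
        out.append(max(-vmax, min(vmax, vd)))
    return out

# A velocity beyond the cap is pulled back to +/- Vmax; others pass through.
clamped = clamp_velocity([10.0, -0.1], [-5.0, -5.0], [5.0, 5.0])
```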
Step 7 (inspecting the termination condition). If the termination condition is satisfied, that is, if the SLO has reached the maximum iteration number, then the SLO terminates; otherwise, it returns to Step 3.
4. Experiments
4.1. Benchmark Functions
In the field of heuristic computation, it is common to compare different algorithms on a set of test functions. However, the effectiveness of one algorithm against another cannot be measured simply by the number of problems that it solves better [40]. We therefore studied the functions to be optimized beforehand so as to construct a test set with fewer, better-selected functions. We used five classical benchmark functions to compare the performance of the proposed SLO with those of GA, PSO, and ABC. This set is adequate because it includes different kinds of problems: unimodal, multimodal, regular, irregular, separable, nonseparable, and multidimensional. Mathematical descriptions of the benchmark functions were obtained from [6, 22].
The first function is the Griewank function (7), whose value is 0 at its global minimum. The initialization range for the function is $[-600, 600]^D$. The Griewank function has a product term that introduces interdependence among its variables; the aim is to defeat techniques that optimize each variable independently. The optima of the Griewank function are regularly distributed. Since the number of local optima increases with the dimensionality, this function is strongly multimodal. The multimodality disappears for sufficiently high dimensionalities, which makes the problem unimodal. Consider
$f_1(x) = \frac{1}{4000}\sum_{i=1}^{D} x_i^2 - \prod_{i=1}^{D} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$. (7)
The second function is the Rastrigin function (8), whose value is 0 at its global minimum. The initialization range for the function is $[-5.12, 5.12]^D$. The Rastrigin function is based on the Sphere function with the addition of cosine modulation to produce many local minima, making it multimodal. The locations of the minima are regularly distributed. The difficult part about finding optimal solutions to the Rastrigin function is that an optimization algorithm is easily trapped in a local optimum on its way towards the global optimum. Consider
$f_2(x) = \sum_{i=1}^{D} \left[x_i^2 - 10\cos(2\pi x_i) + 10\right]$. (8)
The third function is the Rosenbrock function (9), whose value is 0 at its global minimum. The initialization range for the function is $[-30, 30]^D$. The global optimum lies inside a long, narrow, parabolic-shaped flat valley. This problem is repeatedly used to test the performance of optimization algorithms because it is difficult to converge to the global optimum: the variables are strongly dependent, and the gradients generally do not point towards the optimum. Consider
$f_3(x) = \sum_{i=1}^{D-1} \left[100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2\right]$. (9)
The fourth function is the Ackley function (10), whose value is 0 at its global minimum. The initialization range for the function is $[-32, 32]^D$. The Ackley function has an exponential term that covers its surface with numerous local minima, making its complexity moderate. An algorithm that relies only on gradient steepest descent will be trapped in the local optima, but any search strategy that analyzes a wider region will be able to cross the valleys among the optima and achieve better results. A search strategy must combine the exploratory and exploitative components efficiently to obtain good results for the Ackley function. Consider
$f_4(x) = -20\exp\left(-0.2\sqrt{\frac{1}{D}\sum_{i=1}^{D} x_i^2}\right) - \exp\left(\frac{1}{D}\sum_{i=1}^{D} \cos(2\pi x_i)\right) + 20 + e$. (10)
The fifth function is the Schwefel function (11), whose value is 0 at its global minimum. The initialization range for the function is $[-500, 500]^D$. The surface of the Schwefel function is composed of a large number of peaks and valleys. The function has a second-best minimum far from the global minimum, where many search algorithms become trapped. Moreover, the global minimum is near the bounds of the domain. Consider
$f_5(x) = 418.9829\,D - \sum_{i=1}^{D} x_i \sin\left(\sqrt{|x_i|}\right)$. (11)
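For reference, all five benchmarks are straightforward to implement from their standard definitions (with Schwefel in the shifted form whose minimum value is 0, attained near $x_i \approx 420.9687$):

```python
import math

def griewank(x):
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return s - p + 1.0

def rastrigin(x):
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def rosenbrock(x):
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))

def ackley(x):
    d = len(x)
    a = -20.0 * math.exp(-0.2 * math.sqrt(sum(xi * xi for xi in x) / d))
    b = -math.exp(sum(math.cos(2.0 * math.pi * xi) for xi in x) / d)
    return a + b + 20.0 + math.e

def schwefel(x):
    return 418.9829 * len(x) - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)
```

Each function evaluates to (approximately) 0 at its known global minimum, which makes the recorded best values in Tables 1–5 directly interpretable as distances from the optimum.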
4.2. Settings for Algorithms
The common control parameters of the algorithms are the population size and the maximum number of generations. In the experiments, the maximum number of generations was 750, 1000, and 1500 for dimensions 5, 10, and 30, respectively, and the population size was 50. The other control parameters of the algorithms follow the schemes used in [6]; the control parameter values employed for GA, PSO, and ABC are presented below.
4.2.1. GA Settings
The settings for the GA scheme, taken from [6], are as follows: single-point uniform crossover with a rate of 0.95, a random selection mechanism, Gaussian mutation with a rate of 0.1, and a linear ranking fitness function. A child chromosome is added to the population by using the child production scheme.
4.2.2. PSO Settings
PSO equations can be expressed as follows:
$v_{id} = w\,v_{id} + c_1 r_1 (p_{id} - x_{id}) + c_2 r_2 (p_{gd} - x_{id})$,
$x_{id} = x_{id} + v_{id}$,
where w is the inertia weight, which decreases linearly from 0.9 to 0.4 with the iterations. The learning factors c1 and c2 are set to 2. The upper and lower bounds for the velocity, (Vmin, Vmax), are set to the lower and upper bounds of x; that is, (Vmin, Vmax) = (Xmin, Xmax). If the sum of accelerations would cause the velocity on a dimension to exceed Vmax or Vmin, then the velocity on that dimension is limited to Vmax or Vmin, respectively [6].
4.2.3. ABC Settings
The control parameters of the ABC algorithm are as follows: the maximum number of cycles equals the maximum number of generations, and the colony size equals the population size, that is, 50, as presented in [6]. Onlooker bees made up 50% of the colony, employed bees made up the other 50%, and one bee was selected as the scout. Increasing the number of scouts encourages exploration, whereas increasing the number of onlookers for a food source increases exploitation.
4.2.4. SLO Settings
In SLO, each dimension is divided into two equal parts, and thus 2^{D} patches are generated. In each patch, the initial population of ladybirds is set to 20. The parameter limit is 100 and the neighborhood parameter is 1; that is, after 100 cycles of search, if a position in a patch cannot be improved, then it is abandoned and a new position is produced in the neighborhood of gbest. The parameter c in (2) and (4) decreases linearly from 10 to 2. The parameters of the sideward vibration are set as relatively small random numbers.
4.3. Results and Discussion
In this paper, all the experiments were repeated 30 times with different random seeds. The best and mean function values of the solutions found by the algorithms for the different dimensions were recorded. Tables 1–5 present the mean, best, and standard deviations of the function values obtained using SLO, GA, PSO, and ABC with dimensions 5, 10, and 30. Figures 4–18 show the convergence characteristics in terms of the fitness value of each algorithm for each test function.
According to the best function values obtained using the different algorithms, the SLO can find the global optimum with values close to the theoretical solution and has the same search ability as PSO. Many studies have pointed out that a larger population size and a larger number of generations increase the likelihood of obtaining a global optimum solution; thus, the performance of PSO with a swarm size of 50 is better than that with a swarm size of 20. In our experiments, the SLO with a small population of 20 per patch was able to find the global optimum with values close to the theoretical solution, which indicates that the proposed SLO algorithm can find the best solution with a comparatively small population size. Based on the mean results of all the experiments, the proposed SLO performs better than GA on the Griewank, Ackley, and Schwefel functions. However, when the dimension is 30, the results of the SLO are no better than those of PSO and ABC. Comparing the convergence graphs, the SLO converged faster and performed better than GA.
From the results, we can see that the SLO does not obtain better results as the dimensionality grows. According to the No Free Lunch Theorem [41], if two search algorithms are compared across all possible functions, their performance will be, on average, the same. Consequently, when an algorithm is evaluated, we must look for the kinds of problems on which its performance is good, in order to characterize the type of problems for which the algorithm is suitable [42]. The proposed SLO is suitable for solving optimization problems of lower dimensions.
5. Conclusion
This paper investigated the foraging behaviors of sevenspot ladybirds and proposed a novel biologically inspired metaheuristic algorithm called SLO. The SLO, GA, PSO, and ABC algorithms were tested on five numerical benchmark functions with multimodality to validate the performance of SLO. The simulated results show that SLO has the ability to find the best solution and is suitable for solving optimization problems with lower dimensions. In this paper, the ABC algorithm outperformed all other algorithms, but according to the No Free Lunch Theorem [41], “any elevated performance over one class of problems is offset by performance over another class.” Future studies will focus on improving the SLO.
Acknowledgments
The authors are grateful to the editor and the anonymous referees for their insightful and constructive comments and suggestions, which have been very helpful for improving this paper. This research was supported by the National Natural Science Foundation of China (Grant no. 51375389) and the National High Technology Research and Development Program of China (863 Program) no. 2011AA09A104.
References
[1] D. T. Pham and D. Karaboga, Intelligent Optimisation Techniques, Springer, New York, NY, USA, 2000.
[2] F. Glover and G. A. Kochenberger, Handbook of Metaheuristics, Kluwer Academic, Boston, Mass, USA, 2003.
[3] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Lansing, Mich, USA, 1975.
[4] M. Dorigo, V. Maniezzo, and A. Colorni, “Ant system: optimization by a colony of cooperating agents,” IEEE Transactions on Systems, Man, and Cybernetics B, vol. 26, no. 1, pp. 29–41, 1996.
[5] R. C. Eberhart and J. Kennedy, “A new optimizer using particle swarm theory,” in Proceedings of the 6th International Symposium on Micro Machine and Human Science, pp. 39–43, Nagoya, Japan, October 1995.
[6] D. Karaboga and B. Basturk, “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm,” Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
[7] X. S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.
[8] A. Mucherino and O. Seref, “Monkey search: a novel metaheuristic search for global optimization,” in Proceedings of the Conference on Data Mining, Systems Analysis, and Optimization in Biomedicine, pp. 162–173, Gainesville, Fla, USA, March 2007.
[9] X. S. Yang, “Firefly algorithms for multimodal optimization,” in Stochastic Algorithms: Foundations and Applications, vol. 5792 of Lecture Notes in Computer Science, pp. 169–178, 2009.
[10] A. Ahrari and A. A. Atai, “Grenade explosion method—a novel tool for optimization of multimodal functions,” Applied Soft Computing Journal, vol. 10, no. 4, pp. 1132–1140, 2010.
[11] S. C. Chu, P. W. Tsai, and J. S. Pan, “Cat swarm optimization,” in PRICAI 2006: Trends in Artificial Intelligence, vol. 4099 of Lecture Notes in Computer Science, pp. 854–858, Springer, Guilin, China, 2006.
[12] B. Alatas, “ACROA: artificial chemical reaction optimization algorithm for global optimization,” Expert Systems with Applications, vol. 38, no. 10, pp. 13170–13180, 2011.
[13] M. Kapanoglu and W. A. Miller, “An evolutionary algorithm-based decision support system for managing flexible manufacturing,” Robotics and Computer-Integrated Manufacturing, vol. 20, no. 6, pp. 529–539, 2004.
[14] N. Mansour, H. Tabbara, and T. Dana, “A genetic algorithm approach for regrouping service sites,” Computers and Operations Research, vol. 31, no. 8, pp. 1317–1333, 2004.
[15] Z. Lian, X. Gu, and B. Jiao, “A similar particle swarm optimization algorithm for permutation flowshop scheduling to minimize makespan,” Applied Mathematics and Computation, vol. 175, no. 1, pp. 773–785, 2006.
[16] L. Barcos, V. Rodríguez, M. J. Álvarez, and F. Robusté, “Routing design for less-than-truckload motor carriers using ant colony optimization,” Transportation Research E, vol. 46, no. 3, pp. 367–383, 2010.
[17] G. N. Ramos, Y. Hatakeyama, F. Dong, and K. Hirota, “Hyperbox clustering with ant colony optimization (HACO) method and its application to medical risk profile recognition,” Applied Soft Computing Journal, vol. 9, no. 2, pp. 632–640, 2009.
[18] J. P. Hamiez and J. K. Hao, “Using solution properties within an enumerative search to solve a sports league scheduling problem,” Discrete Applied Mathematics, vol. 156, no. 10, pp. 1683–1693, 2008.
[19] M. Tamer Ayvaz, “Application of harmony search algorithm to the solution of groundwater management models,” Advances in Water Resources, vol. 32, no. 6, pp. 916–924, 2009.
[20] P. Charbonneau, “Genetic algorithms in astronomy and astrophysics,” Astrophysical Journal, vol. 101, no. 2, pp. 309–334, 1995.
[21] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “GSA: a gravitational search algorithm,” Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[22] D. Srinivasan and T. H. Seow, “Particle swarm inspired evolutionary algorithm (PS-EA) for multiobjective optimization problems,” in Proceedings of the Congress on Evolutionary Computation, pp. 2292–2297, Canberra, Australia, 2003.
[23] R. Hassan, B. Cohanim, O. de Weck, and G. Venter, “A comparison of particle swarm optimization and the genetic algorithm,” in Proceedings of the 46th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, pp. 1–13, Austin, Tex, USA, April 2005.
[24] B. Akay and D. Karaboga, “Artificial bee colony algorithm for large-scale problems and engineering design optimization,” Journal of Intelligent Manufacturing, vol. 23, no. 4, pp. 1001–1014, 2012.
[25] I. Hodek, G. Iperti, and M. Hodkova, “Long-distance flights in Coccinellidae (Coleoptera),” The European Journal of Entomology, vol. 90, no. 4, pp. 403–414, 1993.
[26] A. F. G. Dixon, Insect Predator-Prey Dynamics: Ladybird Beetles and Biological Control, Cambridge University Press, New York, NY, USA, 2000.
[27] J. L. Hemptinne, M. Gaudin, A. F. G. Dixon, and G. Lognay, “Social feeding in ladybird beetles: adaptive significance and mechanism,” Chemoecology, vol. 10, no. 3, pp. 149–152, 2000.
[28] J. L. Hemptinne, G. Lognay, C. Gauthier, and A. F. G. Dixon, “Role of surface chemical signals in egg cannibalism and intraguild predation in ladybirds (Coleoptera: Coccinellidae),” Chemoecology, vol. 10, no. 3, pp. 123–128, 2000.
[29] J. Pettersson, V. Ninkovic, R. Glinwood, M. A. Birkett, and J. A. Pickett, “Foraging in a complex environment: semiochemicals support searching behaviour of the seven spot ladybird,” The European Journal of Entomology, vol. 102, no. 3, pp. 365–370, 2005.
[30] V. Ninkovic, S. Al Abassi, and J. Pettersson, “The influence of aphid-induced plant volatiles on ladybird beetle searching behavior,” Biological Control, vol. 21, no. 2, pp. 191–195, 2001.
[31] N. Suzuki and T. Ide, “The foraging behaviors of larvae of the ladybird beetle, Coccinella septempunctata L., (Coleoptera: Coccinellidae) towards ant-tended and non-ant-tended aphids,” Ecological Research, vol. 23, no. 2, pp. 371–378, 2008.
[32] A. Vantaux, O. Roux, A. Magro, and J. Orivel, “Evolutionary perspectives on myrmecophily in ladybirds,” Psyche, vol. 2012, Article ID 591570, 7 pages, 2012.
[33] M. P. Hassell and T. R. E. Southwood, “Foraging strategies of insects,” Annual Review of Ecology and Systematics, vol. 9, pp. 75–98, 1978.
[34] I. Hodek, S. Chakrabarti, and M. Rejmanek, “The effect of prey density on food intake by adult Cheilomenes sulphurea [Col.: Coccinellidae],” BioControl, vol. 29, no. 2, pp. 179–184, 1984.
[35] A. Ferran and A. F. G. Dixon, “Foraging behaviour of ladybird larvae (Coleoptera: Coccinellidae),” The European Journal of Entomology, vol. 90, no. 4, pp. 383–402, 1993.
[36] J. L. Hemptinne, A. F. G. Dixon, and J. Coffin, “Attack strategy of ladybird beetles (Coccinellidae): factors shaping their numerical response,” Oecologia, vol. 90, no. 2, pp. 238–245, 1992.
[37] P. Kindlmann and A. F. G. Dixon, “Optimal foraging in ladybird beetles (Coleoptera: Coccinellidae) and its consequences for their use in biological control,” European Journal of Entomology, vol. 90, no. 4, pp. 443–450, 1993.
[38] N. Minoretti and W. W. Weisser, “The impact of individual ladybirds (Coccinella septempunctata, Coleoptera: Coccinellidae) on aphid colonies,” The European Journal of Entomology, vol. 97, no. 4, pp. 475–479, 2000.
[39] R. C. Eberhart and Y. Shi, “Particle swarm optimization: developments, applications and resources,” in Proceedings of the Congress on Evolutionary Computation, vol. 1, pp. 81–86, Seoul, Republic of Korea, May 2001.
[40] D. Ortiz-Boyer, C. Hervás-Martínez, and N. García-Pedrajas, “CIXL2: a crossover operator for evolutionary algorithms based on population features,” Journal of Artificial Intelligence Research, vol. 24, pp. 1–48, 2005.
[41] D. H. Wolpert and W. G. Macready, “No free lunch theorems for optimization,” IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.
[42] F. Valdez and P. Melin, “Comparative study of particle swarm optimization and genetic algorithms for complex mathematical functions,” Journal of Automation, Mobile Robotics and Intelligent Systems, vol. 2, no. 1, pp. 43–51, 2008.
Copyright
Copyright © 2013 Peng Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.