The Scientific World Journal
Volume 2014, Article ID 672983, 9 pages
http://dx.doi.org/10.1155/2014/672983
Research Article

Hybrid Biogeography-Based Optimization for Integer Programming

College of Electronics and Information Engineering, Tongji University, Shanghai 201804, China

Received 15 January 2014; Accepted 17 March 2014; Published 3 June 2014

Academic Editors: S. Balochian and Y. Zhang

Copyright © 2014 Zhi-Cheng Wang and Xiao-Bei Wu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Biogeography-based optimization (BBO) is a relatively new bioinspired heuristic for global optimization based on the mathematical models of biogeography. By investigating the applicability and performance of BBO for integer programming, we find that the original BBO algorithm does not perform well on a set of benchmark integer programming problems. Thus we modify the mutation operator and/or the neighborhood structure of the algorithm, resulting in three new BBO-based methods, named BlendBBO, BBO_DE, and LBBO_LDE, respectively. Computational experiments show that these methods are competitive approaches to solving integer programming problems, and LBBO_LDE shows the best performance on the benchmark problems.

1. Introduction

An integer programming problem is a discrete optimization problem where the decision variables are restricted to integer values. In computer science and operations research, a remarkably wide range of problems, such as project scheduling, capital budgeting, goods distribution, and machine scheduling, can be expressed as integer programming problems [1–6]. Integer programming also has applications in bioinspired computational models such as artificial neural networks [7, 8].

The general form of an integer programming model can be stated as

min f(x), subject to x ∈ S ⊆ Z^D, (1)

where x is a D-dimensional integer vector, Z^D is a D-dimensional discrete space of integers, and S is a feasible region that is not necessarily bounded. Any maximization version of an integer programming problem can be easily transformed to a minimization problem.
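As a toy illustration of this general form, the following sketch enumerates a tiny bounded integer region by brute force. The objective and the box are invented purely for illustration; brute force is only viable at this tiny scale, which is why the heuristics discussed below matter.

```python
import itertools

# A toy instance of the general form: minimize f(x) over a small
# bounded integer region S (illustrative objective and bounds).
def f(x):
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2

S = itertools.product(range(-5, 6), repeat=2)  # S = {-5, ..., 5}^2
best = min(S, key=f)
print(best, f(best))
```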

One of the most well-known deterministic approaches for solving integer programming problems is the branch-and-bound algorithm [9]. It uses a "divide-and-conquer" strategy to split the feasible region into subregions, obtaining for each subregion a lower bound by relaxing the integrality constraints and checking whether the relaxed solution is integer-feasible; if so, that solution is optimal for the subregion; otherwise the subregion is recursively split and tackled until all the variables are fixed to integers.

However, integer programming is known to be NP-hard [10], and thus the computational cost of deterministic algorithms increases very rapidly with problem size. In recent years, evolutionary algorithms (EA), which are stochastic search methods inspired by the principles of natural biological evolution, have attracted great attention and have been successfully applied to a wide range of computationally difficult problems. These heuristic algorithms do not guarantee finding the exact optimal solution in a single simulation run, but in most cases they are capable of finding acceptable solutions in a reasonable computational time.

Genetic algorithms (GA) are one of the most popular EA, but the encoding of the integer search space with fixed-length binary strings as used in standard GA is not feasible for integer problems [11]. Many other heuristics, such as evolution strategies (ES) [12], particle swarm optimization (PSO) [13], and differential evolution (DE) [14], were initially proposed for continuous optimization problems. However, they can be adapted to integer programming by embedding the integer space into the real space and truncating or rounding real values to integers, and the applicability and performance of such an approach have been demonstrated by experimental studies.

Kelahan and Gaddy [15] conducted an early study that performs random search in integer spaces in the spirit of a (1+1) evolution strategy; that is, at each iteration a child solution vector is generated by adding a random vector to the parent vector, and the better of the parent and the child is kept for the next generation. Rudolph [11] developed an ES-based algorithm, which uses the principle of maximum entropy to guide the construction of a mutation distribution for arbitrary search spaces.
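The (1+1)-style random search just described can be sketched as follows. The step size, bounds, and objective are illustrative assumptions, not details from [15]:

```python
import random

def one_plus_one_search(f, dim, low, high, iters=2000, step=2, seed=0):
    # Perturb the parent with a random integer vector (clamped to the
    # bounds) and keep the better of parent and child each iteration.
    rng = random.Random(seed)
    parent = [rng.randint(low, high) for _ in range(dim)]
    for _ in range(iters):
        child = [min(high, max(low, x + rng.randint(-step, step)))
                 for x in parent]
        if f(child) <= f(parent):
            parent = child
    return parent

best = one_plus_one_search(lambda x: sum(v * v for v in x), 5, -100, 100)
print(best)
```

Because a worse child is never accepted, the objective value is nonincreasing over the run.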

Laskari et al. [16] studied the ability of PSO for solving integer programming problems. On their test problems, PSO outperforms the branch-and-bound method in terms of number of function evaluations (NFE), and PSO exhibits high success rates even in cases where the branch-and-bound algorithm fails. Improved versions of PSO, including the quantum-behaved PSO which is based on the principle of state superposition and uncertainty [17] and barebones PSO which is based on samples from a normal distribution and requires no parameter tuning [18], have also been applied and shown to be efficient alternatives to integer programming problems.

Omran and Engelbrecht [19] investigated the performance of DE in integer programming. They tested three versions of DE and found that the self-adaptive DE (SDE) requiring no parameter tuning is the most efficient and performs better than PSO.

In this paper, we propose three algorithms for integer programming based on a relatively new bioinspired method, namely, biogeography-based optimization (BBO). We modify the mutation operator of the original BBO to enhance its exploration or global search ability and adopt a local neighborhood structure to avoid premature convergence. Experimental results show that our methods are competitive approaches to solving integer programming problems.

2. Biogeography-Based Optimization

Biogeography is the science of the geographical distribution of biological organisms over space and time. MacArthur and Wilson [20] established the mathematical models of island biogeography, which show that the species richness of an island can be predicted in terms of such factors as habitat area, immigration rate, and extinction rate. Inspired by this, Simon [21] developed the BBO algorithm, where a solution vector is analogous to a habitat, the solution components are analogous to a set of suitability index variables (SIVs), and the solution fitness is analogous to the species richness or habitat suitability index (HSI) of the habitat. Central to the algorithm is the equilibrium theory of island biogeography, which indicates that high HSI habitats have a high species emigration rate and low HSI habitats have a high species immigration rate. For example, in a linear model of species richness (as illustrated in Figure 1), a habitat H_i's immigration rate λ_i and emigration rate μ_i are calculated based on its fitness f(H_i) as follows:

λ_i = I (f_max − f(H_i)) / (f_max − f_min),
μ_i = E (f(H_i) − f_min) / (f_max − f_min), (2)

where f_max and f_min are, respectively, the maximum and minimum fitness values among the population and I and E are, respectively, the maximum possible immigration rate and emigration rate. However, there are other nonlinear mathematical models of biogeography that can be used for calculating the migration rates [22, 23].
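As a sketch of the linear migration model just described (the symbol names and the defaults I = E = 1 are our assumptions), the rates can be computed directly from the population's fitness values:

```python
def migration_rates(fitness, I=1.0, E=1.0):
    # Linear model: the fitter a habitat, the lower its immigration
    # rate and the higher its emigration rate (fitness maximization).
    f_max, f_min = max(fitness), min(fitness)
    span = (f_max - f_min) or 1.0  # guard against a uniform population
    lam = [I * (f_max - f) / span for f in fitness]
    mu = [E * (f - f_min) / span for f in fitness]
    return lam, mu

lam, mu = migration_rates([4.0, 2.0, 1.0])
print(lam, mu)
```

The fittest habitat gets immigration rate 0 and emigration rate E, and the least fit gets the opposite.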

Figure 1: A linear model of emigration and immigration rates of a habitat.

Migration is used to modify habitats by mixing features within the population. BBO also has a mutation operator for changing SIVs within a habitat itself, which can increase the diversity of the population. For each habitat H_i, a species count probability P_i computed from λ_i and μ_i indicates the likelihood that the habitat was expected a priori to exist as a solution for the problem. In this context, very high HSI habitats and very low HSI habitats are both equally improbable, while medium HSI habitats are relatively probable. The mutation rate of habitat H_i is inversely proportional to its probability:

m_i = m_max (1 − P_i / P_max), (3)

where m_max is a control parameter and P_max is the maximum habitat probability in the population.
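Assuming the species count probabilities P_i are already available (they follow from the λ/μ dynamics in [21]), the mutation rates can be sketched as:

```python
def mutation_rates(probs, m_max=0.01):
    # m_i = m_max * (1 - P_i / P_max): the more improbable a habitat,
    # the more likely it is to be mutated.
    p_max = max(probs)
    return [m_max * (1.0 - p / p_max) for p in probs]

m = mutation_rates([0.1, 0.4, 0.1])
print(m)
```

The medium-probability habitat (here the second one) gets mutation rate 0, while the improbable ones get rates close to m_max.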

Algorithm 1 describes the general framework of BBO for a D-dimensional global numerical optimization problem (where l_d and u_d are the lower and upper bounds of the dth dimension, respectively, and rand(a, b) is a function that generates a random value uniformly distributed in [a, b]).

Algorithm 1: The original BBO algorithm.

Typically, in Line 8 we can use a roulette wheel method for selection, the time complexity of which is O(n) for a population of n habitats. It is not difficult to see that the complexity of each iteration of the algorithm is O(n^2 D + n O(f)), where O(f) is the time complexity of computing the fitness function f.
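Under the framework just outlined, a minimal BBO loop for a bounded integer box might look as follows. This is a sketch rather than the paper's exact Algorithm 1: parameter defaults, the minimization-oriented rate assignment, and the absence of elitism are our assumptions.

```python
import random

def bbo_integer(f, dim, low, high, pop_size=20, iters=200,
                I=1.0, E=1.0, m_max=0.01, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(low, high) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(iters):
        fit = [f(h) for h in pop]
        f_max, f_min = max(fit), min(fit)
        span = (f_max - f_min) or 1.0
        # Minimization: a lower objective means a higher HSI, hence a
        # lower immigration rate and a higher emigration rate.
        lam = [I * (fv - f_min) / span for fv in fit]
        mu = [E * (f_max - fv) / span for fv in fit]

        def roulette():  # O(n) roulette-wheel selection on mu
            r = rng.random() * (sum(mu) or 1.0)
            acc = 0.0
            for j, m in enumerate(mu):
                acc += m
                if acc >= r:
                    return j
            return pop_size - 1

        new_pop = []
        for i in range(pop_size):
            h = pop[i][:]
            for d in range(dim):
                if rng.random() < lam[i]:
                    h[d] = pop[roulette()][d]      # immigration of a SIV
                if rng.random() < m_max:
                    h[d] = rng.randint(low, high)  # random mutation
            new_pop.append(h)
        pop = new_pop
    return min(pop, key=f)

best = bbo_integer(lambda x: sum(v * v for v in x), dim=3, low=-10, high=10)
print(best)
```

Since migration copies integer SIVs and mutation draws integers, no rounding is needed in this sketch; rounding only becomes necessary with real-valued mutation operators.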

3. Biogeography-Based Heuristics for Integer Programming

In BBO, the migration operator provides good exploitation ability, while the broader exploration of the search space relies mainly on the mutation operator. Simon [21] suggested that m_max should be set to a small value (about 0.01), which results in low mutation rates. However, when applying BBO to integer programming, we need higher mutation rates to improve the exploration of the search space. According to our experimental studies, when m_max is set to about 0.25~0.3, the BBO algorithm exhibits the best performance on integer programming problems.

Note that the migration operator does not violate the integer constraints, and the rounding of real values to integers is required only after mutations (Line 13 of Algorithm 1). Nevertheless, even with a higher mutation rate, the performance of BBO is far from satisfactory for integer programming. This is mainly because the random mutation operator does not utilize any information from the population to guide the exploration of the search space. In this work, we introduce two other mutation operators to BBO, resulting in three variants of BBO for integer programming.

3.1. A Blended Mutation Operator

In the first variant, namely, BlendBBO, we use a blended mutation operator, which is motivated by the blended crossover operator used by Mühlenbein and Schlierkamp-Voosen [24] in GA and by Ma and Simon [25] in constrained optimization. In our approach, if a component H_{i,d} of vector H_i is subject to mutation, we first select another vector H_j (with probability based on the migration rates) and then use the following equation to work out the new value of the component:

H_{i,d} = round(α H_{i,d} + (1 − α) H_{j,d}), (4)

where α is a random value uniformly distributed in [0, 1]. Note that if the dth dimension of the search space has a bound, (4) will never result in a value outside the bound.
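A minimal sketch of such a blended component update, assuming a convex combination rounded to the nearest integer (in practice α would be drawn per mutation, e.g. via random.random()):

```python
def blend(x_i, x_j, alpha):
    # Convex combination of two integer components, rounded back to an
    # integer; with alpha in [0, 1] the result lies between x_i and x_j,
    # so it cannot leave bounds that both parents satisfy.
    return round(alpha * x_i + (1 - alpha) * x_j)

print(blend(2, 10, 0.5))  # → 6
```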

Moreover, we employ an elitism mechanism in the solution update (as used in ES [12, 26]): the migration operator always generates a new vector H_i' for each existing vector H_i (rather than directly changing H_i); if H_i' is better than H_i, no mutation is applied and H_i' directly enters the next generation; otherwise the mutation operator is applied to H_i'. This not only decreases the required NFE but also increases the convergence speed of the algorithm. The algorithm flow of BBO with the blended mutation is presented in Algorithm 2.

Algorithm 2: The BBO with the blended mutation for integer programming.

3.2. DE Mutation Operator

The second variant, namely, BBO_DE, replaces the random mutation operator with the mutation operator of DE, which mutates a vector component by adding the weighted difference between the corresponding components of two randomly selected vectors to a third one:

H_{i,d} = H_{r1,d} + F (H_{r2,d} − H_{r3,d}), (5)

where r1, r2, and r3 are three unique randomly selected habitat indices that are different from i, and F is a constant scaling coefficient.
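A DE/rand/1-style integer mutation in the spirit of (5) can be sketched as follows; the rounding step that restores integrality is our assumption, consistent with the paper's embed-and-round approach:

```python
import random

def de_rand_1(pop, i, F=0.5, rng=random):
    # v_d = H_r1,d + F * (H_r2,d - H_r3,d), with r1, r2, r3 mutually
    # distinct and different from i; rounded to keep integer values.
    r1, r2, r3 = rng.sample([j for j in range(len(pop)) if j != i], 3)
    return [round(pop[r1][d] + F * (pop[r2][d] - pop[r3][d]))
            for d in range(len(pop[i]))]

pop = [[0, 0], [2, 4], [6, -2], [8, 10]]
print(de_rand_1(pop, 0, rng=random.Random(1)))
```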

DE is well known for its good exploration ability, and the combination of BBO migration and DE mutation achieves a good balance between exploitation and exploration. BBO_DE also uses our new solution update mechanism described above. Therefore, the algorithm flow of BBO_DE simply replaces Lines 15 and 16 of Algorithm 2 with the DE mutation operation described by (5).

3.3. Local Topology of the Population

The original BBO uses a fully connected topology; that is, all the individual solutions are directly connected in the population and can migrate with each other. But such a global topology is computationally intensive and is prone to premature convergence. To overcome this problem, our third variant replaces the global topology with a local one. One of the simplest local topologies is the ring topology, where each individual is directly connected to two other individuals [27, 28]. But here we employ a more generalized local topology, the random topology, where each individual has K immediate neighbors that are randomly selected from the population, K being a control parameter [28].

Consequently, whenever an individual vector is to be immigrated, the emigrating vector is chosen from its neighbors rather than from the whole population, based on the migration rates. The neighborhood structure can be saved in an n × n matrix M: if two habitats H_i and H_j are directly connected then M(i, j) = 1; otherwise M(i, j) = 0. It is easy to see that the complexity of each iteration of the algorithm is O(nKD + n O(f)).
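One possible way to build such a random-topology matrix M; treating links as undirected is our assumption, so a habitat may end up with more than K neighbors:

```python
import random

def random_topology(n, K, rng=random):
    # n-by-n 0/1 neighborhood matrix: each habitat draws K neighbors at
    # random; links are recorded symmetrically (undirected).
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in rng.sample([x for x in range(n) if x != i], K):
            M[i][j] = M[j][i] = 1
    return M

M = random_topology(6, 3, random.Random(42))
```

Resetting the topology (as done when the best solution stagnates) simply means calling the builder again.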

Storn and Price [14] have proposed several different strategies for DE mutation. The scheme of (5) is denoted as DE/rand/1. Another scheme is named DE/best/1, which always chooses the best individual of the population as H_{r1} in (5). Omran et al. [29] extended it to the DE/lbest/1 scheme, which uses a ring topology and always chooses the best neighbor of the vector to be mutated.

In our approach, BBO migration and DE mutation share the same local random topology. That is, each individual has K neighbors, and each time an H_{r1} is chosen from those neighbors, with probability based on the migration rates, to participate in the mutation.

Moreover, if the current best solution has not been improved after a predefined number of consecutive generations, we reset the neighborhood structure randomly.

The third variant is named LBBO_LDE, and it also uses the same solution update mechanism as the previous two variants.

4. Computational Experiments

We test the three variants of BBO on a set of integer programming benchmark problems, which are taken from [16, 30, 31] and frequently encountered in the relevant literature. The details of the benchmark problems are described in the Appendix. For comparison, we also implement the basic BBO, DE, and SDE [19] for integer programming. The branch-and-bound method is not included in the comparison, because it has been shown that DE outperforms branch-and-bound on most test problems [16, 19].

For all six algorithms, we use the same population size and run each of them on every problem 40 times with different random seeds. The migration control parameters are set to the same values for BBO, BlendBBO, BBO_DE, and LBBO_LDE, and the mutation control parameter m_max is set to 0.01 for BBO and 0.25 for BlendBBO (BBO_DE and LBBO_LDE do not use this parameter). Empirically, the neighborhood size K and the threshold of nonimprovement generations are both set to 3 for LBBO_LDE. The other parameters of DE and SDE are set as suggested in [14, 32].
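The evaluation protocol (repeated seeded runs, success rate against a known optimum) can be sketched as follows; `optimizer` stands for any hypothetical solver that accepts a seed, and the tolerance is our assumption:

```python
def success_rate(optimizer, f, f_opt, runs=40, tol=1e-6):
    # Fraction of independent runs whose result reaches the known
    # optimal objective value f_opt within a tolerance.
    hits = 0
    for seed in range(runs):
        x = optimizer(f, seed=seed)
        if abs(f(x) - f_opt) <= tol:
            hits += 1
    return hits / runs

# Trivial stand-in solver that always returns the origin.
sr = success_rate(lambda f, seed: [0, 0],
                  lambda x: sum(v * v for v in x), 0.0)
print(sr)  # → 1.0
```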

The first two benchmark problems are high-dimensional. We consider the first problem in 10 and 30 dimensions. Table 1 presents the success rates (SR) and required NFE of the algorithms to achieve the optimum in 10 dimensions, and Figure 2 presents the corresponding convergence curves. As we can see, the original BBO fails to solve the problem, and the SR of BlendBBO is only 20%. The four algorithms utilizing the DE mutation operator can guarantee the optimal result on the 10-dimensional problem; among them, LBBO_LDE shows the best performance, the other three have similar performance, and the result of BBO_DE is slightly better than those of DE and SDE.

Table 1: SR and required NFE of the algorithms on the first benchmark problem in 10 dimensions.
Figure 2: The convergence curves of the algorithms on the first benchmark problem in 10 dimensions, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.

Table 2 and Figure 3 present, respectively, the results and the convergence curves of the algorithms on the first problem in 30 dimensions. On this high-dimensional problem, BBO, BlendBBO, and DE all fail to obtain the optimum; SDE and BBO_DE have SRs of 85% and 90%, respectively; and only our LBBO_LDE can always guarantee the optimum.

Table 2: SR and required NFE of the algorithms on the first benchmark problem in 30 dimensions.
Figure 3: The convergence curves of the algorithms on the first benchmark problem in 30 dimensions, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.

From the convergence curves we can also see that the BBO algorithm converges very fast at the early stage, but its performance thereafter deteriorates because it is ineffective at exploring other potentially promising areas of the search space. By incorporating the DE mutation operator, our hybrid BBO methods inherit the fast convergence speed of BBO while taking advantage of the exploration ability of DE.

We consider the second problem in 5 and 15 dimensions; the experimental results are presented in Tables 3 and 4, and the convergence curves in Figures 4 and 5, respectively. The results are similar to those of the first problem: on the low-dimensional version, SDE, BBO_DE, and LBBO_LDE are efficient; on the high-dimensional version, only LBBO_LDE can guarantee the optimum. The performance of LBBO_LDE is the best while that of BBO is the worst; SDE performs better than DE, BBO_DE performs slightly better than SDE, and BlendBBO outperforms BBO but is worse than the algorithms with the DE mutation operator.

Table 3: SR and required NFE of the algorithms on the second benchmark problem in 5 dimensions.
Table 4: SR and required NFE of the algorithms on the second benchmark problem in 15 dimensions.
Figure 4: The convergence curves of the algorithms on the second benchmark problem in 5 dimensions, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.
Figure 5: The convergence curves of the algorithms on the second benchmark problem in 15 dimensions, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.

The third problem is a 5-dimensional problem more difficult than the previous two. As the results in Table 5 show, BBO and BlendBBO always fail on this problem, while DE, SDE, and LBBO_LDE can guarantee the optimum. The required NFE of DE is slightly lower than those of SDE and LBBO_LDE, but LBBO_LDE converges faster than DE, as shown in Figure 6.

Table 5: SR and required NFE of the algorithms on the third benchmark problem.
Figure 6: The convergence curves of the algorithms on the third benchmark problem, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.

The fourth problem is relatively easy: even the worst performer, BBO, has an SR of 75%, and all the other algorithms can guarantee the optimum. LBBO_LDE is the best in terms of both NFE and convergence speed, as shown in Table 6 and Figure 7.

Table 6: SR and required NFE of the algorithms on the fourth benchmark problem.
Figure 7: The convergence curves of the algorithms on the fourth benchmark problem, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.

The remaining three test problems are also relatively easy. The experimental results are presented in Tables 7, 8, and 9, and the convergence curves are shown in Figures 8, 9, and 10, respectively. As we can clearly see, the four algorithms with the DE mutation operator always obtain the optima on these problems, and LBBO_LDE shows the best performance.

Table 7: SR and required NFE of the algorithms on the fifth benchmark problem.
Table 8: SR and required NFE of the algorithms on the sixth benchmark problem.
Table 9: SR and required NFE of the algorithms on the seventh benchmark problem.
Figure 8: The convergence curves of the algorithms on the fifth benchmark problem, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.
Figure 9: The convergence curves of the algorithms on the sixth benchmark problem, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.
Figure 10: The convergence curves of the algorithms on the seventh benchmark problem, where the vertical axis represents the objective value and the horizontal axis represents the number of generations.

In summary, our LBBO_LDE outperforms the other algorithms on all of the test problems. Generally speaking, the original BBO converges fast at first, but it is easily trapped in local optima. BlendBBO alleviates this problem to a certain degree, but the DE mutation operator is more effective than the blended mutation operator, as demonstrated by our experimental results. By combining BBO and DE, the BBO_DE algorithm provides an efficient alternative to popular methods such as SDE. The local topology used in LBBO_LDE further improves the search ability and suppresses premature convergence, especially on high-dimensional problems where the performance of DE and SDE deteriorates quickly. Therefore, LBBO_LDE is a very competitive heuristic method for solving integer programming problems.

5. Conclusion

In this paper we develop three algorithms for integer programming based on the BBO heuristic. BlendBBO uses a blended mutation operator, BBO_DE integrates the DE mutation operator, and LBBO_LDE further uses a local neighborhood structure for selecting individuals for migration and mutation. Experimental results show that LBBO_LDE has the best performance on a set of benchmark integer programming problems.

In general, the LBBO_LDE algorithm with a local neighborhood size of 3~5 is efficient on the test problems, but no single value provides the best performance on all of them. Currently we are studying a mechanism that dynamically adjusts the neighborhood size as well as other control parameters according to the search state [33]. Moreover, the test problems considered in this paper only have bounds on the decision variables and do not include other constraints; we are extending the proposed approach to more complex constrained optimization problems, including multiobjective ones [34–36]. We also believe that our approach can be adapted to effectively handle other kinds of combinatorial optimization problems, such as 0-1 integer programming and permutation-based optimization.

Appendix

Integer Programming Benchmark Problems

In all of the above problems, the ranges of the variables are set to the same bounded interval.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

References

  1. S. Sofianopoulou, "The process allocation problem: a survey of the application of graph-theoretic and integer programming approaches," Journal of the Operational Research Society, vol. 43, no. 5, pp. 407–413, 1992.
  2. J. M. Whitacre, "Recent trends indicate rapid growth of nature-inspired optimization in academia and industry," Computing, vol. 93, no. 2-4, pp. 121–133, 2011.
  3. Y. Zheng and J. Xue, "A problem reduction based approach to discrete optimization algorithm design," Computing, vol. 88, no. 1-2, pp. 31–54, 2010.
  4. Y. J. Zheng, S. Y. Chen, Y. Lin, and W. L. Wang, "Bio-inspired optimization of sustainable energy systems: a review," Mathematical Problems in Engineering, vol. 2013, Article ID 354523, 12 pages, 2013.
  5. Y. J. Zheng, H. F. Ling, H. H. Shi, H. S. Chen, and S. Y. Chen, "Emergency railway wagon scheduling by hybrid biogeography-based optimization," Computers & Operations Research, vol. 43, no. 3, pp. 1–8, 2014.
  6. Y. J. Zheng, H. F. Ling, and J. Y. Xue, "Ecogeography-based optimization: enhancing biogeography-based optimization with ecogeographic barriers and differentiations," Computers & Operations Research, vol. 50, pp. 115–127, 2014.
  7. V. Dua, "A mixed-integer programming approach for optimal configuration of artificial neural networks," Chemical Engineering Research and Design, vol. 88, no. 1, pp. 55–60, 2010.
  8. Y. Tan, J. Wang, and J. M. Zurada, "Nonlinear blind source separation using a radial basis function network," IEEE Transactions on Neural Networks, vol. 12, no. 1, pp. 124–134, 2001.
  9. G. Nemhauser and L. Wolsey, Integer and Combinatorial Optimization, John Wiley & Sons, 1988.
  10. M. R. Garey, D. S. Johnson, and R. Sethi, "The complexity of flowshop and jobshop scheduling," Mathematics of Operations Research, vol. 1, no. 2, pp. 117–129, 1976.
  11. G. Rudolph, "An evolutionary algorithm for integer programming," in Parallel Problem Solving from Nature, Y. Davidor, H. P. Schwefel, and R. Männer, Eds., vol. 866 of Lecture Notes in Computer Science, pp. 139–148, Springer, Berlin, Germany, 1994.
  12. H. G. Beyer and H. P. Schwefel, "Evolution strategies—a comprehensive introduction," Natural Computing, vol. 1, no. 1, pp. 3–52, 2002.
  13. J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, December 1995.
  14. R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  15. R. Kelahan and J. Gaddy, "Application of the adaptive random search to discrete and mixed integer optimization," International Journal for Numerical Methods in Engineering, vol. 12, pp. 289–298, 1978.
  16. E. C. Laskari, K. E. Parsopoulos, and M. N. Vrahatis, "Particle swarm optimization for integer programming," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2, pp. 1582–1587, 2002.
  17. J. Liu, J. Sun, and W. Xu, "Quantum-behaved particle swarm optimization for integer programming," in Neural Information Processing, I. King, J. Wang, L. W. Chan, and D. Wang, Eds., vol. 4233 of Lecture Notes in Computer Science, pp. 1042–1050, Springer, Berlin, Germany, 2006.
  18. M. G. H. Omran, A. Engelbrecht, and A. Salman, "Barebones particle swarm for integer programming problems," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '07), pp. 170–175, April 2007.
  19. M. G. H. Omran and A. P. Engelbrecht, "Differential evolution for integer programming problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '07), pp. 2237–2242, September 2007.
  20. R. MacArthur and E. Wilson, The Theory of Island Biogeography, Princeton University Press, Princeton, NJ, USA, 1967.
  21. D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
  22. H. Ma, "An analysis of the equilibrium of migration models for biogeography-based optimization," Information Sciences, vol. 180, no. 18, pp. 3444–3464, 2010.
  23. H. Ma and D. Simon, "Analysis of migration models of biogeography-based optimization using Markov theory," Engineering Applications of Artificial Intelligence, vol. 24, no. 6, pp. 1052–1060, 2011.
  24. H. Mühlenbein and D. Schlierkamp-Voosen, "Predictive models for the breeder genetic algorithm I. Continuous parameter optimization," Evolutionary Computation, vol. 1, no. 1, pp. 25–49, 1993.
  25. H. Ma and D. Simon, "Blended biogeography-based optimization for constrained optimization," Engineering Applications of Artificial Intelligence, vol. 24, no. 3, pp. 517–525, 2011.
  26. D. Du, D. Simon, and M. Ergezer, "Biogeography-based optimization combined with evolutionary strategy and immigration refusal," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (SMC '09), pp. 997–1002, October 2009.
  27. X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
  28. Y. J. Zheng, H. F. Ling, X. B. Wu, and J. Y. Xue, "Localized biogeography-based optimization," Soft Computing, 2014.
  29. M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using neighborhood topologies with differential evolution," in Proceedings of the Workshop of the International Conference on Computational Intelligence and Security, pp. 35–40, 2005.
  30. A. Glankwahmdee, J. S. Liebman, and G. L. Hogg, "Unconstrained discrete nonlinear programming," Engineering Optimization, vol. 4, no. 2, pp. 95–107, 1979.
  31. S. Rao, Engineering Optimization—Theory and Practice, Wiley Eastern, New Delhi, India, 1996.
  32. M. Omran, A. Salman, and A. Engelbrecht, "Self-adaptive differential evolution," in Computational Intelligence and Security, Y. Hao, J. Liu, Y. Wang et al., Eds., vol. 3801 of Lecture Notes in Computer Science, pp. 192–199, Springer, Berlin, Germany, 2005.
  33. Y. J. Zheng, H. F. Ling, and Q. Guan, "Adaptive parameters for a modified comprehensive learning particle swarm optimizer," Mathematical Problems in Engineering, vol. 2012, Article ID 207318, 11 pages, 2012.
  34. Y. J. Zheng, Q. Song, and S. Y. Chen, "Multiobjective fireworks optimization for variable-rate fertilization in oil crop production," Applied Soft Computing, vol. 13, no. 11, pp. 4253–4263, 2013.
  35. Y. J. Zheng and S. Y. Chen, "Cooperative particle swarm optimization for multiobjective transportation planning," Applied Intelligence, vol. 39, no. 1, pp. 202–216, 2013.
  36. Y. Zheng, H. Ling, J. Xue, and S. Chen, "Population classification in fire evacuation: a multiobjective particle swarm optimization approach," IEEE Transactions on Evolutionary Computation, vol. 18, no. 1, pp. 70–81, 2014.