Applied Computational Intelligence and Soft Computing
Volume 2011 (2011), Article ID 938240, 20 pages
http://dx.doi.org/10.1155/2011/938240
Review Article

Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

1Central Research Institute, Enjoyor Inc., Hangzhou 310030, China
2Faculty of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China
3Department of Electrical and Computer Engineering, Concordia University, Montreal, QC, Canada H3G 1M8

Received 7 March 2011; Revised 6 July 2011; Accepted 4 August 2011

Academic Editor: Miin-Shen Yang

Copyright © 2011 Biaobiao Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
