Research Article | Open Access
Viewing the Problem from Different Angles: A New Diversity Measure Based on Angular Distances
It is commonly believed that diversity is crucial for an evolutionary system to succeed, especially when the problem to be solved contains local optima from which the population cannot easily escape. Numerous methods exist to measure population diversity, but none has been shown to be consistently useful. In this paper, a new diversity measure is introduced, and it is shown that high diversity according to this measure leads to more successful evolution in most of the cases considered.
Copyright © 2009 Henrik Berg. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.