Journal of Applied Mathematics
Volume 2013 (2013), Article ID 590614, 18 pages
http://dx.doi.org/10.1155/2013/590614
Research Article

Selecting Optimal Feature Set in High-Dimensional Data by Swarm Search

1Department of Computer and Information Science, University of Macau, Macau
2Faculty of Science and Technology, Middlesex University, UK
3Department of Computer Science and Engineering, Cambridge Institute of Technology, Ranchi, India

Received 22 July 2013; Revised 1 October 2013; Accepted 8 October 2013

Academic Editor: Zong Woo Geem

Copyright © 2013 Simon Fong et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
