Mathematical Problems in Engineering
Volume 2015, Article ID 590678, 12 pages
http://dx.doi.org/10.1155/2015/590678
Research Article

A New Ensemble Method with Feature Space Partitioning for High-Dimensional Data Classification

1Database and Bioinformatics Laboratory, College of Electrical and Computer Engineering, Chungbuk National University, Cheongju 362763, Republic of Korea
2Graduate School of Professional Science Master, Chungbuk National University, Cheongju 362763, Republic of Korea
3Department of Computer Science, Namseoul University, Cheonan 331707, Republic of Korea
4School of Electronics & Computer Engineering, Chonnam National University, Gwangju 500757, Republic of Korea

Received 25 November 2014; Accepted 5 January 2015

Academic Editor: Sanghyuk Lee

Copyright © 2015 Yongjun Piao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  36. Y. Lee, Y. J. Jung, K. W. Nam, S. Nittel, K. Beard, and K. H. Ryu, “Geosensor data representation using layered slope grids,” Sensors, vol. 12, no. 12, pp. 17074–17093, 2012. View at Publisher · View at Google Scholar · View at Scopus