The Scientific World Journal
Volume 2014, Article ID 313164, 9 pages
Research Article

Global Optimization Ensemble Model for Classification Methods

Department of Computer Engineering, College of Electrical & Mechanical Engineering (E&ME), National University of Sciences and Technology (NUST), H-12, Islamabad 46000, Pakistan

Received 24 February 2014; Accepted 19 March 2014; Published 27 April 2014

Academic Editors: N. Barsoum, V. N. Dieu, P. Vasant, and G.-W. Weber

Copyright © 2014 Hina Anwar et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.