Journal of Electrical and Computer Engineering
Volume 2015, Article ID 835357, 17 pages
http://dx.doi.org/10.1155/2015/835357
Research Article

Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees

1Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama, Kanagawa 226-8503, Japan
2Imaging Science and Engineering Laboratory, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama, Kanagawa 226-8503, Japan

Received 13 November 2014; Accepted 21 January 2015

Academic Editor: Sos Agaian

Copyright © 2015 Shuqiong Wu and Hiroshi Nagahashi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
