ISRN Signal Processing
Volume 2012 (2012), Article ID 740761, 8 pages
http://dx.doi.org/10.5402/2012/740761
Research Article

Online Boosting Algorithm Based on Two-Phase SVM Training

Vsevolod Yugov and Itsuo Kumazawa

1Department of Information Processing, Tokyo Institute of Technology, Tokyo 152-8550, Japan
2Imaging Science and Engineering Laboratory, Tokyo Institute of Technology, Tokyo 152-8550, Japan

Received 18 May 2012; Accepted 25 June 2012

Academic Editors: G. Camps-Valls and B. Yuan

Copyright © 2012 Vsevolod Yugov and Itsuo Kumazawa. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We describe and analyze a simple and effective two-phase online boosting algorithm that allows us to exploit highly efficient gradient descent-based methods developed for online SVM training, without the need to fine-tune kernel parameters, and we demonstrate its efficiency in several experiments. Our method resembles AdaBoost in that it trains additional classifiers according to the weights provided by previously trained classifiers, but unlike AdaBoost, it uses the hinge loss rather than the exponential loss and adapts the algorithm to the online setting, allowing for a varying number of classifiers. We show that our theoretical convergence bounds are similar to those of earlier algorithms, while allowing for greater flexibility. Our approach can also easily incorporate additional nonlinearity in the form of Mercer kernels, although our experiments show that this is unnecessary in most situations. Pre-training the additional classifiers in our algorithm allows for greater accuracy while reducing the training times associated with the usual kernel-based approaches. We compare our algorithm with other online training algorithms and show that, in most cases with unknown kernel parameters, it outperforms them in both runtime and convergence speed.