Advances in Artificial Neural Systems
Volume 2011 (2011), Article ID 302572, 6 pages
http://dx.doi.org/10.1155/2011/302572
Research Article

Cross-Validation, Bootstrap, and Support Vector Machines

Masaaki Tsujitani and Yusuke Tanaka

1Division of Informatics and Computer Sciences, Graduate School of Engineering, Osaka Electro-Communication University, Osaka 572-8530, Japan
2Biometrics Department, Statistics Analysis Division, EPS Co., Ltd., 3-4-30 Miyahara, Yodogawa-ku, Osaka 532-0003, Japan

Received 2 April 2011; Accepted 7 June 2011

Academic Editor: Tomasz G. Smolinski

Copyright © 2011 Masaaki Tsujitani and Yusuke Tanaka. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Linked References

  1. C. M. Bishop, Pattern Regression and Machine Learning, Springer, New York, NY, USA, 2006.
  2. N. Cristianini and J. Shawe-Tylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Method, Cambridge University Press, Cambridge, UK, 2000.
  3. C.-W. Hsu, C.-C. Chung, and C.-J. Lin, “A practical guide to support vector classification,” 2009, http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf.
  4. P. Zhang, “Model selection via multifold cross validation,” Annals of Statistics, vol. 21, pp. 299–313, 1993.
  5. B. Efron and R .J. Tibshirani, An Introduction to the Bootstrap, Chapman & Hall, New York, NY, USA, 1993.
  6. M. Tsujitani and T. Koshimizu, “Neural discriminant analysis,” IEEE Transactions on Neural Networks, vol. 11, no. 6, pp. 1394–1401, 2000. View at Scopus
  7. M. Tsujitani and M. Aoki, “Neural regression model, resampling and diagnosis,” Systems and Computers in Japan, vol. 37, no. 6, pp. 13–20, 2006. View at Publisher · View at Google Scholar · View at Scopus
  8. M. Tsujitani and M. Sakon, “Analysis of survival data having time-dependent covariates,” IEEE Transactions on Neural Networks, vol. 20, no. 3, pp. 389–394, 2009. View at Publisher · View at Google Scholar · View at PubMed · View at Scopus
  9. G. Gong, “Cross-validation, the jackknife, and the bootstrap: excess error estimation in forward logistic regression,” Journal of the American Statistical Association, vol. 81, pp. 108–113, 1986.
  10. T.-F. Wu, C.-J. Lin, and R. C. Weng, “Probability estimates for multi-class classification by pairwise coupling,” Journal of Machine Learning Research, vol. 5, pp. 975–1005, 2004.
  11. C. W. Hs and C. J. Lin, “A comparison of methods for multi-class support vector machines,” IEEE Transactions on Neural Networks, vol. 13, pp. 415–425, 2002.
  12. C.-C. Chang and C.-J. Lin, “LIBSVM: a library for support vector machines,” 2001, http://www.csie.ntu.edu.tw/~cjlin/libsvm.
  13. J. Platt, “Probabilistic outputs for support vector machines and comparison to regularized likelihood methods,” in Advances in Large Margin Classifiers, A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, Eds., MIT Press, Cambridge, Mass, USA, 2000.
  14. A. Karatzoglou, D. Meyer, and K. Hornik, “Support vector machines in R,” Journal of Statistical Software, vol. 15, no. 9, pp. 1–28, 2006. View at Scopus
  15. H. T. Lin, C. J. Lin, and R. C. Weng, “A note on Platt's probabilistic outputs for support vector machines,” Machine Learning, vol. 68, no. 3, pp. 267–276, 2007. View at Publisher · View at Google Scholar
  16. H. Akaike, “Information theory and an extension of the maximum likelihood principle,” in Proceedings of the 2nd International Symposium on Information Theory, B. N. Petrov and F. Csaki, Eds., pp. 267–281, Akademia Kaido, Budapest, Hungary, 1973.
  17. M. Ishiguro, Y. Sakamoto, and G. Kitagawa, “Bootstrapping log likelihood and EIC, an extension of AIC,” Annals of the Institute of Statistical Mathematics, vol. 49, no. 3, pp. 411–434, 1996. View at Scopus
  18. R. Shibata, “Bootstrap estimate of Kullback-Leibler information for model selection,” Statistica Sinica, vol. 7, no. 2, pp. 375–394, 1997. View at Scopus
  19. D. Collett, Modeling Binary Data, Chapman & Hall, New York, NY, USA, 2nd edition, 2003.
  20. J. M. Landwehr, D. Pregibon, and A. C. Shoemaker, “Graphical methods for assessing logistic regression models,” Journal of the American Statistical Association, vol. 79, pp. 61–71, 1984.
  21. T. J. Hastie and R. J. Tibshirani, “Classification by pairwise coupling,” Annals of Statistics, vol. 26, no. 2, pp. 451–471, 1998.
  22. E. J. Bredensteiner and K. P. Bennett, “Multicategory classification by support vector machines,” Computational Optimization and Applications, vol. 12, pp. 53–79, 1999.
  23. C. W. Hsu and C. J. Lin, “A formal analysis of stopping criteria of decomposition methods for support vector machines,” IEEE Transactions on Neural Networks, vol. 13, no. 5, pp. 1045–1052, 2002. View at Publisher · View at Google Scholar · View at PubMed
  24. S. N. Wood, Generalized Additive Models an Introduction with R, Chapman & Hall, New York, NY, USA, 2006.
  25. A. Albert, Multivariate Interpretation of Clinical Laboratory Data, Marcel Dekker, New York, NY, USA, 1992..
  26. A. Albert and E. Lesaffre, “Multiple group logistic discrimination,” Computers and Mathematics with Applications, vol. 12, no. 2, pp. 209–224, 1986.
  27. E. Lesaffre and A. Albert, “Multiple-group logistic regression diagnosis,” Computers and Mathematics with Applications, vol. 38, pp. 425–440, 1989.
  28. Y. Wang, G. Wahba, C. Gu, R. Klein, and B. Klein, “Using smoothing spline anova to examine the relation of risk factors to the incidence and progression of diabetic retinopathy,” Statistics in Medicine, vol. 16, no. 12, pp. 1357–1376, 1997. View at Publisher · View at Google Scholar