Mathematical Problems in Engineering
Volume 2013 (2013), Article ID 602341, 10 pages
http://dx.doi.org/10.1155/2013/602341
Research Article

A Novel Sparse Least Squares Support Vector Machines

1 School of Mechanical and Electrical Engineering, Jiaxing University, Jiaxing 314001, China
2 School of Engineering, Zhejiang Normal University, Jinhua 321004, China
3 School of Electronics, Electrical Engineering and Computer Science, Queen's University Belfast, Belfast BT9 5AH, UK

Received 9 August 2012; Accepted 6 December 2012

Academic Editor: Huaguang Zhang

Copyright © 2013 Xiao-Lei Xia et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. J. A. K. Suykens, T. V. Gestel, J. Vandewalle, and B. D. Moor, “A support vector machine formulation to PCA analysis and its kernel version,” IEEE Transactions on Neural Networks, vol. 14, no. 2, pp. 447–450, 2003.
  2. J. A. K. Suykens and J. Vandewalle, “Least squares support vector machine classifiers,” Neural Processing Letters, vol. 9, no. 3, pp. 293–300, 1999.
  3. V. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, NY, USA, 1995.
  4. V. Vapnik, Statistical Learning Theory, John Wiley & Sons, New York, NY, USA, 1998.
  5. J. A. K. Suykens, L. Lukas, P. V. Dooren, B. D. Moor, and J. Vandewalle, “Least squares support vector machine classifiers: a large scale algorithm,” in Proceedings of the European Conference on Circuit Theory and Design (ECCTD '99), pp. 839–842, Stresa, Italy, September 1999.
  6. T. V. Gestel, J. A. K. Suykens, B. Baesens et al., “Benchmarking least squares support vector machine classifiers,” Machine Learning, vol. 54, no. 1, pp. 5–32, 2004.
  7. W. Chu, C. J. Ong, and S. S. Keerthi, “An improved conjugate gradient scheme to the solution of least squares SVM,” IEEE Transactions on Neural Networks, vol. 16, no. 2, pp. 498–501, 2005.
  8. S. S. Keerthi, S. K. Shevade, C. Bhattacharyya, and K. R. K. Murthy, “Improvements to Platt's SMO algorithm for SVM classifier design,” Neural Computation, vol. 13, no. 3, pp. 637–649, 2001.
  9. J. A. K. Suykens, J. de Brabanter, L. Lukas, and J. Vandewalle, “Weighted least squares support vector machines: robustness and sparse approximation,” Neurocomputing, vol. 48, no. 1, pp. 85–105, 2002.
  10. B. J. de Kruif and T. J. A. de Vries, “Pruning error minimization in least squares support vector machines,” IEEE Transactions on Neural Networks, vol. 14, no. 3, pp. 696–702, 2003.
  11. X. Zeng and X. W. Chen, “SMO-based pruning methods for sparse least squares support vector machines,” IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1541–1546, 2005.
  12. L. Jiao, L. Bo, and L. Wang, “Fast sparse approximation for least squares support vector machine,” IEEE Transactions on Neural Networks, vol. 18, no. 3, pp. 685–697, 2007.
  13. K. Li, J. X. Peng, and G. W. Irwin, “A fast nonlinear model identification method,” IEEE Transactions on Automatic Control, vol. 50, no. 8, pp. 1211–1216, 2005.
  14. C. J. C. Burges, “A tutorial on support vector machines for pattern recognition,” Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121–167, 1998.
  15. N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods, Cambridge University Press, New York, NY, USA, 2000.
  16. R. Fletcher, Practical Methods of Optimization, John Wiley & Sons, New York, NY, USA, 1987.
  17. C. Saunders, A. Gammerman, and V. Vovk, “Ridge regression learning algorithm in dual variables,” in Proceedings of the 15th International Conference on Machine Learning (ICML '98), pp. 515–521, Morgan Kaufmann, 1998.
  18. S. Mika, G. Rätsch, and K.-R. Müller, “A mathematical programming approach to the kernel Fisher algorithm,” in Advances in Neural Information Processing Systems, pp. 591–597, 2001.
  19. T. Gestel, J. A. K. Suykens, G. Lanckriet, A. Lambrechts, B. Moor, and J. Vandewalle, “Bayesian framework for least-squares support vector machine classifiers, Gaussian processes, and kernel Fisher discriminant analysis,” Neural Computation, vol. 14, no. 5, pp. 1115–1147, 2002.
  20. X. Xia, K. Li, and G. Irwin, “Improved training of an optimal sparse least squares support vector machine,” in Proceedings of the 17th World Congress of the International Federation of Automatic Control (IFAC '08), Seoul, Korea, July 2008.
  21. C. Lawson and R. Hanson, Solving Least Squares Problems, Prentice-Hall Series in Automatic Computation, Prentice Hall, Englewood Cliffs, NJ, USA, 1974.
  22. S. Chen, S. Billings, and W. Luo, “Orthogonal least squares methods and their application to non-linear system identification,” International Journal of Control, vol. 50, no. 5, pp. 1873–1896, 1989.
  23. S. Chen, C. F. N. Cowan, and P. M. Grant, “Orthogonal least squares learning algorithm for radial basis function networks,” IEEE Transactions on Neural Networks, vol. 2, no. 2, pp. 302–309, 1991.
  24. S. Chen and J. Wigger, “Fast orthogonal least squares algorithm for efficient subset model selection,” IEEE Transactions on Signal Processing, vol. 43, no. 7, pp. 1713–1715, 1995.
  25. O. L. Mangasarian and D. R. Musicant, “Successive overrelaxation for support vector machines,” IEEE Transactions on Neural Networks, vol. 10, no. 5, pp. 1032–1037, 1999.
  26. K. Li, J. X. Peng, and E. Bai, “A two-stage algorithm for identification of nonlinear dynamic systems,” Automatica, vol. 42, no. 7, pp. 1189–1197, 2006.
  27. S. Fahlman and C. Lebiere, “The cascade-correlation learning architecture,” in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed., 1990.
  28. C. Chang and C. Lin, “LIBSVM: a library for support vector machines,” 2001, software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm/.
  29. K. Pelckmans, J. Suykens, T. van Gestel et al., “LS-SVMlab: a MATLAB/C toolbox for least squares support vector machines,” Tutorial, KULeuven-ESAT, Leuven, Belgium, 2002.
  30. J. Garcke, M. Griebel, and M. Thess, “Data mining with sparse grids,” Computing, vol. 67, no. 3, pp. 225–253, 2001.
  31. L. Breiman, “Arcing classifier (with discussion and a rejoinder by the author),” The Annals of Statistics, vol. 26, no. 3, pp. 801–849, 1998.
  32. S. Chen, X. Hong, and C. J. Harris, “Regression based D-optimality experimental design for sparse kernel density estimation,” Neurocomputing, vol. 73, no. 4–6, pp. 727–739, 2010.