Mathematical Problems in Engineering
Volume 2013, Article ID 712437, 12 pages
http://dx.doi.org/10.1155/2013/712437
Research Article

Efficient Model Selection for Sparse Least-Square SVMs

1School of Mechanical and Electrical Engineering, Jiaxing University, Jiaxing 314001, China
2School of Electronics, Electrical Engineering and Computer Science, Queen's University of Belfast, Belfast BT9 5AH, UK
3School of Computer Science and IT, University of Nottingham, Nottingham NG8 1BB, UK

Received 11 April 2013; Revised 13 June 2013; Accepted 19 June 2013

Academic Editor: Ker-Wei Yu

Copyright © 2013 Xiao-Lei Xia et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
