Mathematical Problems in Engineering
Volume 2017, Article ID 4191789, 12 pages
https://doi.org/10.1155/2017/4191789
Research Article

Reconstruct the Support Vectors to Improve LSSVM Sparseness for Mill Load Prediction

State Key Laboratory of Electrical Insulation and Power Equipment, School of Electrical Engineering, Xi’an Jiaotong University, Xi’an, China

Correspondence should be addressed to Jianquan Shi; shijianquan46@126.com

Received 24 October 2016; Revised 14 May 2017; Accepted 25 May 2017; Published 5 July 2017

Academic Editor: Erik Cuevas

Copyright © 2017 Gangquan Si et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
