Complexity
Volume 2019, Article ID 5937274, 17 pages
https://doi.org/10.1155/2019/5937274
Research Article

Two-Phase Incremental Kernel PCA for Learning Massive or Online Datasets

1School of Computer Science and Technology, Shandong Technology and Business University, Yantai, China
2Shandong Co-Innovation Center of Future Intelligent Computing, Yantai, China
3BASIRA Lab, Faculty of Computer and Informatics, Istanbul Technical University, Istanbul, Turkey
4School of Science and Engineering, Computing, University of Dundee, UK
5Department of Brain and Cognitive Engineering, Korea University, Seoul 02841, Republic of Korea
6School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an, China
7School of Computer Science and Engineering, Xidian University, Xi’an, China
8Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA

Correspondence should be addressed to Dinggang Shen; dgshen@med.unc.edu

Received 2 October 2018; Revised 17 December 2018; Accepted 8 January 2019; Published 11 February 2019

Guest Editor: Jose Garcia-Rodriguez

Copyright © 2019 Feng Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. I. T. Jolliffe, Principal Component Analysis, Springer-Verlag, New York, NY, USA, 1986.
  2. B. Schölkopf, A. Smola, and K.-R. Müller, “Nonlinear component analysis as a kernel eigenvalue problem,” Neural Computation, vol. 10, no. 5, pp. 1299–1319, 1998.
  3. B. Chen, J. Yang, B. Jeon, and X. Zhang, “Kernel quaternion principal component analysis and its application in RGB-D object recognition,” Neurocomputing, vol. 266, pp. 293–303, 2017.
  4. X. Deng and L. Wang, “Modified kernel principal component analysis using double-weighted local outlier factor and its application to nonlinear process monitoring,” ISA Transactions, vol. 72, pp. 218–228, 2018.
  5. Y. Yang, W. Sheng, Y. Han, and X. Ma, “Multi-beam pattern synthesis algorithm based on kernel principal component analysis and semi-definite relaxation,” IET Communications, vol. 12, no. 1, pp. 82–95, 2018.
  6. K. I. Kim, M. O. Franz, and B. Schölkopf, “Iterative kernel principal component analysis for image modeling,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 9, pp. 1351–1366, 2005.
  7. A. R. Teixeira, A. M. Tomé, K. Stadlthanner, and E. W. Lang, “KPCA denoising and the pre-image problem revisited,” Digital Signal Processing, vol. 18, no. 4, pp. 568–580, 2008.
  8. W. Soh, H. Kim, and B.-J. Yum, “Application of kernel principal component analysis to multi-characteristic parameter design problems,” Annals of Operations Research, vol. 263, no. 1-2, pp. 69–91, 2018.
  9. R. Rosipal and M. Girolami, “An expectation-maximization approach to nonlinear component analysis,” Neural Computation, vol. 13, no. 3, pp. 505–510, 2001.
  10. S. Günter, N. N. Schraudolph, and S. V. N. Vishwanathan, “Fast iterative kernel principal component analysis,” Journal of Machine Learning Research, vol. 8, pp. 1893–1918, 2007.
  11. W. Zheng, C. Zou, and L. Zhao, “An improved algorithm for kernel principal component analysis,” Neural Processing Letters, vol. 22, no. 1, pp. 49–56, 2005.
  12. V. Franc and V. Hlaváč, “Greedy algorithm for a training set reduction in the kernel methods,” in Proceedings of the 10th International Conference on Computer Analysis of Images and Patterns, vol. 2756 of Lecture Notes in Computer Science, pp. 426–433, Springer, Groningen, Netherlands, August 2003.
  13. V. Franc, Optimization Algorithms for Kernel Methods [Ph.D. dissertation], Center for Machine Perception, Czech Technical University, Prague, Czech Republic, 2005.
  14. T.-J. Chin and D. Suter, “Incremental kernel PCA for efficient non-linear feature extraction,” in Proceedings of the 17th British Machine Vision Conference, pp. 4–7, Edinburgh, Scotland, September 2006.
  15. T.-J. Chin and D. Suter, “Incremental kernel principal component analysis,” IEEE Transactions on Image Processing, vol. 16, no. 6, pp. 1662–1674, 2007.
  16. B.-J. Kim and I.-K. Kim, “Incremental nonlinear PCA for classification,” in Proceedings of the European Conference on Knowledge Discovery in Databases (PKDD), vol. 3202 of Lecture Notes in Computer Science, pp. 291–300, Springer, 2004.
  17. B.-J. Kim, “Active visual learning and recognition using incremental kernel PCA,” in Proceedings of the 18th Australian Joint Conference on Advances in Artificial Intelligence (AI’05), vol. 3809 of Lecture Notes in Computer Science, pp. 585–592, Springer, 2005.
  18. P. M. Hall, D. Marshall, and R. R. Martin, “Incremental eigenanalysis for classification,” in Proceedings of the British Machine Vision Conference, pp. 286–295, 1998.
  19. S. Kimura, S. Ozawa, and S. Abe, “Incremental kernel PCA for online learning of feature space,” in Proceedings of the 2005 International Conference on Computational Intelligence for Modelling, Control and Automation, vol. 1, pp. 595–600, Vienna, Austria, November 2005.
  20. Y. Takeuchi, S. Ozawa, and S. Abe, “An efficient incremental kernel principal component analysis for online feature selection,” in Proceedings of the 2007 International Joint Conference on Neural Networks, pp. 2346–2351, Orlando, FL, USA, August 2007.
  21. S. Ozawa, Y. Takeuchi, and S. Abe, “A fast incremental kernel principal component analysis for online feature extraction,” in Proceedings of the Pacific Rim International Conference on Trends in Artificial Intelligence, vol. 6230 of Lecture Notes in Computer Science, pp. 487–497, Springer, 2010.
  22. T. Tokumoto and S. Ozawa, “A fast incremental kernel principal component analysis for learning stream of data chunks,” in Proceedings of the 2011 International Joint Conference on Neural Networks (IJCNN), pp. 2881–2888, San Jose, CA, USA, July 2011.
  23. A. A. Joseph and S. Ozawa, “A fast incremental kernel principal component analysis for data streams,” in Proceedings of the 2014 International Joint Conference on Neural Networks (IJCNN), Beijing, China, July 2014.
  24. A. A. Joseph, T. Tokumoto, and S. Ozawa, “Online feature extraction based on accelerated kernel principal component analysis for data stream,” Evolving Systems, vol. 7, no. 1, pp. 15–27, 2016.
  25. F. Hallgren and P. Northrop, “Incremental kernel PCA and the Nyström method,” 2018, https://arxiv.org/abs/1802.00043.
  26. G. Baudat and F. Anouar, “Kernel-based methods and function approximation,” in Proceedings of the International Joint Conference on Neural Networks (IJCNN'01), pp. 1244–1249, Washington, DC, USA, July 2001.
  27. H. Zhao, P. C. Yuen, and J. T. Kwok, “A novel incremental principal component analysis and its application for face recognition,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 36, no. 4, pp. 873–886, 2006.
  28. J. Weng, Y. Zhang, and W.-S. Hwang, “Candid covariance-free incremental principal component analysis,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 8, pp. 1034–1040, 2003.
  29. S. Nicole, “Feedforward neural networks for principal components extraction,” Computational Statistics & Data Analysis, vol. 33, no. 4, pp. 425–437, 2000.
  30. T. D. Sanger, “Optimal unsupervised learning in a single-layer linear feedforward neural network,” Neural Networks, vol. 2, no. 6, pp. 459–473, 1989.
  31. E. Oja, “A simplified neuron model as a principal component analyzer,” Journal of Mathematical Biology, vol. 15, no. 3, pp. 267–273, 1982.
  32. Y. Li, “On incremental and robust subspace learning,” Pattern Recognition, vol. 37, no. 7, pp. 1509–1518, 2004.
  33. M. Artač, M. Jogan, and A. Leonardis, “Incremental PCA for on-line visual learning and recognition,” in Proceedings of the 16th International Conference on Pattern Recognition, pp. 781–784, Quebec City, Canada, 2002.
  34. S. Ozawa, S. Pang, and N. Kasabov, “A modified incremental principal component analysis for on-line learning of feature space and classifier,” in Proceedings of the 8th Pacific Rim International Conference on Artificial Intelligence, pp. 231–240, Auckland, New Zealand, 2004.
  35. R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, New York, NY, USA, 2013.
  36. S. Ling, X. Cheng, and T. Jiang, “An algorithm for coneigenvalues and coneigenvectors of quaternion matrices,” Advances in Applied Clifford Algebras, vol. 25, no. 2, pp. 377–384, 2015.
  37. C. Han, Y. Wang, and G. He, “On the convergence of asynchronous parallel algorithm for large-scale linearly constrained minimization problem,” Applied Mathematics and Computation, vol. 211, no. 2, pp. 434–441, 2009.
  38. S. Mika, B. Schölkopf, and A. J. Smola, “Kernel PCA and de-noising in feature space,” in Advances in Neural Information Processing Systems, pp. 524–536, MIT Press, Cambridge, MA, USA, 1999.