ISRN Artificial Intelligence
Volume 2012 (2012), Article ID 847305, 19 pages
http://dx.doi.org/10.5402/2012/847305
Review Article

Neural Network Implementations for PCA and Its Extensions

1Enjoyor Labs, Enjoyor Inc., Hangzhou 310030, China
2Faculty of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China
3Department of Electrical and Computer Engineering, Concordia University, Montreal, QC, Canada H3G 1M8

Received 8 April 2012; Accepted 14 June 2012

Academic Editors: C. Kotropoulos and B. Schuller

Copyright © 2012 Jialin Qiu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Many information processing problems can be transformed into some form of eigenvalue or singular value problem, and eigenvalue decomposition (EVD) and singular value decomposition (SVD) are usually used to solve them. In this paper, we give an introduction to various neural network implementations and algorithms for principal component analysis (PCA) and its extensions. PCA is a statistical method that is directly related to EVD and SVD. Minor component analysis (MCA) is a variant of PCA that is useful for solving total least squares (TLS) problems. The algorithms are typical unsupervised learning methods. Some other neural network models for feature extraction, such as localized methods, complex-domain methods, and generalized EVD and SVD, are also described. Topics associated with PCA, such as independent component analysis (ICA) and linear discriminant analysis (LDA), are mentioned in passing in the conclusion. These methods are useful in adaptive signal processing, blind signal separation (BSS), pattern recognition, and information compression.
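As a minimal illustration of the connection the abstract draws between PCA, EVD, and unsupervised neural learning, the sketch below (not taken from the paper; the data, learning rate, and iteration count are assumptions) extracts the first principal component of synthetic data in two ways: by eigenvalue decomposition of the sample covariance matrix, and by a single linear neuron trained with Oja's rule, one of the classical Hebbian-type PCA learning rules reviewed in articles of this kind.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic zero-mean data with one dominant direction of variance.
    X = rng.normal(size=(1000, 3)) @ np.diag([3.0, 1.0, 0.3])
    X -= X.mean(axis=0)

    # PCA via eigenvalue decomposition of the sample covariance matrix.
    C = X.T @ X / len(X)
    eigvals, eigvecs = np.linalg.eigh(C)
    pc1_evd = eigvecs[:, -1]              # eigenvector of the largest eigenvalue

    # Oja's rule: the neuron's weight vector converges to that same eigenvector.
    w = rng.normal(size=3)
    w /= np.linalg.norm(w)
    eta = 0.01                            # learning rate (assumed value)
    for epoch in range(50):
        for x in X:
            y = w @ x                     # neuron output
            w += eta * y * (x - y * w)    # Hebbian update with implicit normalization

    # Up to sign, the learned weights match the EVD-based principal component.
    print(np.abs(pc1_evd @ w))            # close to 1.0

The printed absolute inner product approaches 1, showing that the online, unsupervised rule recovers the same subspace as the batch EVD computation.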