Journal of Applied Mathematics
Volume 2013 (2013), Article ID 427462, 8 pages
http://dx.doi.org/10.1155/2013/427462
Research Article

Multiple Kernel Spectral Regression for Dimensionality Reduction

School of Computer Science and Technology, China University of Mining and Technology, Xuzhou, Jiangsu 221116, China

Received 13 July 2013; Revised 18 September 2013; Accepted 20 September 2013

Academic Editor: Zhihua Zhang

Copyright © 2013 Bing Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Traditional manifold learning algorithms, such as locally linear embedding, Isomap, and Laplacian eigenmap, provide embedding results only for the training samples. To address this out-of-sample extension problem, spectral regression (SR) casts the learning of an embedding function as a regression problem, thereby avoiding the eigen-decomposition of dense matrices. Motivated by the effectiveness of SR, we incorporate multiple kernel learning (MKL) into SR for dimensionality reduction. The proposed approach, termed MKL-SR, seeks an embedding function in the Reproducing Kernel Hilbert Space (RKHS) induced by multiple base kernels and further improves the performance of kernel-based SR (KSR). Moreover, MKL-SR can be applied in supervised, unsupervised, and semi-supervised settings. Experimental results on supervised and semi-supervised classification demonstrate the effectiveness and efficiency of the proposed algorithm.
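
As an orientation for the reader, a minimal sketch of the two-step spectral regression procedure on which MKL-SR builds is given below. The notation (affinity matrix $W$, degree matrix $D$, base kernels $K_m$ with nonnegative weights $\beta_m$, and regularization parameter $\delta$) follows the standard SR and MKL literature and is assumed here only for illustration; the exact MKL-SR formulation and its optimization are developed in the body of the paper.

Step 1 (spectral embedding). Obtain the low-dimensional responses $y$ from the graph eigenproblem
\[ W y = \lambda D y, \]
where $W$ is the affinity matrix of the data graph and $D$ is its diagonal degree matrix.

Step 2 (regression in the RKHS). With a combined kernel $K = \sum_{m=1}^{M} \beta_m K_m$, $\beta_m \geq 0$, an embedding function of the form $f(x) = \sum_{i=1}^{n} \alpha_i K(x_i, x)$ is obtained by solving
\[ \min_{\alpha} \; \lVert K \alpha - y \rVert^{2} + \delta \, \alpha^{\top} K \alpha, \]
whose minimizer satisfies $(K + \delta I)\alpha = y$. In this way, only a regularized linear system is solved instead of an eigen-decomposition of a dense matrix.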