Abstract and Applied Analysis
Volume 2013 (2013), Article ID 715683, 13 pages
http://dx.doi.org/10.1155/2013/715683
Research Article

Density Problem and Approximation Error in Learning Theory

Ding-Xuan Zhou

Department of Mathematics, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, China

Received 3 May 2013; Accepted 5 August 2013

Academic Editor: Yiming Ying

Copyright © 2013 Ding-Xuan Zhou. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We study the density problem and the approximation error of reproducing kernel Hilbert spaces for the purpose of learning theory. For a Mercer kernel $K$ on a compact metric space $(X, d)$, a characterization is given for the generated reproducing kernel Hilbert space (RKHS) $\mathcal{H}_K$ to be dense in $C(X)$. As a corollary, we show that the density always holds for convolution type kernels. Some estimates for the rate of convergence of interpolation schemes are presented for general Mercer kernels. These are then used to establish, for convolution type kernels, a quantitative analysis of the approximation error in learning theory. Finally, we show by the example of Gaussian kernels with varying variances that the approximation error can be improved when the value of the kernel parameter is chosen adaptively. This confirms the practice of choosing varying parameters that is often used in applications of learning theory.
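To fix ideas, the objects named in the abstract can be sketched in the standard notation of the learning-theory literature; the symbols and normalizations below follow common usage and are not taken verbatim from the body of the paper. A Mercer kernel is a continuous, symmetric function $K : X \times X \to \mathbb{R}$ that is positive semidefinite:
\[
  \sum_{i,j=1}^{m} c_i c_j \, K(x_i, x_j) \ge 0
  \qquad \text{for all } m \in \mathbb{N},\ x_1,\dots,x_m \in X,\ c_1,\dots,c_m \in \mathbb{R}.
\]
The RKHS it generates is the completion of the span of its kernel sections,
\[
  \mathcal{H}_K = \overline{\operatorname{span}}\{\, K_x := K(x,\cdot) : x \in X \,\},
  \qquad \langle K_x, K_y \rangle_K = K(x,y),
\]
and the density question asks when $\mathcal{H}_K$ is dense in $C(X)$ with respect to the uniform norm. A Gaussian kernel with variance parameter $\sigma > 0$, the convolution type example with varying parameter mentioned at the end of the abstract, can be written as
\[
  K_\sigma(x, y) = \exp\!\left(-\frac{\lVert x - y \rVert^2}{\sigma^2}\right)
\]
(one common parametrization; the normalization used in the paper may differ).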