Advances in Artificial Neural Systems
Volume 2010 (2010), Article ID 597373, 6 pages
Research Article

OP-KNN: Method and Applications

1Department of Information and Computer Science, Aalto University School of Science and Technology, FI-00076 Aalto, Finland
2Department of Computer Technology and Architecture, University of Granada, 18017 Granada, Spain
3Department GEA, University of Lille 1, 59653 Villeneuve d'Ascq Cedex, France

Received 6 October 2009; Revised 25 January 2010; Accepted 2 February 2010

Academic Editor: Songcan Chen

Copyright © 2010 Qi Yu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


This paper presents a methodology named Optimally Pruned K-Nearest Neighbors (OP-KNN), which competes with state-of-the-art methods while remaining fast. It builds a single-hidden-layer feedforward neural network that uses K-Nearest Neighbors as kernels to perform regression. Multiresponse Sparse Regression (MRSR) is used to rank each kth nearest neighbor, and Leave-One-Out estimation is then used to select the optimal number of neighbors and to estimate the generalization performance. Because the computational time of this method is small, this paper also presents a strategy that uses OP-KNN to perform variable selection, which is tested successfully on eight real-life data sets from different application fields. In summary, the most significant characteristic of this method is that it provides good performance and a comparatively simple model at an extremely high learning speed.
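The pipeline described above can be sketched in a few lines of NumPy. This is a simplified illustration, not the authors' implementation: the hypothetical helper `op_knn_fit` builds the "KNN kernel" matrix (column k holds each sample's kth-nearest-neighbor target), keeps the neighbors in their natural order 1..k rather than ranking them with MRSR, and uses the PRESS formula to compute the Leave-One-Out error that selects the number of neighbors.

```python
import numpy as np

def op_knn_fit(X, y, max_k=10):
    """Simplified OP-KNN sketch (hypothetical helper, not the paper's code).

    Column k of H holds, for each sample, the target of its kth nearest
    neighbor (the sample itself is excluded). Neighbors are kept in their
    natural order 1..k (the paper ranks them with MRSR instead), and the
    prefix length with the lowest Leave-One-Out (PRESS) error is selected.
    """
    n = X.shape[0]
    # Pairwise Euclidean distances; mask self-distances.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    order = np.argsort(d, axis=1)        # nearest neighbor first
    H = y[order[:, :max_k]]              # n x max_k matrix of neighbor targets

    best = None
    for k in range(1, max_k + 1):
        Hk = H[:, :k]
        # Output-layer weights by linear least squares.
        beta, *_ = np.linalg.lstsq(Hk, y, rcond=None)
        # PRESS: LOO residuals from the hat-matrix diagonal.
        hat_diag = np.clip(np.diag(Hk @ np.linalg.pinv(Hk)), 0.0, 1.0 - 1e-12)
        loo_resid = (y - Hk @ beta) / (1.0 - hat_diag)
        err = np.mean(loo_resid ** 2)
        if best is None or err < best[0]:
            best = (err, k, beta)
    return best  # (loo_mse, selected_k, output_weights)
```

The PRESS trick is what keeps the method fast: the LOO error is obtained from a single least-squares fit per candidate k, with no retraining.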