Scientific Programming
Volume 2018 (2018), Article ID 3732120, 7 pages
Research Article

An Incremental Optimal Weight Learning Machine of Single-Layer Neural Networks

1School of Computer & Computing Science, Zhejiang University City College, Hangzhou 310015, China
2College of Engineering, Lishui University, Lishui 323000, China
3School of Electronics and Information, Zhejiang University of Media and Communications, Hangzhou 310015, China

Correspondence should be addressed to Cheng-Bo Lu

Received 12 October 2017; Revised 1 January 2018; Accepted 11 January 2018; Published 1 March 2018

Academic Editor: Wenbing Zhao

Copyright © 2018 Hai-Feng Ke et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


An optimal weight learning machine with growth of hidden nodes and incremental learning (OWLM-GHNIL) is proposed, which adds random hidden nodes to single-hidden-layer feedforward networks (SLFNs) one by one or group by group. As the network grows, the input weights and output weights are updated incrementally, so the conventional optimal weight learning machine (OWLM) can be implemented efficiently. Simulation results and statistical tests demonstrate that OWLM-GHNIL achieves better generalization performance than other incremental-type algorithms.
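The growth scheme sketched in the abstract, adding random hidden nodes to an SLFN and refitting the output weights after each addition, can be illustrated as follows. This is a minimal sketch only: it uses sigmoid hidden nodes and a full least-squares refit at each step, whereas the paper's OWLM-GHNIL derives an incremental weight update and also adjusts input weights; the function names and parameters here are illustrative, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def grow_slfn(X, y, max_nodes=30, rng=None):
    """Grow an SLFN one random hidden node at a time.

    Illustrative sketch: after each node is added, the output
    weights are refit by least squares. (OWLM-GHNIL instead
    updates the weights incrementally, avoiding a full refit.)
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    W = np.empty((0, d))   # input weights, one row per hidden node
    b = np.empty(0)        # hidden-node biases
    beta = np.empty(0)     # output weights
    for _ in range(max_nodes):
        # add one random hidden node (random input weights and bias)
        W = np.vstack([W, rng.standard_normal((1, d))])
        b = np.append(b, rng.standard_normal())
        H = sigmoid(X @ W.T + b)               # hidden-layer output matrix
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refit output weights
    return W, b, beta

def predict(X, W, b, beta):
    return sigmoid(X @ W.T + b) @ beta

# toy regression target: y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = grow_slfn(X, y, max_nodes=30, rng=0)
mse = np.mean((predict(X, W, b, beta) - y) ** 2)
```

Refitting from scratch after every added node costs a full least-squares solve per step; the incremental update in OWLM-GHNIL is precisely what removes that redundancy as the network grows.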