Computational Intelligence and Neuroscience
Volume 2017, Article ID 3405463, 11 pages
Research Article

Improving Classification Performance through an Advanced Ensemble Based Heterogeneous Extreme Learning Machines

1School of Computer and Communication Engineering, University of Science and Technology Beijing (USTB), Beijing 100083, China
2Beijing Key Laboratory of Knowledge Engineering for Materials Science, Beijing 100083, China
3Department of Electrical Engineering, COMSATS Institute of Information Technology Abbottabad, Abbottabad, Pakistan

Correspondence should be addressed to Dezheng Zhang; zdzchina@126.com and Xiong Luo; xluo@ustb.edu.cn

Received 26 December 2016; Revised 25 March 2017; Accepted 9 April 2017; Published 4 May 2017

Academic Editor: Pietro Aricò

Copyright © 2017 Adnan O. M. Abuassba et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Extreme Learning Machine (ELM) is a fast-learning algorithm for single-hidden-layer feedforward neural networks (SLFNs). It often achieves good generalization performance, but it may overfit the training data when it has more hidden nodes than needed. To improve generalization, we adopt a heterogeneous ensemble approach and propose an Advanced ELM Ensemble (AELME) for classification, which combines Regularized-ELM, L2-norm-optimized ELM (ELML2), and Kernel-ELM. The ensemble is constructed by training a randomly chosen ELM classifier on a subset of the training data selected through random resampling. The proposed AELM-Ensemble is evolved using an objective function that increases both diversity and accuracy within the final ensemble. Finally, the class label of unseen data is predicted by majority vote. Splitting the training data into subsets and incorporating heterogeneous ELM classifiers yield higher prediction accuracy, better generalization, and fewer base classifiers than other models (Adaboost, Bagging, Dynamic ELM ensemble, data-splitting ELM ensemble, and ELM ensemble). The validity of AELME is confirmed through classification on several real-world benchmark datasets.
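The construction described above — train each randomly chosen ELM on a resampled subset, then combine member predictions by majority vote — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses only a ridge-regularized ELM as the base learner (the paper's pool also includes ELML2 and Kernel-ELM, and its diversity/accuracy objective is omitted), and all class names, hyperparameter ranges, and the heterogeneity scheme (varying hidden size and regularization) are assumptions for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RegularizedELM:
    """Single-hidden-layer ELM with ridge-regularized output weights (sketch)."""
    def __init__(self, n_hidden=50, C=1.0, rng=None):
        self.n_hidden = n_hidden
        self.C = C  # regularization strength (assumed parameterization)
        self.rng = rng or np.random.default_rng()

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                   # one-hot target matrix
        # Random input weights and biases are fixed, never trained (ELM principle)
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = sigmoid(X @ self.W + self.b)           # hidden-layer output matrix
        # beta = (H^T H + I/C)^{-1} H^T T : regularized least-squares solution
        A = H.T @ H + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, H.T @ T)
        return self

    def predict(self, X):
        return np.argmax(sigmoid(X @ self.W + self.b) @ self.beta, axis=1)

def aelme_fit(X, y, n_members=7, subset_frac=0.8, rng=None):
    """Train a pool of ELMs on random resamples of the training data."""
    rng = rng or np.random.default_rng(0)
    members = []
    for _ in range(n_members):
        # Random resampling: each member sees a different training subset
        idx = rng.choice(len(X), size=int(subset_frac * len(X)), replace=True)
        # Heterogeneity here = varying hidden size and regularization (assumed)
        elm = RegularizedELM(n_hidden=int(rng.integers(30, 80)),
                             C=10.0 ** int(rng.integers(-2, 3)), rng=rng)
        members.append(elm.fit(X[idx], y[idx]))
    return members

def aelme_predict(members, X):
    """Combine member predictions by majority vote."""
    votes = np.stack([m.predict(X) for m in members])  # (n_members, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

In this sketch the vote is unweighted, matching the plain majority-vote rule stated in the abstract; each member's randomness comes from both its resampled subset and its randomly drawn hidden-layer parameters.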