TY - JOUR
T1 - Hessian semi-supervised extreme learning machine
AU - Krishnasamy, Ganesh
AU - Paramesran, Raveendran
N1 - Funding Information:
This work was supported by the HIR-MOHE Grant No. UM.C/625/1/HIR/MOHE/ENG/42.
Publisher Copyright:
© 2016 Elsevier B.V.
PY - 2016/9/26
Y1 - 2016/9/26
N2 - Extreme learning machine (ELM) has emerged as an efficient and effective learning algorithm for classification and regression tasks. Most of the existing research on ELMs focuses on supervised learning. Recently, researchers have extended ELMs to semi-supervised learning, exploiting both labeled and unlabeled data in order to enhance learning performance. They have incorporated Laplacian regularization to determine the geometry of the underlying manifold. However, Laplacian regularization lacks extrapolating power and biases the solution towards a constant function. In this paper, we present a novel algorithm called Hessian semi-supervised ELM (HSS-ELM) to enhance the semi-supervised learning of ELM. Unlike Laplacian regularization, Hessian regularization favors functions whose values vary linearly along the geodesic distance and preserves the local manifold structure well. This leads to good extrapolating power. Furthermore, HSS-ELM maintains almost all the advantages of the traditional ELM, such as significant training efficiency and straightforward implementation for multiclass classification problems. The proposed algorithm is tested on publicly available data sets. The experimental results demonstrate that our proposed algorithm is competitive with state-of-the-art semi-supervised learning algorithms in terms of accuracy. Additionally, HSS-ELM requires remarkably less training time compared to semi-supervised SVMs/regularized least-squares methods.
AB - Extreme learning machine (ELM) has emerged as an efficient and effective learning algorithm for classification and regression tasks. Most of the existing research on ELMs focuses on supervised learning. Recently, researchers have extended ELMs to semi-supervised learning, exploiting both labeled and unlabeled data in order to enhance learning performance. They have incorporated Laplacian regularization to determine the geometry of the underlying manifold. However, Laplacian regularization lacks extrapolating power and biases the solution towards a constant function. In this paper, we present a novel algorithm called Hessian semi-supervised ELM (HSS-ELM) to enhance the semi-supervised learning of ELM. Unlike Laplacian regularization, Hessian regularization favors functions whose values vary linearly along the geodesic distance and preserves the local manifold structure well. This leads to good extrapolating power. Furthermore, HSS-ELM maintains almost all the advantages of the traditional ELM, such as significant training efficiency and straightforward implementation for multiclass classification problems. The proposed algorithm is tested on publicly available data sets. The experimental results demonstrate that our proposed algorithm is competitive with state-of-the-art semi-supervised learning algorithms in terms of accuracy. Additionally, HSS-ELM requires remarkably less training time compared to semi-supervised SVMs/regularized least-squares methods.
KW - Extreme learning machine
KW - Hessian regularization
KW - Manifold learning
KW - Semi-supervised learning
UR - http://www.scopus.com/inward/record.url?scp=84975156977&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2016.05.039
DO - 10.1016/j.neucom.2016.05.039
M3 - Article
AN - SCOPUS:84975156977
SN - 0925-2312
VL - 207
SP - 560
EP - 567
JO - Neurocomputing
JF - Neurocomputing
ER -