Abstract:
Extreme Learning Machine (ELM) works for "generalized" single hidden-layer feedforward networks (SLFNs), but the hidden layer (also called the feature mapping) in ELM need not be tuned. The Extreme Support Vector Machine (ESVM), which combines the Support Vector Machine (SVM) with an ELM kernel, usually offers good predictive capability, and its training time is shorter than that of SVM in most cases. However, estimating the regularization parameter of ESVM is very time-consuming. Moreover, the effects of the variance of the hidden-layer weights and of the number of hidden neurons on ESVM remain unclear. Generalized Cross-Validation (GCV) is widely used in statistics because it can efficiently estimate the ridge parameter without estimating the variance of the errors. In this work, we study the connection between ESVM and GCV. Specifically, we cast the computation of the separating plane in ESVM as a ridge regression problem and propose to use GCV to estimate the regularization parameter of ESVM. Experimental results show that GCV significantly improves the efficiency of ESVM without loss of accuracy. In addition, the regularization parameter estimated by GCV helps to analyze how the variance of the hidden-layer weights and the number of hidden neurons affect the performance of ESVM.
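The following is a minimal sketch of the idea the abstract describes, not the authors' implementation: map inputs through a random ELM hidden layer, treat the separating plane as a ridge regression on those features, and pick the regularization parameter by minimizing the GCV score GCV(lam) = n * ||y - S_lam y||^2 / (n - tr(S_lam))^2, computed cheaply from one SVD. The function names, the sigmoid activation, and the lambda grid are illustrative assumptions.

```python
import numpy as np

def elm_features(X, n_hidden, sigma=1.0, seed=None):
    """Random ELM feature mapping: sigmoid of a random affine projection.
    sigma is the standard deviation of the hidden-layer weights (one of the
    quantities whose effect the paper analyzes)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, sigma, size=(X.shape[1], n_hidden))
    b = rng.normal(0.0, sigma, size=n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def gcv_ridge(Phi, y, lambdas):
    """Choose the ridge parameter for min ||Phi w - y||^2 + lam ||w||^2
    by Generalized Cross-Validation. One SVD of Phi gives the residual
    sum of squares and tr(S_lam) for every candidate lam."""
    n = Phi.shape[0]
    U, s, _ = np.linalg.svd(Phi, full_matrices=False)
    Uty = U.T @ y
    best_lam, best_score = None, np.inf
    for lam in lambdas:
        d = s**2 / (s**2 + lam)      # shrinkage factor per singular direction
        fit = U @ (d * Uty)          # fitted values S_lam @ y
        rss = np.sum((y - fit) ** 2)
        df = np.sum(d)               # effective degrees of freedom tr(S_lam)
        score = n * rss / (n - df) ** 2
        if score < best_score:
            best_lam, best_score = lam, score
    return best_lam, best_score

# Toy usage: binary labels in {-1, +1}, 200 hidden neurons.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=100))
Phi = elm_features(X, n_hidden=200, sigma=1.0, seed=0)
lam, _ = gcv_ridge(Phi, y, lambdas=np.logspace(-6, 3, 30))
```

Because the SVD is computed once and reused for every candidate lambda, the grid search costs far less than refitting the model per lambda, which is the efficiency gain the abstract attributes to GCV.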
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015