Abstract
The imbalanced Extreme Learning Machine based on kernel density estimation (imELM-kde) is a recently proposed algorithm for imbalanced binary-class classification. By adjusting the real outputs of the training data with the intersection point of the two probability density functions (p.d.f.s) corresponding to the predictive outputs of the majority and minority classes, imELM-kde retrains the ELM that was built on the original training data, thereby improving the performance of the ELM-based imbalanced classifier. In this paper, we analyze the shortcomings of imELM-kde and propose an improved version. First, the Parzen window method used in imELM-kde can produce multiple intersection points between the p.d.f.s of the majority and minority classes. Second, it is unreasonable to update the real outputs with an intersection point, because the p.d.f.s are estimated from the predictive outputs. To address these shortcomings, we propose an imbalanced ELM based on normal density estimation (imELM-nde). In imELM-nde, the p.d.f.s of the predictive outputs of the majority and minority classes are estimated with normal density estimation, and their intersection point is used to update the predictive outputs rather than the real outputs. This makes the training of a probability density estimation-based imbalanced ELM simpler and more feasible. Comparative results show that the proposed imELM-nde outperforms unweighted ELM and imELM-kde on imbalanced binary-class classification problems.
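A key step described above is finding the intersection point of two estimated normal p.d.f.s. Under the normality assumption this reduces to a closed-form quadratic obtained by equating the two log-densities; the sketch below (function name and interface are illustrative, not from the paper) shows the computation:

```python
import numpy as np

def normal_intersections(mu1, s1, mu2, s2):
    """Solve N(mu1, s1^2).pdf(x) == N(mu2, s2^2).pdf(x) for x.

    Equating the two Gaussian log-densities yields the quadratic
    a*x^2 + b*x + c = 0 with the coefficients below.
    """
    a = 1.0 / (2 * s2**2) - 1.0 / (2 * s1**2)
    b = mu1 / s1**2 - mu2 / s2**2
    c = mu2**2 / (2 * s2**2) - mu1**2 / (2 * s1**2) + np.log(s2 / s1)
    if np.isclose(a, 0.0):
        # Equal variances: the quadratic degenerates and there is a
        # single crossing at the midpoint between the two means.
        return np.array([-c / b])
    return np.sort(np.roots([a, b, c]))
```

Note that two normal densities with unequal variances intersect at two points, whereas a Parzen-window estimate can cross its counterpart many more times; this is the multiplicity problem the abstract attributes to imELM-kde.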
Notes
- 1.
For simplicity, we don’t use the ridge regression-based ELM [6] in this paper. All the probability density estimation-based ELMs discussed below are designed based on ELM without the regularization factor \(C>0\).
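The unregularized ELM referred to in this note admits a very compact implementation: random input weights and biases, a sigmoid hidden layer, and output weights obtained from the Moore-Penrose pseudoinverse (with no factor \(C\)). A minimal sketch, with hypothetical function names not taken from the paper:

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=None):
    """Train a single-hidden-layer ELM without regularization:
    beta = pinv(H) @ T, where H is the random-feature hidden output."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden layer
    beta = np.linalg.pinv(H) @ T                     # Moore-Penrose solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Adding the regularization factor would replace the pseudoinverse step with, e.g., solving \((H^\top H + I/C)\beta = H^\top T\); the note states that this variant is deliberately not used here.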
- 2.
In [17], the authors did not specify how to map the predictive outputs into a one-dimensional space. Here, we used the mapping method from our proposed imELM-nde to deal with this issue.
- 3.
KEEL-dataset repository. http://www.keel.es/.
References
Chacko, B.P., Krishnan, V.R.V., Raju, G., Anto, P.B.: Handwritten character recognition using wavelet energy and extreme learning machine. Int. J. Mach. Learn. Cybern. 3(2), 149–161 (2012)
Deng, W., Zheng, Q., Chen, L.: Regularized extreme learning machine. In: Proceedings of 2009 IEEE Symposium on Computational Intelligence and Data Mining, pp. 389–395 (2009)
Fu, A.M., Dong, C.R., Wang, L.S.: An experimental study on stability and generalization of extreme learning machines. Int. J. Mach. Learn. Cybern. 6(1), 129–135 (2015)
Huang, G.B., Chen, L., Siew, C.K.: Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans. Neural Netw. 17(4), 879–892 (2006)
Huang, G.B., Wang, D.H., Lan, Y.: Extreme learning machines: a survey. Int. J. Mach. Learn. Cybern. 2(2), 107–122 (2011)
Huang, G.B., Zhou, H., Ding, X., Zhang, R.: Extreme learning machine for regression and multiclass classification. IEEE Trans. Syst. Man Cybern. Part B Cybern. 42(2), 513–529 (2012)
Huang, G.B., Zhu, Q.Y., Siew, C.K.: Extreme learning machine: theory and applications. Neurocomputing 70(1), 489–501 (2006)
Lin, S.B., Liu, X., Fang, J., Xu, Z.B.: Is extreme learning machine feasible? A theoretical assessment (Part II). IEEE Trans. Neural Netw. Learn. Syst. 26(1), 21–34 (2015)
Liu, P., Huang, Y.H., Meng, L., Gong, S.Y., Zhang, G.P.: Two-stage extreme learning machine for high-dimensional data. Int. J. Mach. Learn. Cybern. (2014). doi:10.1007/s13042-014-0292-7
Liu, X., Lin, S.B., Fang, J., Xu, Z.B.: Is extreme learning machine feasible? A theoretical assessment (Part I). IEEE Trans. Neural Netw. Learn. Syst. 26(1), 7–20 (2015)
Parzen, E.: On estimation of a probability density function and mode. Ann. Math. Stat. 33(3), 1065–1076 (1962)
Serre, D.: Matrices: Theory and Applications. Springer, New York (2002)
Toh, K.A.: Deterministic neural classification. Neural Comput. 20(6), 1565–1595 (2008)
Wand, M.P., Jones, M.C.: Kernel Smoothing. CRC Press, Boca Raton (1994)
Wang, X.Z., He, Y.L., Wang, D.D.: Non-Naive Bayesian classifiers for classification problems with continuous attributes. IEEE Trans. Cybern. 44(1), 21–39 (2014)
Wu, J., Wang, S.T., Chung, F.L.: Positive and negative fuzzy rule system, extreme learning machine and image classification. Int. J. Mach. Learn. Cybern. 2(4), 261–271 (2011)
Yang, J., Yu, H., Yang, X., Zuo, X.: Imbalanced extreme learning machine based on probability density estimation. In: Bikakis, A., Zheng, X. (eds.) MIWAI 2015. LNCS, vol. 9426, pp. 160–167. Springer, Heidelberg (2015). doi:10.1007/978-3-319-26181-2_15
Zhang, W.B., Ji, H.B.: Fuzzy extreme learning machine for classification. Electron. Lett. 49(7), 448–450 (2013)
Zhao, H.Y., Guo, X.Y., Wang, M.W., Li, T.L., Pang, C.Y., Georgakopoulos, D.: Analyze EEG signals with extreme learning machine based on PMIS feature selection. Int. J. Mach. Learn. Cybern. (2015). doi:10.1007/s13042-015-0378-x
Zong, W., Huang, G.B., Chen, Y.: Weighted extreme learning machine for imbalance learning. Neurocomputing 101, 229–242 (2013)
Acknowledgments
This work was supported by the China Postdoctoral Science Foundation (2015M572361 and 2016T90799), the Basic Research Project of the Knowledge Innovation Program in Shenzhen (JCYJ20150324140036825), and the National Natural Science Foundation of China (61503252, 61473194, 71371063, and 61473111).
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
He, Y., Ashfaq, R.A.R., Huang, J.Z., Wang, X. (2016). Imbalanced ELM Based on Normal Density Estimation for Binary-Class Classification. In: Cao, H., Li, J., Wang, R. (eds) Trends and Applications in Knowledge Discovery and Data Mining. PAKDD 2016. Lecture Notes in Computer Science(), vol 9794. Springer, Cham. https://doi.org/10.1007/978-3-319-42996-0_5
Print ISBN: 978-3-319-42995-3
Online ISBN: 978-3-319-42996-0