Imbalanced ELM Based on Normal Density Estimation for Binary-Class Classification

  • Conference paper
Trends and Applications in Knowledge Discovery and Data Mining (PAKDD 2016)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9794)

Abstract

The imbalanced Extreme Learning Machine based on kernel density estimation (imELM-kde) is a recent algorithm for imbalanced binary-class classification. By adjusting the real outputs of the training data with the intersection point of the two probability density functions (p.d.f.s) corresponding to the predictive outputs of the majority and minority classes, imELM-kde updates an ELM trained on the original training data and thus improves the performance of the ELM-based imbalanced classifier. In this paper, we analyze the shortcomings of imELM-kde and propose an improved version. The Parzen window method used in imELM-kde leads to multiple intersection points between the p.d.f.s of the majority and minority classes. In addition, it is unreasonable to update the real outputs with an intersection point, because the p.d.f.s are estimated from the predictive outputs. To address these shortcomings, we propose an imbalanced ELM based on normal density estimation (imELM-nde). In imELM-nde, the p.d.f.s of the predictive outputs corresponding to the majority and minority classes are computed with normal density estimation, and the intersection point is used to update the predictive outputs rather than the real outputs. This makes the training of probability-density-estimation-based imbalanced ELM simpler and more feasible. Comparative results show that the proposed imELM-nde outperforms unweighted ELM and imELM-kde on imbalanced binary-class classification problems.
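A key step described in the abstract is finding the intersection point of two normal p.d.f.s fitted to the predictive outputs of the two classes. The following is a minimal sketch of that computation (not the authors' code); the function name `normal_intersections` and the choice to solve the equal-density condition as a quadratic in closed form are our own. Unlike Parzen-window estimates, two normals intersect at no more than two points, which is the property the paper exploits.

```python
import numpy as np

def normal_intersections(mu1, s1, mu2, s2):
    """Return the x-values where N(mu1, s1^2) and N(mu2, s2^2) have
    equal density, by equating the log-densities and solving the
    resulting quadratic a*x^2 + b*x + c = 0."""
    if np.isclose(s1, s2):
        # Equal variances: the densities cross exactly once, at the midpoint.
        return np.array([(mu1 + mu2) / 2.0])
    a = 1.0 / (2 * s2**2) - 1.0 / (2 * s1**2)
    b = mu1 / s1**2 - mu2 / s2**2
    c = mu2**2 / (2 * s2**2) - mu1**2 / (2 * s1**2) - np.log(s1 / s2)
    return np.roots([a, b, c])
```

In practice, `mu` and `s` would be the sample mean and standard deviation of the ELM's predictive outputs on each class, and one of the (at most two) roots lying between the class means would be chosen as the adjustment threshold.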


Notes

  1.

    For simplicity, we do not use the ridge regression-based ELM [6] in this paper. All probability density estimation-based ELMs discussed below are designed based on ELM without the regularization factor \(C>0\).

  2.

    In [17], the authors did not provide a specific discussion of how to map the predictive outputs into a one-dimensional space. Here, we use the mapping method from our proposed imELM-nde to deal with this issue.

  3.

    KEEL-dataset repository. http://www.keel.es/.
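Note 1 refers to the basic ELM without the regularization factor \(C\): input weights and biases are drawn at random and only the output weights are solved for, via the Moore-Penrose pseudo-inverse. The sketch below illustrates that base learner under our own naming and hyperparameter choices (sigmoid activation, `n_hidden` units); it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, T, n_hidden=50):
    """Basic ELM: random hidden-layer parameters, output weights by
    least squares (Moore-Penrose pseudo-inverse), no regularization."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden outputs
    beta = np.linalg.pinv(H) @ T                     # output weights, no factor C
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

The predictive outputs `elm_predict(...)` are the quantities whose class-conditional densities the probability density estimation-based ELMs estimate.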

References

  1. Chacko, B.P., Krishnan, V.R.V., Raju, G., Anto, P.B.: Handwritten character recognition using wavelet energy and extreme learning machine. Int. J. Mach. Learn. Cybern. 3(2), 149–161 (2012)

  2. Deng, W., Zheng, Q., Chen, L.: Regularized extreme learning machine. In: Proceedings of 2009 IEEE Symposium on Computational Intelligence and Data Mining, pp. 389–395 (2009)

  3. Fu, A.M., Dong, C.R., Wang, L.S.: An experimental study on stability and generalization of extreme learning machines. Int. J. Mach. Learn. Cybern. 6(1), 129–135 (2015)

  4. Huang, G.B., Chen, L., Siew, C.K.: Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans. Neural Netw. 17(4), 879–892 (2006)

  5. Huang, G.B., Wang, D.H., Lan, Y.: Extreme learning machines: a survey. Int. J. Mach. Learn. Cybern. 2(2), 107–122 (2011)

  6. Huang, G.B., Zhou, H., Ding, X., Zhang, R.: Extreme learning machine for regression and multiclass classification. IEEE Trans. Syst. Man Cybern. Part B Cybern. 42(2), 513–529 (2012)

  7. Huang, G.B., Zhu, Q.Y., Siew, C.K.: Extreme learning machine: theory and applications. Neurocomputing 70(1), 489–501 (2006)

  8. Lin, S.B., Liu, X., Fang, J., Xu, Z.B.: Is extreme learning machine feasible? A theoretical assessment (Part II). IEEE Trans. Neural Netw. Learn. Syst. 26(1), 21–34 (2015)

  9. Liu, P., Huang, Y.H., Meng, L., Gong, S.Y., Zhang, G.P.: Two-stage extreme learning machine for high-dimensional data. Int. J. Mach. Learn. Cybern. (2014). doi:10.1007/s13042-014-0292-7

  10. Liu, X., Lin, S.B., Fang, J., Xu, Z.B.: Is extreme learning machine feasible? A theoretical assessment (Part I). IEEE Trans. Neural Netw. Learn. Syst. 26(1), 7–20 (2015)

  11. Parzen, E.: On estimation of a probability density function and mode. Ann. Math. Stat. 33(3), 1065–1076 (1962)

  12. Serre, D.: Matrices: Theory and Applications. Springer, New York (2002)

  13. Toh, K.A.: Deterministic neural classification. Neural Comput. 20(6), 1565–1595 (2008)

  14. Wand, M.P., Jones, M.C.: Kernel Smoothing. CRC Press, Boca Raton (1994)

  15. Wang, X.Z., He, Y.L., Wang, D.D.: Non-Naive Bayesian classifiers for classification problems with continuous attributes. IEEE Trans. Cybern. 44(1), 21–39 (2014)

  16. Wu, J., Wang, S.T., Chung, F.L.: Positive and negative fuzzy rule system, extreme learning machine and image classification. Int. J. Mach. Learn. Cybern. 2(4), 261–271 (2011)

  17. Yang, J., Yu, H., Yang, X., Zuo, X.: Imbalanced extreme learning machine based on probability density estimation. In: Bikakis, A., Zheng, X. (eds.) MIWAI 2015. LNCS, vol. 9426, pp. 160–167. Springer, Heidelberg (2015). doi:10.1007/978-3-319-26181-2_15

  18. Zhang, W.B., Ji, H.B.: Fuzzy extreme learning machine for classification. Electron. Lett. 49(7), 448–450 (2013)

  19. Zhao, H.Y., Guo, X.Y., Wang, M.W., Li, T.L., Pang, C.Y., Georgakopoulos, D.: Analyze EEG signals with extreme learning machine based on PMIS feature selection. Int. J. Mach. Learn. Cybern. (2015). doi:10.1007/s13042-015-0378-x

  20. Zong, W., Huang, G.B., Chen, Y.: Weighted extreme learning machine for imbalance learning. Neurocomputing 101, 229–242 (2013)

Acknowledgments

This work is supported by the China Postdoctoral Science Foundation (2015M572361 and 2016T90799), the Basic Research Project of Knowledge Innovation Program in Shenzhen (JCYJ20150324140036825), and the National Natural Science Foundation of China (61503252, 61473194, 71371063, and 61473111).

Author information

Corresponding author

Correspondence to Yulin He.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

He, Y., Ashfaq, R.A.R., Huang, J.Z., Wang, X. (2016). Imbalanced ELM Based on Normal Density Estimation for Binary-Class Classification. In: Cao, H., Li, J., Wang, R. (eds) Trends and Applications in Knowledge Discovery and Data Mining. PAKDD 2016. Lecture Notes in Computer Science, vol 9794. Springer, Cham. https://doi.org/10.1007/978-3-319-42996-0_5

  • DOI: https://doi.org/10.1007/978-3-319-42996-0_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-42995-3

  • Online ISBN: 978-3-319-42996-0
