NBWELM: naive Bayesian based weighted extreme learning machine

  • Original Article
International Journal of Machine Learning and Cybernetics

Abstract

Weighted extreme learning machines (WELMs) aim to find a better tradeoff between empirical and structural risks, and thus achieve good generalization performance, especially when they are used to deal with imbalanced classification problems. Existing weighting strategies assign distribution-independent weight matrices to WELMs, i.e., the weights do not take the probabilistic information of the samples into account. As a result, WELM amplifies the effect of outliers to some extent. In this paper, a naive Bayesian based WELM (NBWELM) is proposed, in which the weights are determined with a flexible naive Bayesian (FNB) classifier. By calculating the posterior probability of each sample, NBWELM can not only handle outliers effectively but also simultaneously take into account two different kinds of weighting information, i.e., the training error used in the weighted regularized ELM (WRELM) and the class distribution used in Zong et al.’s WELM (ZWELM). Experimental results on 45 KEEL and UCI datasets show that the proposed NBWELM further improves the generalization capability of WELM and thus achieves higher classification accuracy than WRELM and ZWELM. Meanwhile, NBWELM does not noticeably increase the computational complexity of WELM, owing to the simplicity of FNB.
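To make the weighting scheme concrete, the sketch below (not taken from the paper) shows a minimal weighted regularized ELM in Python whose per-sample weights are naive-Bayes posterior probabilities P(y_i | x_i). It uses scikit-learn's GaussianNB as a simple stand-in for the flexible (kernel-density-based) naive Bayes classifier used in NBWELM, and the function names, the way the posteriors enter the weight matrix, and all parameter values are illustrative assumptions rather than the authors' exact formulation.

    import numpy as np
    from sklearn.naive_bayes import GaussianNB  # stand-in for the paper's flexible (KDE-based) naive Bayes

    def nbwelm_train(X, y, n_hidden=100, C=1.0, seed=0):
        # Illustrative sketch: weighted regularized ELM with naive-Bayes posteriors as sample weights.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        classes = np.unique(y)

        # 1) Weight each sample by the posterior probability of its own class, P(y_i | x_i);
        #    samples the naive Bayes model finds atypical (likely outliers) get small weights.
        nb = GaussianNB().fit(X, y)
        post = nb.predict_proba(X)                            # columns follow sorted class labels
        w = post[np.arange(n), np.searchsorted(classes, y)]

        # 2) Random hidden layer with sigmoid activation, as in a standard ELM.
        A = rng.normal(size=(d, n_hidden))
        b = rng.normal(size=n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ A + b)))

        # 3) One-hot targets and weighted regularized least squares:
        #    beta = (H^T W H + I / C)^{-1} H^T W T, with W = diag(w).
        T = (y[:, None] == classes[None, :]).astype(float)
        HtW = H.T * w
        beta = np.linalg.solve(HtW @ H + np.eye(n_hidden) / C, HtW @ T)
        return A, b, beta, classes

    def nbwelm_predict(X, model):
        A, b, beta, classes = model
        H = 1.0 / (1.0 + np.exp(-(X @ A + b)))
        return classes[np.argmax(H @ beta, axis=1)]

Intuitively, outlying or mislabeled samples receive small posteriors and therefore contribute little to the least-squares fit, while samples that are typical of their class, including those from a minority class, keep their influence.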


Notes

  1. http://sci2s.ugr.es/keel/category.php?cat=clas.

  2. http://archive.ics.uci.edu/ml/.

  3. http://www.ntu.edu.sg/home/egbhuang/.

  4. http://www.cs.waikato.ac.nz/ml/weka/.

References

  1. An L, Bhanu B (2012) Image super-resolution by extreme learning machine. In: Proceedings of the IEEE International Conference on Image Processing, pp 2209–2212

  2. Barnett V, Lewis T (1994) Outliers in statistical data. Wiley, Chichester

  3. Choi K, Toh KA, Byun H (2012) Incremental face recognition for large-scale social network services. Pattern Recogn 45(8):2868–2883

  4. Deng W, Zheng Q, Chen L (2009) Regularized extreme learning machine. In: Proceedings of the IEEE Symposium on Computational Intelligence and Data Mining, pp 389–395

  5. Fu AM, Dong CR, Wang LS (2014) An experimental study on stability and generalization of extreme learning machines. Int J Mach Learn Cybern. doi:10.1007/s13042-014-0238-0

  6. Hall M, Frank E, Holmes G, Pfahringer B, Reutemann P, Witten IH (2009) The WEKA data mining software: an update. ACM SIGKDD Explor Newsl 11(1):10–18

  7. Hawkins DM (1980) Identification of outliers. Chapman and Hall, London

  8. Huang GB, Chen L, Siew CK (2006) Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw 17(4):879–892

  9. Huang GB, Zhou H, Ding X, Zhang R (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B 42(2):513–529

  10. Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1):489–501

  11. Huang GB, Wang DH, Lan Y (2011) Extreme learning machines: a survey. Int J Mach Learn Cybern 2(2):107–122

  12. Demšar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30

  13. John GH, Langley P (1995) Estimating continuous distributions in Bayesian classifiers. In: Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, pp 338–345

  14. Jones MC, Marron JS, Sheather SJ (1996) A brief survey of bandwidth selection for density estimation. J Am Stat Assoc 91(433):401–407

  15. Khamis A, Ismail Z, Haron K (2005) The effects of outliers data on neural network performance. J Appl Sci 5:1394–1398

  16. Liano K (1996) Robust error measure for supervised neural network learning with outliers. IEEE Trans Neural Netw 7(1):246–250

  17. Liu X, Gao C, Li P (2012) A comparative analysis of support vector machines and extreme learning machines. Neural Netw 33:58–66

  18. Mirza B, Lin ZP, Toh KA (2013) Weighted online sequential extreme learning machine for class imbalance learning. Neural Process Lett 38(3):465–486

  19. Parzen E (1962) On estimation of a probability density function and mode. Ann Math Stat 33(3):1065–1076

  20. Samet S, Miri A (2012) Privacy-preserving back-propagation and extreme learning machine algorithms. Data Knowl Eng 79:40–61

  21. Toh KA (2008) Deterministic neural classification. Neural Comput 20(6):1565–1595

  22. Wang XZ, Shao QY, Miao Q, Zhai JH (2013) Architecture selection for networks trained with extreme learning machine using localized generalization error model. Neurocomputing 102:3–9

  23. Wang XZ, He YL, Wang DD (2014) Non-naive Bayesian classifiers for classification problems with continuous attributes. IEEE Trans Cybern 44(1):21–39

  24. Xu Y, Dong ZY, Zhao JH, Zhang P, Wong KP (2012) A reliable intelligent system for real-time dynamic security assessment of power systems. IEEE Trans Power Syst 27(3):1253–1263

  25. Zhang WB, Ji HB (2013) Fuzzy extreme learning machine for classification. Electron Lett 49(7):448–450

  26. Zhao G, Shen Z, Miao C, Man Z (2009) On improving the conditioning of extreme learning machine: a linear case. In: Proceedings of the International Conference on Information, Communications and Signal Processing, pp 1–5

  27. Zhu QY, Qin AK, Suganthan PN, Huang GB (2005) Evolutionary extreme learning machine. Pattern Recogn 38(10):1759–1763

  28. Zong W, Huang GB, Chen Y (2013) Weighted extreme learning machine for imbalance learning. Neurocomputing 101:229–242

Acknowledgments

The authors are very grateful to the editors and the anonymous reviewers, whose many valuable and constructive comments and suggestions helped us significantly improve this work.

Author information

Corresponding author

Correspondence to Jing Wang.

About this article

Cite this article

Wang, J., Zhang, L., Cao, Jj. et al. NBWELM: naive Bayesian based weighted extreme learning machine. Int. J. Mach. Learn. & Cyber. 9, 21–35 (2018). https://doi.org/10.1007/s13042-014-0318-1

