
Robust Fisher-regularized extreme learning machine with asymmetric Welsch-induced loss function for classification

Published in: Applied Intelligence

Abstract

Building a robust classifier for data sets contaminated with noise or outliers is a challenging problem, and it becomes harder still when the noise distribution is asymmetric. The Fisher-regularized extreme learning machine (Fisher-ELM) exploits the statistical knowledge of the data, but it ignores the impact of noise and outliers. In this paper, to reduce the negative influence of noise and outliers, we first propose a novel asymmetric Welsch loss function, named the AW-loss, built from the asymmetric \(L_{2}\)-loss function and the Welsch loss function. Based on the AW-loss, we then present a new robust Fisher-ELM called AWFisher-ELM. The proposed AWFisher-ELM not only takes into account the statistical information of the data but also accounts for asymmetrically distributed noise. We use the concave-convex procedure (CCCP) and a dual method to handle the non-convexity of AWFisher-ELM. We also give an algorithm for AWFisher-ELM and prove a theorem on its convergence. To validate the effectiveness of our algorithm, we compare AWFisher-ELM with other state-of-the-art methods on artificial data sets, UCI data sets, large-scale NDC data sets, and image data sets under different noise ratios. The experimental results are as follows: on the artificial data sets, AWFisher-ELM achieves the highest accuracy, reaching 98.9%; on the large-scale NDC data sets and the image data sets, its accuracy is also the highest; and on the ten UCI data sets, AWFisher-ELM attains the highest accuracy and \(F_{1}\) value on most data sets except Diabetes. In terms of training time, AWFisher-ELM takes about as long as RHELM and CHELM but longer than OPT-ELM, WCS-SVM, Fisher-SVM, Pinball-FisherSVM, and Fisher-ELM, because AWFisher-ELM, RHELM, and CHELM must solve a convex quadratic programming subproblem at each iteration. In conclusion, our method exhibits excellent generalization performance at the cost of longer training time.
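The exact form of the AW-loss is not reproduced on this page, but the idea behind an asymmetric Welsch-type loss can be sketched as follows. This is an illustrative sketch, not the paper's formulation: it uses the classical Welsch loss \((\sigma^{2}/2)(1-e^{-r^{2}/\sigma^{2}})\) together with a hypothetical pair of scale parameters, `sigma_pos` and `sigma_neg`, to penalize positive and negative residuals differently.

```python
import numpy as np

def welsch(r, sigma):
    """Classical Welsch loss: smooth, non-convex, and bounded above
    by sigma**2 / 2, so large residuals (outliers) have limited influence."""
    return (sigma**2 / 2.0) * (1.0 - np.exp(-(r**2) / sigma**2))

def asymmetric_welsch(r, sigma_pos=1.0, sigma_neg=2.0):
    """Hypothetical asymmetric variant: separate scale parameters for
    positive and negative residuals, so noise that is skewed to one
    side is penalized differently on each side."""
    return np.where(r >= 0.0, welsch(r, sigma_pos), welsch(r, sigma_neg))

if __name__ == "__main__":
    r = np.linspace(-5.0, 5.0, 11)
    loss = asymmetric_welsch(r)
    # The loss stays bounded even for very large residuals.
    print(loss.max() <= 2.0**2 / 2.0)
```

Because the resulting loss is non-convex, an approach such as CCCP, as used in the paper, would decompose it into a difference of convex functions and solve a convex subproblem at each iteration.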


Data Availability and Access

The UCI Machine Learning Repository is available at http://archive.ics.uci.edu/ml/datasets.php. The NDC data sets are available at http://www.cs.wisc.edu/musicant/data/ndc. The image data are available at http://www.cad.zju.edu.cn/home/dengcai/Data/FaceData.html.



Acknowledgements

The authors gratefully acknowledge the financial support of the National Natural Science Youth Foundation of China (No. 61907012), the Construction Project of First-Class Disciplines in Ningxia Higher Education (NXYLXK2017B09), the Postgraduate Innovation Project of North Minzu University (YCX23078), the National Natural Science Foundation of China (No. 62366001), the Fundamental Research Funds for the Central Universities (2021KYQD23), the Key Research and Development Program of Ningxia (2022BSB03046), and the Natural Science Foundation of Ningxia Province (2023AAC02053).

Author information

Authors and Affiliations

Authors

Contributions

Z.X.: conceptualization, methodology, validation, investigation, project administration, writing the original draft. C.Z.: methodology, software, validation, formal analysis, data curation, writing the original draft. S.W.: assistance with the organizational structure and language polishing of the revised manuscript. J.M.: assistance with the NDC large-scale data experiments and the image data experiments in the revised manuscript. S.L.: assistance with the parameter sensitivity experiments in the revised manuscript.

Corresponding author

Correspondence to Zhenxia Xue.

Ethics declarations

Competing Interests

The authors declare no conflict of interest.

Ethical standard

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Xue, Z., Zhao, C., Wei, S. et al. Robust Fisher-regularized extreme learning machine with asymmetric Welsch-induced loss function for classification. Appl Intell 54, 7352–7376 (2024). https://doi.org/10.1007/s10489-024-05528-5
