Abstract
Extreme learning machine (ELM) offers fast learning speed and good generalization performance, and it provides a unified learning framework with a wide range of feature mappings that can be applied directly to multiclass classification. These advantages have made ELM one of the most widely used classification algorithms, and it has attracted great attention in both supervised and semi-supervised learning. However, real-world data often contain noise and outliers, which degrade the performance of ELM. To improve the robustness and classification performance of ELM, we propose the kernel risk-sensitive mean p-power loss based hyper-graph regularized robust extreme learning machine (KRP-HRELM). On the one hand, as a nonlinear similarity measure defined in the reproducing kernel space, the kernel risk-sensitive mean p-power loss (KRP) can effectively weaken the negative effects of noise and outliers, so the KRP is introduced into ELM to enhance its robustness. A hyper-graph regularizer then helps ELM explore higher-order geometric relationships among multiple samples, thereby capturing more comprehensive data information. In addition, the L2,1-norm is imposed on the output weights to obtain a sparser network model. On the other hand, improving the practical applicability of KRP-HRELM is also a focus of this work, so it is extended to semi-supervised learning, yielding the semi-supervised KRP-HRELM (SS-KRP-HRELM). Robustness experiments show that the proposed methods are highly robust, and evaluation with four measures (accuracy, recall, precision, and F1-measure) shows that they achieve better classification performance than other state-of-the-art methods.
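For orientation, the optimization described in the abstract can be sketched schematically as a robust KRP data-fitting term plus an L2,1-norm penalty on the output weights and a hyper-graph Laplacian regularizer. This is a minimal sketch based only on the components named above, not the paper's exact formulation: the symbols (output weights β, hidden-layer output matrix H with rows h(x_i), targets t_i, hyper-graph Laplacian L_hyp) and the trade-off parameters γ and μ are illustrative assumptions, and the precise definition of the KRP loss ℓ_KRP (built from a Gaussian kernel similarity κ_σ in the reproducing kernel space) is given in the cited KRP literature.

$$
\min_{\boldsymbol{\beta}}\;\sum_{i=1}^{N}\ell_{\mathrm{KRP}}\!\big(\mathbf{t}_i-\mathbf{h}(\mathbf{x}_i)\boldsymbol{\beta}\big)\;+\;\gamma\,\|\boldsymbol{\beta}\|_{2,1}\;+\;\mu\,\operatorname{Tr}\!\big(\boldsymbol{\beta}^{\top}\mathbf{H}^{\top}\mathbf{L}_{\mathrm{hyp}}\mathbf{H}\boldsymbol{\beta}\big)
$$

Under this reading, the L2,1-norm drives rows of β toward zero (a sparser network), while the trace term keeps the ELM embedding Hβ smooth over the hyper-graph built from the samples; in the semi-supervised extension the hyper-graph regularizer would presumably be evaluated over both labeled and unlabeled samples.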






Funding
This work was supported in part by a grant from the National Natural Science Foundation of China (No. 61872220).
Author information
Contributions
ZXN and JXL contributed to the design of the study. ZXN proposed the KRP-HRELM and SS-KRP-HRELM methods, performed the experiments, and drafted the manuscript. LRR and RZ contributed to the data analysis. JW and CNJ contributed to improving the writing of the manuscript. All authors read and approved the final manuscript.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Niu, ZX., Jiao, CN., Ren, LR. et al. Kernel risk-sensitive mean p-power loss based hyper-graph regularized robust extreme learning machine and its semi-supervised extension for sample classification. Appl Intell 52, 8572–8587 (2022). https://doi.org/10.1007/s10489-021-02852-y