Abstract
The extreme learning machine (ELM) requires a large number of hidden-layer nodes during training, so the number of random parameters grows rapidly and degrades network stability. Moreover, relying on a single activation function limits the generalization capability of the network. This paper proposes a derived least square fast learning network (DLSFLN) to address these problems. DLSFLN obtains a variety of activation functions by repeatedly differentiating inherited base functions; this increases the diversity of activation functions and enhances the mapping capability of the hidden-layer neurons while keeping the dimension of the random parameters unchanged. DLSFLN randomly generates the input weights and hidden-layer thresholds and uses the least square method to determine the connection weights between the output layer and the input layer and those between the output layer and the hidden layer. Regression and classification experiments show that, compared with other neural network algorithms such as the fast learning network (FLN), DLSFLN trains faster and achieves better training accuracy, generalization capability, and stability.
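The core mechanics the abstract describes (random input weights and hidden thresholds, output weights solved in closed form by least squares) can be illustrated with a minimal ELM-style sketch. This is an assumption-laden illustration, not the authors' DLSFLN implementation: the full method additionally uses derived activation functions and direct input-to-output connections, and all function names here are hypothetical.

```python
import numpy as np

def elm_style_fit(X, T, n_hidden, seed=0):
    """Fit output weights by least squares with random hidden parameters
    (an ELM-style sketch of the scheme described in the abstract)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Randomly generated input weights and hidden-layer thresholds
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    # Sigmoid hidden-layer output matrix
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Output weights via the Moore-Penrose pseudoinverse (least squares)
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def elm_style_predict(X, W, b, beta):
    """Forward pass with the fitted parameters."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Because only `beta` is learned, and in closed form, training reduces to one matrix pseudoinverse; this is the source of the fast training speed that the FLN family inherits from ELM.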


Acknowledgements
This work was supported by the following grants: Major Program of the National Natural Science Foundation of China (No. 11790282); National Natural Science Foundation of China (Nos. 11702179 and 51605315); Young Top-Notch Talents Program of Higher School in Hebei Province (No. BJ2019035); Independent Project of the State Key Laboratory of Mechanical Behavior and System Safety of Traffic Engineering Structures (No. ZZ2020-42); Preferred Hebei Postdoctoral Research Project (No. B2019003017); S&T Program of Hebei (No. 20310803D); and the Postgraduate Innovation Funding Project of Shijiazhuang Tiedao University (No. YC2020030).
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Cite this article
Wang, M., Jia, S., Chen, E. et al. A derived least square fast learning network model. Appl Intell 50, 4176–4194 (2020). https://doi.org/10.1007/s10489-020-01773-6