Abstract
Thanks to its broad range of applications, learning in neural networks remains an important problem, and any faster learning algorithm is highly welcome. This article presents a new way of learning kernel-based neural networks using a modified pseudo-inverse learning scheme based on a modified SVD.
The new algorithm constructs the kernels during learning and estimates their appropriate number as part of the result. There is no longer a need to fix the number of kernels before learning. This means there is no need to test networks with too many kernels, and the number of kernels ceases to be a hyperparameter in the model selection process (cross-validation).
The results show that the proposed algorithm constructs reasonable kernel bases and that the final neural networks classify accurately.
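The abstract describes pseudo-inverse learning of a kernel network's output weights via a modified SVD, with the number of useful kernels estimated during learning. The paper's exact modification is not given here, so the following is only a minimal sketch of the general idea under assumed details: Gaussian kernel activations, a truncated-SVD pseudo-inverse, and a rank threshold on singular values as a stand-in for the kernel-count estimate. All function names (`rbf_features`, `fit_output_weights`) and parameters (`gamma`, `tol`) are hypothetical.

```python
import numpy as np

def rbf_features(X, centers, gamma=1.0):
    # Gaussian kernel activations between samples and kernel centers
    # (one assumed choice of kernel; the paper's kernels may differ).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_output_weights(H, Y, tol=1e-3):
    # Pseudo-inverse solution of H @ W = Y via truncated SVD.
    # Singular values below tol * s_max are dropped; the retained rank
    # serves here as a rough estimate of how many kernels carry signal.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    rank = int((s > tol * s[0]).sum())
    S_inv = np.diag(1.0 / s[:rank])
    W = Vt[:rank].T @ S_inv @ U[:, :rank].T @ Y
    return W, rank

# Toy usage: XOR-like two-class problem with randomly chosen centers.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)
centers = X[rng.choice(200, size=20, replace=False)]
H = rbf_features(X, centers, gamma=2.0)
W, rank = fit_output_weights(H, Y)
accuracy = float(((H @ W > 0.5) == (Y > 0.5)).mean())
```

Regularizing through the truncated rank, rather than fixing the kernel count up front, is what removes the kernel number from cross-validation: oversized candidate bases are pruned automatically by the spectrum of `H`.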
Notes
1. We use the paired t-test to test the significance of statistical differences.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Jankowski, N., Dudek, G. (2024). Automatic Kernel Construction During the Neural Network Learning by Modified Fast Singular Value Decomposition. In: Franco, L., de Mulatier, C., Paszynski, M., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M.A. (eds) Computational Science – ICCS 2024. ICCS 2024. Lecture Notes in Computer Science, vol 14834. Springer, Cham. https://doi.org/10.1007/978-3-031-63759-9_25
Print ISBN: 978-3-031-63758-2
Online ISBN: 978-3-031-63759-9