Automatic Kernel Construction During the Neural Network Learning by Modified Fast Singular Value Decomposition

  • Conference paper
  • First Online:
Computational Science – ICCS 2024 (ICCS 2024)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 14834))


Abstract

Thanks to its broad range of applications, learning neural networks remains a significant problem. Any attempt to construct faster learning algorithms is therefore highly welcome. This article presents a new way of learning kernel neural networks with modified pseudo-inverse learning based on a modified SVD.

The new algorithm constructs the kernels during learning and estimates their appropriate number as part of the result. There is no longer a need to fix the number of kernels before learning. This means there is no need to test networks with an excessive number of kernels, and the number of kernels is no longer a parameter in the model selection process (in cross-validation).

The results show that the proposed algorithm constructs reasonable kernel bases and that the final neural networks are accurate classifiers.
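The modified fast SVD itself is detailed in the paper. As a rough sketch of the general idea (pseudo-inverse learning of a kernel layer, with the effective number of kernels read off from the singular-value spectrum), a minimal Python/NumPy fragment is given below; the Gaussian kernel, the data-drawn candidate centers, and the singular-value cutoff are illustrative assumptions, not the authors' algorithm.

    import numpy as np

    def rbf_features(X, centers, gamma=1.0):
        # Gaussian kernel activations: H[i, j] = exp(-gamma * ||x_i - c_j||^2)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def fit_kernel_network(X, Y, n_candidates=200, gamma=1.0, tol=1e-3, seed=0):
        # Pseudo-inverse learning of the output layer of a kernel (RBF) network.
        # Candidate kernels are drawn from the training data; the SVD of the hidden
        # activation matrix is truncated where singular values drop below tol * s_max,
        # which serves here as the estimate of how many kernels are actually needed
        # (an illustrative heuristic, not the paper's construction).
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=min(n_candidates, len(X)), replace=False)
        centers = X[idx]
        H = rbf_features(X, centers, gamma)

        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        r = int(np.sum(s > tol * s[0]))        # estimated effective number of kernels
        H_pinv = Vt[:r].T @ np.diag(1.0 / s[:r]) @ U[:, :r].T
        W = H_pinv @ Y                         # output weights via pseudo-inverse
        return centers, W, r

    # Toy usage: two-class problem with one-hot targets.
    X = np.random.randn(300, 4)
    Y = np.eye(2)[(X[:, 0] > 0).astype(int)]
    centers, W, r = fit_kernel_network(X, Y)
    pred = rbf_features(X, centers) @ W
    print("estimated kernels:", r, "accuracy:", (pred.argmax(1) == Y.argmax(1)).mean())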


Notes

  1. We use the paired t-test to test the statistical significance of differences.
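As a reference for how such a test is typically computed, a short sketch with hypothetical per-fold accuracies (using scipy.stats.ttest_rel) follows:

    from scipy.stats import ttest_rel

    # Hypothetical per-fold accuracies of two models evaluated on the same splits.
    acc_a = [0.91, 0.88, 0.93, 0.90, 0.92]
    acc_b = [0.89, 0.87, 0.90, 0.88, 0.91]

    t_stat, p_value = ttest_rel(acc_a, acc_b)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # difference significant if p < chosen alpha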


Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Norbert Jankowski.

Editor information

Editors and Affiliations


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Jankowski, N., Dudek, G. (2024). Automatic Kernel Construction During the Neural Network Learning by Modified Fast Singular Value Decomposition. In: Franco, L., de Mulatier, C., Paszynski, M., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M.A. (eds) Computational Science – ICCS 2024. ICCS 2024. Lecture Notes in Computer Science, vol 14834. Springer, Cham. https://doi.org/10.1007/978-3-031-63759-9_25


  • DOI: https://doi.org/10.1007/978-3-031-63759-9_25

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-63758-2

  • Online ISBN: 978-3-031-63759-9

  • eBook Packages: Computer Science, Computer Science (R0)
