Soft Patterns Reduction for RBF Network Performance Improvement

  • Conference paper
  • First Online:
Artificial Intelligence and Soft Computing (ICAISC 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10841)

Abstract

Successful training of artificial neural networks depends primarily on the architecture used and on a suitable algorithm able to train the given network. During the training process, the error for many patterns reaches a low level very quickly, while for other patterns it remains relatively high. In this situation, the already trained patterns make it impossible to adjust all trainable network parameters, and the overall training error cannot reach the desired level. This paper proposes a soft pattern reduction mechanism that reduces the impact of already trained patterns, which helps achieve better results across all training patterns. The suggested approach has been confirmed by several experiments.
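The abstract describes the core idea only in outline: down-weight patterns whose error is already low so that they stop dominating parameter updates. The paper's exact weighting formula is not given here, so the following is a minimal illustrative sketch, not the authors' method — the toy RBF setup, all variable names, and the particular soft factor (squared error normalized by the current maximum squared error) are assumptions made for the example.

```python
import numpy as np

# Toy 1-D regression task: fit y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 40)
Y = np.sin(X)

# Small RBF network: fixed Gaussian centers and widths,
# trainable linear output weights W.
centers = np.linspace(-3, 3, 8)
width = 0.8
W = np.zeros(len(centers))

def phi(x):
    # Gaussian basis responses for a single input x.
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

Phi = np.vstack([phi(x) for x in X])  # shape (patterns, basis functions)

lr = 0.05
for epoch in range(300):
    err = Y - Phi @ W
    # Soft pattern reduction (illustrative form): a pattern whose error
    # is already small relative to the worst pattern gets a weight near 0,
    # so well-trained patterns stop dominating the gradient step, while
    # hard-to-train patterns keep full influence.
    e2 = err ** 2
    soft = e2 / e2.max() if e2.max() > 0 else np.ones_like(err)
    # Weighted LMS update of the output weights.
    W += lr * Phi.T @ (soft * err)

rmse = np.sqrt(np.mean((Phi @ W - Y) ** 2))
```

In this sketch only the linear output layer is trained; the paper targets full RBF network training, where the same per-pattern weighting would multiply each pattern's contribution to every parameter gradient.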

This work was supported by the National Science Centre, Krakow, Poland, under grant No. 2015/17/B/ST6/01880.



Author information

Corresponding author

Correspondence to Pawel Rozycki.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Rozycki, P., Kolbusz, J., Lysenko, O., Wilamowski, B.M. (2018). Soft Patterns Reduction for RBF Network Performance Improvement. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J. (eds) Artificial Intelligence and Soft Computing. ICAISC 2018. Lecture Notes in Computer Science, vol 10841. Springer, Cham. https://doi.org/10.1007/978-3-319-91253-0_19

  • DOI: https://doi.org/10.1007/978-3-319-91253-0_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-91252-3

  • Online ISBN: 978-3-319-91253-0

  • eBook Packages: Computer Science (R0)
