Abstract
Successful training of artificial neural networks depends primarily on the network architecture used and on an algorithm capable of training that network. During training, the error for many patterns drops to a low level very quickly, while for other patterns it remains relatively high. In this situation the already trained patterns prevent further adjustment of the trainable network parameters, and the overall training error cannot reach the desired level. The paper proposes a soft pattern reduction mechanism that reduces the impact of already trained patterns, which helps to obtain better results for all training patterns. The suggested approach has been confirmed by several experiments.
This work was supported by the National Science Centre, Krakow, Poland, under grant No. 2015/17/B/ST6/01880.
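The abstract describes the mechanism only at a high level, so the following is a minimal sketch of the general idea of soft pattern reduction: instead of removing already trained patterns from the training set, each pattern's contribution to the weight update is scaled down once its error falls below a target level. The sigmoid-shaped weighting function, the eps and sharpness parameters, and the toy linear model are illustrative assumptions, not the formula or the RBF training algorithm used in the paper.

```python
import numpy as np

def soft_pattern_weights(errors, eps=0.01, sharpness=5.0):
    """Map per-pattern errors to weights in (0, 1].

    Patterns whose error has already dropped well below the target eps
    receive a small weight, so they stop dominating the updates; patterns
    that are still poorly trained keep a weight close to 1.
    (eps, sharpness and the sigmoid shape are illustrative choices,
    not taken from the paper.)
    """
    return 1.0 / (1.0 + np.exp(-sharpness * (errors - eps) / eps))

def train_epoch(X, y, w, lr=0.05):
    """One epoch of gradient descent on a toy linear model y ~ X @ w,
    with each pattern's gradient scaled by its soft-reduction weight."""
    pred = X @ w
    err = pred - y
    scale = soft_pattern_weights(np.abs(err))
    grad = X.T @ (scale * err) / len(y)
    return w - lr * grad, np.mean(err ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=200)
    w = np.zeros(3)
    for epoch in range(200):
        w, mse = train_epoch(X, y, w)
    print("final MSE:", mse)
```

With this weighting, the few patterns whose error is still high keep near-full influence on the update even after the majority of patterns have been learned, which is the effect the abstract attributes to soft pattern reduction.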
Cite this paper
Rozycki, P., Kolbusz, J., Lysenko, O., Wilamowski, B.M. (2018). Soft Patterns Reduction for RBF Network Performance Improvement. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J. (eds) Artificial Intelligence and Soft Computing. ICAISC 2018. Lecture Notes in Computer Science, vol. 10841. Springer, Cham. https://doi.org/10.1007/978-3-319-91253-0_19