Abstract
VSF-Network (Vibration Synchronizing Function Network) is a hybrid neural network that combines a chaos neural network (CNN) with a hierarchical network; it is a neural network model that learns symbols. This paper describes the two theoretical foundations of VSF-Network: the first is incremental learning by the CNN, and the second is ensemble learning. VSF-Network detects unknown parts of input data by comparing them with learned patterns, and it learns those unknown parts using the unused part of the network. Ensemble learning explains VSF-Network's capability to recognize combined patterns, each of whose components has been learned by a sub-network of VSF-Network. Through experiments, we show that VSF-Network can recognize combined patterns provided it has learned parts of those patterns, and we identify factors that affect learning performance.
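The mechanism sketched in the abstract, detecting novel input by comparison with learned patterns and committing only unused units to it, can be illustrated schematically. The following is a minimal sketch, not the paper's actual update rule: the network, the cosine-similarity novelty test, the `threshold` value, and the Hebbian-style pull toward the input are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch of the selective-update idea: a single layer whose
# hidden units are split into "used" (already committed) and "unused"
# pools.  When an input matches no learned pattern well enough, only one
# unused unit's weights are updated, so learned patterns stay intact.
n_in, n_hidden = 4, 6
W = rng.normal(scale=0.1, size=(n_hidden, n_in))  # hidden-unit weights
used = np.zeros(n_hidden, dtype=bool)             # no unit committed yet
threshold = 0.9                                   # novelty threshold (assumed)

def respond(x):
    """Cosine similarity of each hidden unit's weight vector to x."""
    norms = np.linalg.norm(W, axis=1) * np.linalg.norm(x) + 1e-12
    return (W @ x) / norms

def learn(x, lr=0.5, steps=20):
    """Commit one unused unit to x only if no used unit recognizes it."""
    r = respond(x)
    if used.any() and r[used].max() >= threshold:
        return False                 # known pattern: no weights change
    free = np.flatnonzero(~used)
    if free.size == 0:
        return False                 # capacity exhausted
    j = free[r[free].argmax()]       # best-matching unused unit
    for _ in range(steps):           # Hebbian-style pull toward x
        W[j] += lr * (x - W[j])
    used[j] = True
    return True

a = np.array([1.0, 0.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0, 0.0])
learn(a)                 # novel: one unused unit is recruited
learn(b)                 # novel: a second unit is recruited
print(learn(a))          # already recognized, so no update occurs
```

Under this toy scheme, re-presenting a learned pattern leaves all weights unchanged, which is the incremental-learning property the abstract attributes to VSF-Network; the real model realizes it through chaotic dynamics rather than an explicit threshold test.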
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Kakemoto, Y., Nakasuka, S. (2012). Selective Weight Update Rule for Hybrid Neural Network. In: Wang, J., Yen, G.G., Polycarpou, M.M. (eds) Advances in Neural Networks – ISNN 2012. ISNN 2012. Lecture Notes in Computer Science, vol 7367. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31346-2_56
Print ISBN: 978-3-642-31345-5
Online ISBN: 978-3-642-31346-2