
Selective Weight Update for Neural Network – Its Backgrounds

  • Conference paper
Active Media Technology (AMT 2013)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8210)


Abstract

VSF-Network (Vibration Synchronizing Function Network) is a hybrid neural network that combines a Chaos Neural Network with a hierarchical neural network, and it is designed for symbol learning. VSF-Network detects unknown parts of input data by comparing them with stored patterns, and it learns the unknown patterns using the unused part of the network. New patterns are learned incrementally and stored as sub-networks; combinations of patterns are represented as combinations of the sub-networks. In this paper, two theoretical backgrounds of VSF-Network are introduced. First, an incremental learning framework based on Chaos Neural Networks is presented. Next, the recognition of patterns combined with symbols is introduced, and this combined pattern recognition by VSF-Network is explained from the viewpoints of differential topology and mixture distributions. An experiment demonstrates both the incremental learning capability and the recognition of combined patterns.
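The core idea in the abstract — detect novelty against stored patterns, then learn the new pattern using only the unused part of the network so that earlier sub-networks are preserved — can be illustrated with a minimal sketch. This is not the authors' implementation; the network, the novelty test, and all names (`claimed`, `learn`, `is_known`) are invented here purely to show the selective-update principle: weight rows already claimed by a stored pattern are never touched again.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer network. Hidden units are either "claimed" by a stored
# pattern or still free; learning a novel input updates only the free
# units' weights, so previously learned sub-networks stay intact.
n_in, n_hidden = 8, 6
W = rng.normal(scale=0.1, size=(n_hidden, n_in))
claimed = np.zeros(n_hidden, dtype=bool)  # which hidden units hold a pattern
stored = []                               # prototypes of learned patterns

def is_known(x, tol=0.5):
    """Novelty test: is x close to any stored prototype?"""
    return any(np.linalg.norm(x - p) < tol for p in stored)

def learn(x, n_units=2, lr=0.5, epochs=20):
    """Hebbian-style update restricted to free hidden units — the sketch's
    stand-in for a selective weight update."""
    if is_known(x):
        return
    free = np.flatnonzero(~claimed)[:n_units]
    for _ in range(epochs):
        h = np.tanh(W[free] @ x)
        W[free] += lr * np.outer(h, x)    # only free units' rows change
    claimed[free] = True
    stored.append(x.copy())

a = np.array([1.0, 0, 1, 0, 1, 0, 1, 0])
b = np.array([0.0, 1, 0, 1, 0, 1, 0, 1])

learn(a)
W_after_a = W.copy()
learn(b)

# The rows claimed while learning a are untouched by learning b:
assert np.allclose(W[:2], W_after_a[:2])
```

In this toy version, incremental capacity is simply the pool of still-free units; VSF-Network instead uses chaotic dynamics to identify which part of the network is unused, but the invariant shown by the final assertion — new learning does not overwrite existing sub-networks — is the same.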






Copyright information

© 2013 Springer International Publishing Switzerland

About this paper

Cite this paper

Kakemoto, Y., Nakasuka, S. (2013). Selective Weight Update for Neural Network – Its Backgrounds. In: Yoshida, T., Kou, G., Skowron, A., Cao, J., Hacid, H., Zhong, N. (eds) Active Media Technology. AMT 2013. Lecture Notes in Computer Science, vol 8210. Springer, Cham. https://doi.org/10.1007/978-3-319-02750-0_12


  • DOI: https://doi.org/10.1007/978-3-319-02750-0_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-02749-4

  • Online ISBN: 978-3-319-02750-0

  • eBook Packages: Computer Science (R0)
