
Capacity and parasitic fixed points control in a recursive neural network

  • Formal Tools and Computational Models of Neurons and Neural Net Architectures
  • Conference paper
Biological and Artificial Computation: From Neuroscience to Technology (IWANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1240)


Abstract

This paper describes a new method for controlling the capacity of a Recursive Neural Network (RNN) and for reducing its number of parasitic fixed points. Building on preliminary research [1], a Recursive Neural Network may be viewed as a graph, and its matrix of weights W then exhibits certain properties for which it may be called a tetrahedral matrix [2]. The geometrical properties of this kind of matrix may be used to classify the n-dimensional state-vector space into n classes [2]. In the recall stage, a parameter vector σ may be introduced which is related to the capacity of the network [3]: it may be shown that the larger the value of the i-th component of σ, the higher the capacity of the i-th class of the state-vector space becomes [2]. Once the capacity has been controlled with the parameter σ, we introduce a new parameter that uses the statistical deviation of the prototypes, comparing it with that of the states appearing as fixed points and thereby eliminating a great number of parasitic fixed points.
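The paper itself gives no code, but the two mechanisms the abstract describes can be illustrated with a short NumPy sketch. This is a minimal stand-in under stated assumptions, not the authors' construction: a weighted Hebbian matrix replaces the tetrahedral matrix of [2], the vector sigma here weights individual prototypes rather than the n classes of the geometric partition, and is_parasitic is one hypothetical reading of the statistical-deviation criterion. All function names and parameters (sigma, tol) are illustrative.

import numpy as np

def weighted_hebbian(prototypes, sigma):
    # Weighted Hebbian storage: prototype k's outer product is scaled by
    # sigma[k], so a larger sigma[k] enlarges that prototype's basin of
    # attraction. This stands in for the capacity control the abstract
    # attributes to the vector sigma acting on the tetrahedral partition.
    P = np.asarray(prototypes, dtype=float)   # shape (m, n), entries +/-1
    n = P.shape[1]
    W = (P.T * sigma) @ P / n                 # sum_k sigma[k] * outer(mu_k, mu_k) / n
    np.fill_diagonal(W, 0.0)                  # no self-connections
    return W

def recall(W, x0, max_iters=100):
    # Synchronous sign-threshold recall, iterated until a fixed point.
    x = np.sign(np.asarray(x0, dtype=float))
    x[x == 0] = 1.0
    for _ in range(max_iters):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1.0               # break ties toward +1
        if np.array_equal(x_new, x):
            return x, True                    # converged to a fixed point
        x = x_new
    return x, False                           # no fixed point reached (e.g. a 2-cycle)

def is_parasitic(W, x, prototypes, tol=0.2):
    # Hypothetical deviation test: accept a fixed point only if the spread
    # (standard deviation) of its net input matches that of at least one
    # stored prototype; otherwise flag it as parasitic.
    dev = np.std(W @ x)
    proto_devs = np.std(np.asarray(prototypes, dtype=float) @ W.T, axis=1)
    return bool(np.all(np.abs(dev - proto_devs) > tol))

# Usage: store three random prototypes, boost the basin of the first one,
# and probe with a noisy copy of it.
rng = np.random.default_rng(0)
protos = rng.choice([-1.0, 1.0], size=(3, 64))
sigma = np.array([2.0, 1.0, 1.0])             # favour prototype 0
W = weighted_hebbian(protos, sigma)
noise = rng.choice([1.0, -1.0], size=64, p=[0.9, 0.1])
x, converged = recall(W, protos[0] * noise)
if converged and not is_parasitic(W, x, protos):
    print("accepted; overlap with prototype 0:", float(x @ protos[0]) / 64)

Raising a component of sigma widens one basin at the expense of the others, which mirrors the trade-off the abstract attributes to the per-class capacity control; the deviation filter then discards converged states whose statistics match no stored prototype.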


References

  1. V. Giménez, P. Gómez-Vilda, M. Pérez-Castellanos and V. Rodellar, A New Approach for Finding the Weights in a Neural Network Using Graphs, Proc. of the 36th Midwest Symposium on Circuits and Systems, Detroit, August 16–18, 1993, pp. 113–116.

  2. V. Giménez, E. Torrano, P. Gómez-Vilda and M. Pérez-Castellanos, A Class of Recursive Neural Networks Based on Analytic Geometry, Proc. of the International Conference on Brain Processes, Theories and Models, Canary Islands, Spain, November 12–17, 1995, pp. 330–339.

  3. V. Giménez, P. Gómez-Vilda, M. Pérez-Castellanos and E. Torrano, A New Approach for Improving the Capacity Limit on a Recursive Neural Network, Proc. of AMS'94, IASTED, Lugano, Switzerland, June 20–22, 1994, pp. 90–93.

  4. V. Giménez, P. Gómez-Vilda, E. Torrano and M. Pérez-Castellanos, A New Algorithm for Implementing a Recursive Neural Network, Proc. of IWANN'95, Málaga-Torremolinos, Spain, June 1995, pp. 252–259.

  5. V. Rodellar, P. Gómez, M. Hermida and R. W. Newcomb, An Auditory Neural System for Speech Processing and Recognition, Proc. of ICARCV'92, Singapore, September 16–18, 1992, pp. INV-6.2.1–5.

  6. Y. Kamp and M. Hasler, Recursive Neural Networks for Associative Memory, Wiley-Interscience Series in Systems and Optimization, England, 1990, pp. 10–34.



Editor information

José Mira, Roberto Moreno-Díaz, Joan Cabestany


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Giménez, V., Pérez-Castellanos, M., Rios Carrion, J., de Mingo, F. (1997). Capacity and parasitic fixed points control in a recursive neural network. In: Mira, J., Moreno-Díaz, R., Cabestany, J. (eds) Biological and Artificial Computation: From Neuroscience to Technology. IWANN 1997. Lecture Notes in Computer Science, vol 1240. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0032479


  • DOI: https://doi.org/10.1007/BFb0032479


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63047-0

  • Online ISBN: 978-3-540-69074-0

  • eBook Packages: Springer Book Archive
