
Extended nonlinear Hebbian learning for developing sparse-distributed representation

  • Plasticity Phenomena (Maturing, Learning & Memory)
  • Conference paper
Foundations and Tools for Neural Modeling (IWANN 1999)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 1606))


Abstract

Recently, Hebbian learning has been extended to nonlinear units, yielding a number of interesting properties and potential applications, e.g., blind signal separation. However, when generalizing these nonlinear Hebbian learning algorithms to a network with multiple units, all existing methods assume orthonormality constraints, which are too restrictive in many situations. In this paper, we propose two alternative approaches for generalizing nonlinear Hebbian learning to a network of M neurons, based on the mixture-of-experts paradigm. Preliminary simulations show interesting results.
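The single-unit starting point that the abstract generalizes can be sketched as follows. This is an illustrative sketch, not code from the paper: it implements one common nonlinear PCA-type Hebbian rule, Δw = η·g(y)·(x − g(y)·w) with g = tanh, whose built-in decay term keeps ‖w‖ bounded for a single unit without an explicit orthonormalization step. The function name and toy data are hypothetical.

```python
import numpy as np

def nonlinear_hebbian_step(w, x, eta=0.01, g=np.tanh):
    """One update of a single-unit nonlinear Hebbian rule.

    Nonlinear PCA-type form: dw = eta * g(y) * (x - g(y) * w).
    The decay term -g(y)^2 * w bounds the weight norm, so no
    separate normalization is needed for a single unit.
    """
    y = w @ x                       # unit activation
    return w + eta * g(y) * (x - g(y) * w)

rng = np.random.default_rng(0)

# Toy data: one dominant direction d plus isotropic noise.
d = np.array([1.0, 0.5, -0.25])
d /= np.linalg.norm(d)
X = 0.1 * rng.normal(size=(5000, 3)) + rng.normal(size=(5000, 1)) * d

w = rng.normal(size=3)
w /= np.linalg.norm(w)
for x in X:
    w = nonlinear_hebbian_step(w, x)

# The learned weight should align (up to sign) with d.
print(abs(w @ d) / np.linalg.norm(w))
```

Extending this rule to M units is where the constraint the abstract criticizes appears: existing multi-unit schemes keep the weight vectors orthonormal, whereas the paper's mixture-of-experts alternatives let a gating mechanism assign inputs to units instead.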




Editor information

José Mira, Juan V. Sánchez-Andrés


Copyright information

© 1999 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, B.L., Gedeon, T.D. (1999). Extended nonlinear Hebbian learning for developing sparse-distributed representation. In: Mira, J., Sánchez-Andrés, J.V. (eds) Foundations and Tools for Neural Modeling. IWANN 1999. Lecture Notes in Computer Science, vol 1606. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0098201


  • DOI: https://doi.org/10.1007/BFb0098201


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-66069-9

  • Online ISBN: 978-3-540-48771-5

  • eBook Packages: Springer Book Archive
