Abstract
Hebbian learning has recently been extended to nonlinear units, yielding a number of interesting properties and potential applications, e.g., blind signal separation. However, all existing methods for generalizing these nonlinear Hebbian learning algorithms to a network with multiple units assume orthonormality constraints, which are too restrictive on many occasions. In this paper, we propose two alternative approaches, based on the mixture-of-experts paradigm, for generalizing nonlinear Hebbian learning to a network with M neurons. Preliminary simulations show interesting results.
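As a point of reference for the abstract, the following is a minimal sketch of the kind of single-unit nonlinear Hebbian rule (Oja-style, in the spirit of the nonlinear PCA literature the paper builds on) that the multi-unit generalizations extend. The function name, learning rate, nonlinearity, and toy data are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def nonlinear_hebbian_step(w, x, eta=0.01, g=np.tanh):
    """One update of a single nonlinear Hebbian unit (Oja-style sketch).

    y = g(w^T x); the subtractive y^2 * w term keeps ||w|| bounded,
    playing the role of the explicit normalization constraint.
    """
    y = g(w @ x)
    return w + eta * y * (x - y * w)

# Train one unit on toy 2-D data whose variance is much larger
# along the first axis, so w should align with that axis.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(5000):
    x = rng.normal(size=2) * np.array([3.0, 0.5])
    w = nonlinear_hebbian_step(w, x)

print(np.round(np.abs(w), 2))  # dominant component is along axis 0
```

Extending such a rule to M units is where the orthonormality constraints criticized in the abstract normally enter; the mixture-of-experts alternative instead lets a gating mechanism assign inputs to units.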
© 1999 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, B.L., Gedeon, T.D. (1999). Extended nonlinear Hebbian learning for developing sparse-distributed representation. In: Mira, J., Sánchez-Andrés, J.V. (eds) Foundations and Tools for Neural Modeling. IWANN 1999. Lecture Notes in Computer Science, vol 1606. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0098201
Print ISBN: 978-3-540-66069-9
Online ISBN: 978-3-540-48771-5