Abstract
In this paper, the auto-association problem is discussed using group-theoretical methods. By considering the symmetry group of a given set of test sequences, it is shown that a class of neural networks acting as auto-associators on this set can be constructed. It turns out that the symmetry of the network structure is already determined by the symmetries of the set of test sequences, indicating that learning a given set of patterns amounts to finding the invariant relations inherent in that set. Moreover, the main result makes it possible to construct all optimal network structures and hence to decide whether a solution found by a particular learning algorithm is optimal.
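The following is a minimal numerical sketch of the idea, not the paper's actual construction: if a set of test patterns is closed under a symmetry group (here, the cyclic-shift group acting on a base pattern), then averaging a Hebbian weight matrix over that group yields a network whose weights commute with every group element, i.e. the network structure inherits the symmetry of the test set. The pattern values, the group, and the Hebbian rule are all assumptions chosen for illustration.

```python
import numpy as np

n = 6
# Test set closed under cyclic shifts: all n shifts of one bipolar base pattern.
base = np.array([1, 1, -1, 1, -1, -1])
patterns = np.stack([np.roll(base, k) for k in range(n)])

# Permutation matrices for the cyclic group C_n (shift by k positions).
perms = [np.roll(np.eye(n), k, axis=0) for k in range(n)]

# Hebbian outer-product weights, then symmetrize over the group:
# W = (1/|G|) * sum_g P_g W0 P_g^T.
W0 = patterns.T @ patterns / len(patterns)
W = sum(P @ W0 @ P.T for P in perms) / len(perms)

# The averaged weight matrix commutes with every group element, so the
# network's structure is determined by the symmetries of the test set.
assert all(np.allclose(W @ P, P @ W) for P in perms)

# For this particular pattern set the network also acts as an
# auto-associator: one thresholded pass reproduces each stored pattern.
recalled = np.sign(patterns @ W.T)
assert np.array_equal(recalled, patterns)
```

Because the pattern set is invariant under cyclic shifts, the group-averaged matrix is circulant; verifying equivariance on one pattern then extends to all of its shifts for free, which is the flavor of the symmetry argument in the abstract.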
Cite this article
Reimann, S. Symmetry and Network Structure. Neural Processing Letters 6, 1–10 (1997). https://doi.org/10.1023/A:1009622405402