Abstract
We study a novel recurrent network architecture whose dynamics are those of the iterative function systems used in chaos game representations of DNA sequences [16, 11]. We show that such networks encode the temporal and statistical structure of input sequences in a strict mathematical sense: generalized dimensions of the network states are in direct correspondence with statistical properties of the input sequences, expressed via generalized Rényi entropy spectra. We also argue, and experimentally illustrate, that the commonly used heuristic of finite state machine extraction by network state space quantization corresponds in this case to variable memory length Markov model construction.
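To make the dynamics referred to above concrete, here is a minimal illustrative sketch (in Python, not taken from the paper) of a chaos game representation of a DNA sequence. Each symbol s drives the affine IFS map x_{n+1} = (x_n + c_s)/2, where c_s is the corner of the unit square assigned to nucleotide s; the corner assignment and contraction ratio 1/2 follow the standard CGR convention.

```python
# Chaos game representation (CGR) of a DNA sequence via an
# iterative function system on the unit square.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}


def cgr(sequence, start=(0.5, 0.5)):
    """Return the list of CGR points visited while reading the sequence."""
    x, y = start
    points = []
    for s in sequence:
        cx, cy = CORNERS[s]
        # Affine contraction toward the corner of symbol s (ratio 1/2).
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points


pts = cgr("ACGT")
```

Distinct subsequences of length n land in distinct sub-squares of side 2^{-n}, which is why the fractal scaling of the resulting point set reflects the statistical structure of the symbol source.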
References
Barnsley, M.F.: Fractals everywhere. Academic Press, New York (1988)
Beck, C., Schlögl, F.: Thermodynamics of chaotic systems. Cambridge University Press, Cambridge (1995)
Bruske, J., Sommer, G.: Dynamic cell structure learns perfectly topology preserving map. Neural Computation 7(4), 845–865 (1995)
Casey, M.P.: The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction. Neural Computation 8(6), 1135–1178 (1996)
Crutchfield, J.P., Young, K.: Inferring statistical complexity. Physical Review Letters 63, 105–108 (1989)
Crutchfield, J.P., Young, K.: Computation at the onset of chaos. In: Zurek, W.H. (ed.) Complexity, Entropy, and the physics of Information. SFI Studies in the Sciences of Complexity, vol. 8, pp. 223–269. Addison-Wesley, Reading (1990)
Frasconi, P., Gori, M., Maggini, M., Soda, G.: Insertion of finite state automata in recurrent radial basis function networks. Machine Learning 23, 5–32 (1996)
Freund, J., Ebeling, W., Rateitschak, K.: Self-similar sequences and universal scaling of dynamical entropies. Physical Review E 54(5), 5561–5566 (1996)
Grassberger, P.: Information and complexity measures in dynamical systems. In: Atmanspacher, H., Scheingraber, H. (eds.) Information Dynamics, pp. 15–33. Plenum Press, New York (1991)
Hertz, J., Krogh, A., Palmer, R.G.: Introduction to the Theory of Neural Computation. Addison–Wesley, Redwood City (1991)
Jeffrey, J.: Chaos game representation of gene structure. Nucleic Acids Research 18(8), 2163–2170 (1990)
Kenyon, R., Peres, Y.: Measures of full dimension on affine invariant sets. Ergodic Theory and Dynamical Systems 16, 307–323 (1996)
Kolen, J.F.: Recurrent networks: state machines or iterated function systems? In: Mozer, M.C., Smolensky, P., Touretzky, D.S., Elman, J.L., Weigend, A.S. (eds.) Proceedings of the 1993 Connectionist Models Summer School, pp. 203–210. Erlbaum Associates, Hillsdale (1994)
Manolios, P., Fanelli, R.: First order recurrent neural networks and deterministic finite state automata. Neural Computation 6(6), 1155–1173 (1994)
McCauley, J.L.: Chaos, Dynamics and Fractals: an algorithmic approach to deterministic chaos. Cambridge University Press, Cambridge (1994)
Tiňo, P.: Spatial representation of symbolic sequences through iterative function system. IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans 29(4), 386–393 (1999)
Tiňo, P., Dorffner, G.: Recurrent neural networks with iterated function systems dynamics. In: International ICSC/IFAC Symposium on Neural Computation, pp. 526–532 (1998)
Tiňo, P., Horne, B.G., Giles, C.L., Collingwood, P.C.: Finite state machines and recurrent neural networks – automata and dynamical systems approaches. In: Dayhoff, J.E., Omidvar, O. (eds.) Neural Networks and Pattern Recognition, pp. 171–220. Academic Press, London (1998)
Tiňo, P., Koteles, M.: Extracting Finite state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE Transactions on Neural Networks 10(2), 284–302 (1999)
Tiňo, P., Sajda, J.: Learning and extracting initial mealy machines with a modular neural network model. Neural Computation 7(4), 822–844 (1995)
Tiňo, P., Vojtek, V.: Modeling complex sequences with recurrent neural networks. In: Smith, G.D., Steele, N.C., Albrecht, R.F. (eds.) Artificial Neural Networks and Genetic Algorithms, pp. 459–463. Springer, New York (1998)
Oliver, J.L., Bernaola-Galván, P., Guerrero-García, J., Román-Roldán, R.: Entropic profiles of DNA sequences through chaos-game-derived images. Journal of Theoretical Biology 160, 457–470 (1993)
Omlin, C.W., Giles, C.L.: Extraction of rules from discrete-time recurrent neural networks. Neural Networks 9(1), 41–51 (1996)
Rényi, A.: On the dimension and entropy of probability distributions. Acta Math. Hung. 10, 193 (1959)
Román-Roldán, R., Bernaola-Galván, P., Oliver, J.L.: Entropic feature for sequence pattern through iteration function systems. Pattern Recognition Letters 15, 567–573 (1994)
Ron, D., Singer, Y., Tishby, N.: The power of amnesia. In: Advances in Neural Information Processing Systems, pp. 176–183. Morgan Kaufmann, San Francisco (1994)
Ron, D., Singer, Y., Tishby, N.: The power of amnesia. Machine Learning 25, 117–150 (1996)
Tabor, W.: Dynamical automata. Technical Report TR98-1694, Cornell University, Computer Science Department (1998)
© 2000 Springer-Verlag Berlin Heidelberg
Tiňo, P., Dorffner, G., Schittenkopf, C. (2000). Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics. In: Wermter, S., Sun, R. (eds) Hybrid Neural Systems. Hybrid Neural Systems 1998. Lecture Notes in Computer Science, vol. 1778. Springer, Berlin, Heidelberg. https://doi.org/10.1007/10719871_18
Print ISBN: 978-3-540-67305-7
Online ISBN: 978-3-540-46417-4