Abstract
In this paper we propose a genetic model that optimizes both the network architecture and the connection weights of discrete-time recurrent neural networks within a single evolutionary process. Empirical studies show that our model can efficiently generate an appropriate network size and topology for small applications. We report two experiments: a parity function and a finite-state machine for sequence detection.
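The abstract does not describe the genome encoding or the genetic operators. The sketch below is only a minimal illustration of the general idea of evolving topology and weights together, assuming a fixed upper bound on network size, fully recurrent logistic units, a genome made of a connectivity mask plus a weight matrix, a sequential parity task as the fitness measure, and mutation-only reproduction; none of these choices, nor the names N_NEURONS, run_network, fitness, or mutate, come from the paper.

```python
# Illustrative sketch (not the authors' method): a genetic algorithm that
# evolves both the connectivity mask (topology) and the weights of a small
# discrete-time recurrent network on a sequential parity task.
import random
import math

N_NEURONS = 4        # assumed upper bound on network size
SEQ_LEN = 8          # length of the binary input sequences
POP_SIZE = 30
GENERATIONS = 100
MUT_RATE = 0.1

def random_genome():
    """Genome = connectivity mask + weight matrix; last column is the external input."""
    mask = [[random.random() < 0.5 for _ in range(N_NEURONS + 1)] for _ in range(N_NEURONS)]
    weights = [[random.uniform(-1.0, 1.0) for _ in range(N_NEURONS + 1)] for _ in range(N_NEURONS)]
    return mask, weights

def run_network(genome, inputs):
    """Iterate the discrete-time recurrent network; neuron 0 is read as the output."""
    mask, weights = genome
    state = [0.0] * N_NEURONS
    for x in inputs:
        new_state = []
        for i in range(N_NEURONS):
            total = sum(w * s for w, s, m in
                        zip(weights[i][:N_NEURONS], state, mask[i][:N_NEURONS]) if m)
            if mask[i][N_NEURONS]:
                total += weights[i][N_NEURONS] * x
            new_state.append(1.0 / (1.0 + math.exp(-total)))  # logistic activation
        state = new_state
    return state[0]

def fitness(genome):
    """Accuracy on random parity sequences, minus a small penalty per connection."""
    score = 0
    for _ in range(20):
        bits = [random.randint(0, 1) for _ in range(SEQ_LEN)]
        target = sum(bits) % 2
        out = 1 if run_network(genome, bits) > 0.5 else 0
        score += (out == target)
    mask, _ = genome
    n_conn = sum(sum(row) for row in mask)
    return score - 0.01 * n_conn

def mutate(genome):
    """Structural mutation toggles connections; weight mutation adds Gaussian noise."""
    mask, weights = genome
    mask = [row[:] for row in mask]
    weights = [row[:] for row in weights]
    for i in range(N_NEURONS):
        for j in range(N_NEURONS + 1):
            if random.random() < MUT_RATE:
                mask[i][j] = not mask[i][j]
            if random.random() < MUT_RATE:
                weights[i][j] += random.gauss(0, 0.3)
    return mask, weights

population = [random_genome() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - len(survivors))]

best = max(population, key=fitness)
print("best fitness:", fitness(best))
```

Evolving the mask together with the weights lets selection prune superfluous connections (here through the small per-connection penalty in the fitness), which is one way an evolutionary process can settle on an appropriately small topology.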
Copyright information
© 1993 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Marín, F.J., Sandoval, F. (1993). Genetic synthesis of discrete-time recurrent neural network. In: Mira, J., Cabestany, J., Prieto, A. (eds) New Trends in Neural Computation. IWANN 1993. Lecture Notes in Computer Science, vol 686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-56798-4_144
DOI: https://doi.org/10.1007/3-540-56798-4_144
Published:
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-56798-1
Online ISBN: 978-3-540-47741-9
eBook Packages: Springer Book Archive