Abstract
Elman proposed a network with a context layer for time-series processing. The context layer stores the output of the hidden layer and feeds it back to the hidden layer in the next step of the computation. In this paper, the context layer is reformed into an internal memory layer, which receives weighted connections from the hidden layer so that an internal memory is formed. This internal memory plays an important role in learning the time series. We developed a new learning algorithm, called time-delayed back-propagation learning, for the internal memory. The ability of the network with the internal memory layer is demonstrated by applying it to simple sinusoidal time series.
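The architecture described above can be illustrated with a minimal sketch of an Elman-style recurrent network. This is an assumption-laden illustration, not the paper's implementation: layer sizes, weight names, and the use of a trainable hidden-to-memory weight matrix (`W_ch`, standing in for the paper's internal-memory connection) are all hypothetical choices; a plain Elman network would copy the hidden activations into the context layer unweighted.

```python
import numpy as np

# Sketch of an Elman-style network with a context/memory layer.
# All sizes and weight names are illustrative assumptions; the paper's
# internal-memory variant learns the hidden-to-memory weights (here W_ch)
# with a time-delayed back-propagation rule not reproduced in this sketch.
rng = np.random.default_rng(0)

n_in, n_hid, n_out = 1, 8, 1
W_ih = rng.normal(0.0, 0.5, (n_hid, n_in))   # input  -> hidden
W_ch = rng.normal(0.0, 0.5, (n_hid, n_hid))  # context/memory -> hidden
W_ho = rng.normal(0.0, 0.5, (n_out, n_hid))  # hidden -> output

def step(x, context):
    """One forward step: the hidden layer sees the current input
    plus the context stored from the previous time step."""
    h = np.tanh(W_ih @ x + W_ch @ context)
    y = W_ho @ h
    return y, h  # h becomes the next context

# Drive the network with a simple sinusoidal series,
# as in the paper's demonstration.
t = np.arange(20)
series = np.sin(2 * np.pi * t / 10)

context = np.zeros(n_hid)
outputs = []
for x in series:
    y, context = step(np.array([x]), context)
    outputs.append(float(y[0]))

print(len(outputs))  # one output per time step
```

Training such a network on the sine wave would then adjust the weights by back-propagating the prediction error; the paper's contribution is delaying that propagation through the memory connections across time steps.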
References
Elman, J.L.: Finding Structure in Time. Cognitive Science 14, 179–211 (1990)
Elman, J.L.: Learning and Development in Neural Networks: The Importance of Starting Small. Cognition 48, 71–99 (1993)
Koskela, T., Lehtokangas, M., Saarinen, J., Kaski, K.: Time Series Prediction with Multilayer Perceptron, FIR and Elman Neural Networks. In: Proc. of the World Congress on Neural Networks, pp. 491–496. INNS Press, San Diego (1996)
Cholewo, T.J., Zurada, J.M.: Sequential Network Construction for Time Series Prediction. In: Proc. of the IEEE Intl. Joint Conf. on Neural Networks, Houston, Texas, USA, pp. 2034–2039 (1997)
Giles, C.L., Lawrence, S., Tsoi, A.C.: Noisy Time Series Prediction Using a Recurrent Neural Network and Grammatical Inference. Machine Learning 44(1/2), 161–183 (2001)
Iwasa, K., Deguchi, T., Ishii, N.: Acquisition of the Time-Series Information in the Network with Internal Memory (in Japanese). IEICE Technical Report NC2001–71, 7–12 (2001)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning internal representations by error propagation. In: Rumelhart, D.E., McClelland, J.L. (eds.) Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 318–362. MIT Press, Cambridge (1986)
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Deguchi, T., Ishii, N. (2006). Delayed Learning on Internal Memory Network and Organizing Internal States. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_75
Print ISBN: 978-3-540-34439-1
Online ISBN: 978-3-540-34440-7