Abstract
Local models for regression have attracted a great deal of attention in recent years. They have proved more efficient than global models, especially when dealing with chaotic time series. Many methods have been proposed to cluster time series, and they have been combined with several types of predictors. In this paper we present an extension of recurrent neural networks that allows them to be applied as local models, and we discuss the results obtained.
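The local-model approach the abstract describes — partition the time series (here, its delay vectors) into clusters and train a separate predictor per cluster — can be sketched as follows. This is a minimal illustration only: plain k-means clustering and local linear least-squares predictors stand in for the paper's recurrent networks, and the series (a logistic map), embedding dimension, and cluster count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chaotic series: the logistic map, an illustrative stand-in for the
# Mackey-Glass benchmark commonly used in this literature.
x = np.empty(1200)
x[0] = 0.4
for t in range(1199):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

d = 3                                                  # embedding dimension
X = np.stack([x[t:t + d] for t in range(len(x) - d)])  # delay vectors
y = x[d:]                                              # one-step-ahead targets

# Global linear baseline, fit by least squares (with a bias column).
A_all = np.hstack([X, np.ones((len(X), 1))])
w_global, *_ = np.linalg.lstsq(A_all, y, rcond=None)
mse_global = float(np.mean((A_all @ w_global - y) ** 2))

# Partition the delay vectors with plain k-means (Lloyd's algorithm).
k = 4
centers = X[rng.choice(len(X), k, replace=False)]
for _ in range(20):
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.stack([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])

# One local linear model per cluster; an empty cluster falls back to the
# global model.
models = []
for j in range(k):
    mask = labels == j
    if not np.any(mask):
        models.append(w_global)
        continue
    A = np.hstack([X[mask], np.ones((mask.sum(), 1))])
    w, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
    models.append(w)

def predict(v):
    """Route a delay vector to the nearest cluster's local predictor."""
    j = int(np.argmin(((centers - v) ** 2).sum(-1)))
    return float(np.append(v, 1.0) @ models[j])

preds = np.array([predict(v) for v in X])
mse_local = float(np.mean((preds - y) ** 2))
```

On this toy problem the piecewise-linear local models achieve a lower training error than the single global linear fit, which mirrors the abstract's claim; the paper's contribution is to replace the simple per-cluster predictors with recurrent neural networks.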
© 2009 Springer-Verlag Berlin Heidelberg
Cherif, A., Cardot, H., Boné, R. (2009). Recurrent Neural Networks as Local Models for Time Series Prediction. In: Leung, C.S., Lee, M., Chan, J.H. (eds) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol 5864. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10684-2_88
Print ISBN: 978-3-642-10682-8
Online ISBN: 978-3-642-10684-2