
Recurrent Neural Networks as Local Models for Time Series Prediction

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5864)

Abstract

Local models for regression have attracted a great deal of attention in recent years. They have proved to be more efficient than global models, especially when dealing with chaotic time series. Many models have been proposed for clustering time series, and they have been combined with several kinds of predictors. In this paper we present an extension that allows recurrent neural networks to be used as local models, and we discuss the results obtained.
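The pipeline sketched in the abstract (quantize the input space into regions, then attach one recurrent predictor per region) can be illustrated with a short, self-contained sketch. This is a minimal stand-in, not the authors' implementation: a logistic map replaces the benchmark chaotic series, scikit-learn's k-means replaces the vector-quantization stage, and a fixed random recurrent reservoir with a ridge-regressed linear readout replaces a recurrent network trained end to end; the window length W, cluster count K and reservoir size H are arbitrary illustrative choices.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy chaotic series: a logistic map stands in for a benchmark series here.
def logistic_series(n, x0=0.4, r=3.9):
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1.0 - x[t - 1])
    return x

series = logistic_series(2000)

# Build (window, next value) pairs for one-step-ahead prediction.
W = 8                                   # embedding window length (illustrative)
X = np.array([series[t:t + W] for t in range(len(series) - W)])
y = series[W:]

n_test = 200
X_tr, y_tr, X_te, y_te = X[:-n_test], y[:-n_test], X[-n_test:], y[-n_test:]

# Step 1: quantize the input windows into local regions (k-means here).
K = 5                                   # number of local models (illustrative)
km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X_tr)
labels = km.labels_

# Step 2: fit one small recurrent model per region. A fixed random reservoir
# with a ridge-regressed linear readout is a lightweight stand-in for a
# recurrent network trained by gradient descent.
H = 30                                  # reservoir size (illustrative)
RIDGE = 1e-3                            # readout regularization

def reservoir_states(windows, Win, Wrec):
    """Feed each window through the recurrent reservoir; return final states."""
    states = np.zeros((len(windows), H))
    for i, win in enumerate(windows):
        h = np.zeros(H)
        for u in win:                    # one sample at a time
            h = np.tanh(Win * u + Wrec @ h)
        states[i] = h
    return states

local_models = []
for k in range(K):
    idx = labels == k
    Win = rng.normal(scale=0.5, size=H)
    Wrec = 0.9 * rng.normal(scale=1.0 / np.sqrt(H), size=(H, H))
    S = reservoir_states(X_tr[idx], Win, Wrec)
    # Ridge readout: solve (S^T S + lambda I) w = S^T y on this region only.
    w = np.linalg.solve(S.T @ S + RIDGE * np.eye(H), S.T @ y_tr[idx])
    local_models.append((Win, Wrec, w))

# Predict: route each window to its region's recurrent model.
def predict(windows):
    out = np.empty(len(windows))
    for i, (win, k) in enumerate(zip(windows, km.predict(windows))):
        Win, Wrec, w_out = local_models[k]
        out[i] = reservoir_states(win[None, :], Win, Wrec)[0] @ w_out
    return out

preds = predict(X_te)
print("held-out one-step MSE:", np.mean((preds - y_te) ** 2))
```

In the local-model literature the clustering step is often a self-organizing map and the per-region predictor a recurrent network trained by backpropagation through time; the sketch above only mirrors that overall structure.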




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Cherif, A., Cardot, H., Boné, R. (2009). Recurrent Neural Networks as Local Models for Time Series Prediction. In: Leung, C.S., Lee, M., Chan, J.H. (eds) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol 5864. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10684-2_88


  • DOI: https://doi.org/10.1007/978-3-642-10684-2_88

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-10682-8

  • Online ISBN: 978-3-642-10684-2

  • eBook Packages: Computer Science, Computer Science (R0)
