
Long-Term Prediction of a Sine Function Using a LSTM Neural Network


Part of the book series: Studies in Computational Intelligence ((SCI,volume 667))

Abstract

In recent years, efforts have been made to improve the efficiency of long-term time series forecasting. However, when the series involved is highly oscillatory and nonlinear, this remains an open problem. Given that signals may be approximated as linear combinations of sine functions, studying the behavior of an adaptive dynamical model able to reproduce a sine function is relevant for long-term prediction. In this chapter, we present an analysis of the modeling and prediction abilities of the "Long Short-Term Memory" (LSTM) recurrent neural network when the input signal has the shape of a discrete sine function. Previous work has shown that LSTM is able to learn relevant events across long time lags; however, its oscillatory abilities have not been analyzed sufficiently. In our experiments, we found that some LSTM configurations were able to model the signal, accurately predicting up to 400 steps ahead. However, we also found that similar architectures did not perform properly when the experiments were repeated, probably because the LSTM architectures became overtrained and the learning algorithm got trapped in a local minimum.
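
To make the setup concrete, the sketch below trains a small LSTM on a discretely sampled sine wave and then runs it in closed loop, feeding each one-step prediction back as the next input to forecast 400 steps ahead. This is a minimal illustrative sketch only: it assumes PyTorch, and the network size, optimizer, sampling period, and training schedule are assumptions for illustration, not the configurations evaluated in the chapter.

    # Illustrative sketch (assumed tooling: PyTorch). Trains an LSTM on
    # one-step-ahead prediction of a discrete sine wave, then forecasts
    # 400 steps in closed loop by feeding predictions back as inputs.
    import math
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Discrete sine signal sampled at integer steps (period of 50 samples).
    t = torch.arange(0, 2000, dtype=torch.float32)
    signal = torch.sin(2 * math.pi * t / 50.0)

    # One-step-ahead training pairs: input x[t], target x[t+1].
    inputs = signal[:-1].view(1, -1, 1)    # shape (batch, time, features)
    targets = signal[1:].view(1, -1, 1)

    class SineLSTM(nn.Module):
        def __init__(self, hidden_size=16):
            super().__init__()
            self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                                batch_first=True)
            self.out = nn.Linear(hidden_size, 1)

        def forward(self, x, state=None):
            h, state = self.lstm(x, state)
            return self.out(h), state

    model = SineLSTM()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()

    # Train on one-step-ahead prediction of the observed signal.
    for epoch in range(200):
        optimizer.zero_grad()
        pred, _ = model(inputs)
        loss = loss_fn(pred, targets)
        loss.backward()
        optimizer.step()

    # Long-term forecast: warm up on the observed signal, then feed each
    # prediction back as the next input for 400 steps (closed loop).
    model.eval()
    with torch.no_grad():
        _, state = model(inputs)               # warm-up pass to set the state
        x = signal[-1].view(1, 1, 1)
        forecast = []
        for _ in range(400):
            y, state = model(x, state)
            forecast.append(y.item())
            x = y                              # feed prediction back as input
    print(forecast[:10])

In this closed-loop regime, small one-step errors accumulate over the horizon, which is what makes long-term prediction of oscillatory signals difficult and motivates the analysis reported in the chapter.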



Acknowledgments

This research has been supported by Consejo Nacional de Ciencia y Tecnología (CONACYT) México, grant No. CB-2010-155250.

Author information

Corresponding author

Correspondence to Magdiel Jiménez-Guarneros.

Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Jiménez-Guarneros, M., Gómez-Gil, P., Fonseca-Delgado, R., Ramírez-Cortés, M., Alarcón-Aquino, V. (2017). Long-Term Prediction of a Sine Function Using a LSTM Neural Network. In: Melin, P., Castillo, O., Kacprzyk, J. (eds) Nature-Inspired Design of Hybrid Intelligent Systems. Studies in Computational Intelligence, vol 667. Springer, Cham. https://doi.org/10.1007/978-3-319-47054-2_10

  • DOI: https://doi.org/10.1007/978-3-319-47054-2_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47053-5

  • Online ISBN: 978-3-319-47054-2

  • eBook Packages: Engineering, Engineering (R0)
