
Reservoir weights learning based on adaptive dynamic programming and its application in time series classification

  • S.I.: Deep Learning for Time Series Data
  • Published in: Neural Computing and Applications

Abstract

Time series classification (TSC) has been addressed through a wide spectrum of algorithms. Nevertheless, few studies have considered spiking or non-spiking neural networks (NNs) and their performance on TSC tasks. Seminal Reservoir Computing (s-RC), built on randomly connected recurrent neural networks, is among the fastest and most efficient end-to-end NNs applied to TSC problems. Although the s-RC architecture is well suited to dynamic (temporal) data processing, it fails to achieve significant improvement over state-of-the-art fully trainable NNs. Along this thread, the present study proposes a novel algorithm for training the reservoir by fusing nonlinear optimal control theory with reservoir computing (RC) theory, opening a new approach to optimizing the RC prediction (estimated class) at a specific timestamp along the desired trajectory (true class). For this purpose, the TSC task is reformulated as a nonlinear optimal control problem, whose approximate solution yields a learning rule for reservoirs of spiking or non-spiking neurons via the adaptive/approximate dynamic programming (ADP) method. The proposed framework, known as Trainable Reservoir Computing (t-RC), involves an online actor–critic method that projects the output error into the reservoir's parameter-adjustment rule so as to minimize the classification error. To evaluate the TSC adaptability of the newly proposed RC framework against state-of-the-art NN-based methods, experiments on 22 univariate and multivariate time series datasets (from the UCR and UEA archives) were performed. The findings reveal that the proposed framework outperforms other RC methods in learning capacity and accuracy and attains classification accuracy comparable with the best fully trainable deep neural networks.
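The abstract describes projecting the output (classification) error into the reservoir's weight-update rule, in contrast to standard echo state networks where only the readout is trained. The paper's actual ADP actor–critic derivation is not reproduced on this page; the following is a rough, hypothetical sketch of the general idea only (a fixed random reservoir, a readout trained on the classification error, and a crude one-step error-projection update standing in for the learned critic). All dimensions, learning rates, and the `train_step` update itself are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only, not the paper's settings).
n_in, n_res, n_out = 1, 50, 2

# Random reservoir, as in standard echo state networks; the spectral
# radius is scaled below 1 to encourage the echo state property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))
W_out = np.zeros((n_out, n_res))

def run_reservoir(series):
    """Drive the reservoir with a univariate series; return the final state."""
    x = np.zeros(n_res)
    for u in series:
        x = np.tanh(W_in @ np.array([u]) + W_res @ x)
    return x

def train_step(series, label, lr_out=0.01, lr_res=0.0005):
    """One hypothetical training step: gradient readout update plus a
    crude projection of the output error into the reservoir weights.
    (The paper's t-RC instead uses an actor-critic/ADP rule.)"""
    global W_out, W_res
    x = run_reservoir(series)
    y = W_out @ x                    # readout scores ("estimated class")
    target = np.eye(n_out)[label]    # one-hot "true class"
    err = y - target
    W_out -= lr_out * np.outer(err, x)
    # Back-project the error through the readout and the tanh derivative;
    # a stand-in for the critic-guided reservoir update.
    W_res -= lr_res * np.outer((W_out.T @ err) * (1 - x**2), x)
    return float(err @ err)
```

Repeated calls to `train_step` on labeled series drive the classification error down; the substantive difference in the paper is that the reservoir update is derived from an optimal-control (HJB/ADP) formulation rather than this one-step gradient heuristic.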



Funding

The authors declare that they received no funding for this work.

Author information


Contributions

MM designed and conducted all the experiments and drafted the manuscript. Prof. MMH and Prof. MME guided the research, provided valuable comments on the design of the experiments, and revised the manuscript. All authors have read and approved this manuscript.

Corresponding authors

Correspondence to Mohammad Mehdi Homayounpour or Mohammad Mehdi Ebadzadeh.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Code availability

Code available at https://github.com/hamed-modiri/TimeSeriesAnalysis.

Data availability

Datasets available at http://www.timeseriesclassification.com/dataset.php.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Modiri, M., Homayounpour, M.M. & Ebadzadeh, M.M. Reservoir weights learning based on adaptive dynamic programming and its application in time series classification. Neural Comput & Applic 34, 13201–13217 (2022). https://doi.org/10.1007/s00521-021-06827-5
