
Pattern Matching in Sequential Data Using Reservoir Projections

  • Conference paper
Advances in Neural Networks – ISNN 2019 (ISNN 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11554)


Abstract

A relevant problem in data science is to define an efficient and reliable algorithm for finding specific patterns in a given signal. Such problems often arise in medical applications, biophysical systems, complex systems, financial analysis, and several other domains. Here, we introduce a new model based on the ability of Recurrent Neural Networks (RNNs) to model time series. The technique encodes the temporal information of the reference signal and the given query in a feature space. This encoding is done using an RNN. In the feature space, we apply similarity techniques to analyse the differences among the projected points. The proposed method presents advantages with respect to the state of the art: it can produce good results at a lower computational cost. We evaluate the proposal on three benchmark datasets.
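The pipeline the abstract describes (drive a fixed recurrent reservoir with the query and with sliding windows of the reference signal, then compare the resulting projections in state space) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact configuration: the reservoir size, leak rate, spectral radius, and the use of Euclidean distance on the final state are all assumptions.

```python
import numpy as np

def make_reservoir(n=100, spectral_radius=0.9, seed=0):
    """Build a fixed random reservoir (echo-state style): input weights
    and a recurrent matrix rescaled to the given spectral radius."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (n, n))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-1, 1, n)
    return W_in, W

def reservoir_projection(signal, W_in, W, leak=0.3):
    """Drive the reservoir with a 1-D signal and return the final state,
    which serves as a fixed-size encoding of the sequence."""
    x = np.zeros(W.shape[0])
    for u in signal:
        x = (1 - leak) * x + leak * np.tanh(W_in * u + W @ x)
    return x

# Toy example: locate a query pattern inside a longer reference signal
# by comparing reservoir projections of sliding windows with the
# projection of the query.
W_in, W = make_reservoir()
t = np.linspace(0, 8 * np.pi, 400)
reference = np.sin(t)
query = reference[120:150]          # the pattern we search for

q_state = reservoir_projection(query, W_in, W)
m = len(query)
dists = [np.linalg.norm(reservoir_projection(reference[i:i + m], W_in, W) - q_state)
         for i in range(len(reference) - m + 1)]
best = int(np.argmin(dists))
print(best)  # the window starting at index 120 matches the query exactly
```

Because the reservoir is fixed and untrained, each window is encoded with a single forward pass, which is where the lower computational cost relative to trained sequence models comes from.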



Acknowledgements

This work was supported by projects SP2019/135 and SP2019/141 of the Student Grant System, VSB-Technical University of Ostrava, Czech Republic; by the Ministry of Education, Youth and Sports through the Specific Research Projects (SP2019/135 and SP2019/141); and by the Technology Agency of the Czech Republic within project TN01000024, National Competence Center - Cybernetics and Artificial Intelligence.

Author information

Corresponding author

Correspondence to Sebastián Basterrech.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Basterrech, S. (2019). Pattern Matching in Sequential Data Using Reservoir Projections. In: Lu, H., Tang, H., Wang, Z. (eds) Advances in Neural Networks – ISNN 2019. Lecture Notes in Computer Science, vol. 11554. Springer, Cham. https://doi.org/10.1007/978-3-030-22796-8_19


  • DOI: https://doi.org/10.1007/978-3-030-22796-8_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-22795-1

  • Online ISBN: 978-3-030-22796-8

  • eBook Packages: Computer Science, Computer Science (R0)
