
Visualising deep network time-series representations

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Despite the growing popularity of machine learning models, they often still operate as black boxes, offering little insight into what happens inside them. A few methods exist that visualise and explain why a model has made a particular prediction; however, they only visualise the link between a model's input and output, without showing how the model learns to represent the training data as a whole. In this paper, a method that addresses this issue is proposed, with a focus on visualising multi-dimensional time-series data. Experiments on a high-frequency stock market dataset show that the method produces fast and easily discernible visualisations. Large datasets can be visualised quickly and on a single plot, which makes it easy for a user to compare the learned representations of the data. The proposed method combines known techniques to provide insight into the inner workings of time-series classification models.
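
To make the abstract's description concrete, the sketch below illustrates the general idea: the activations of a late hidden layer of a trained time-series classifier are taken as the model's learned representation of each input window and projected to two dimensions, so the whole dataset can be inspected on one plot. This is only a minimal illustration, not the authors' released pipeline (publicly available at https://github.com/CptPirx/Time-Viz); the parametric t-SNE implementation referenced in the Notes is replaced here by standard t-SNE, and the model, data shapes, and layer choice are assumptions.

```python
# Minimal sketch (assumptions: a trained Keras time-series classifier `model`,
# inputs `x` of shape (n_samples, timesteps, features), integer labels `y`).
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE
from tensorflow import keras

def plot_learned_representations(model, x, y, layer_index=-2):
    # Use a late hidden layer's activations as the learned representation
    # of each time-series window.
    extractor = keras.Model(inputs=model.input,
                            outputs=model.layers[layer_index].output)
    features = extractor.predict(x).reshape(len(x), -1)

    # Project the high-dimensional representations to 2-D. The paper's Notes
    # point to a parametric t-SNE implementation; plain t-SNE is used here
    # only to illustrate the idea.
    embedding = TSNE(n_components=2, init="pca").fit_transform(features)

    # One scatter plot for the whole dataset, coloured by class label, so the
    # learned representations of different classes can be compared at a glance.
    plt.scatter(embedding[:, 0], embedding[:, 1], c=y, s=4, cmap="tab10")
    plt.colorbar(label="class")
    plt.title("Learned time-series representations (2-D embedding)")
    plt.show()
```

With a parametric mapping in place of the standard t-SNE used above, new samples can be embedded without re-fitting the projection, which is one way large datasets can be visualised quickly.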

Code availability

Publicly available at https://github.com/CptPirx/Time-Viz.

Notes

  1. https://github.com/jsilter/parametric_tsne.


Acknowledgements

Alexandros Iosifidis acknowledges funding from the Project DISPA (Grant 9041-00004B) funded by the Independent Research Fund Denmark.

Author information

Corresponding author

Correspondence to Błażej Leporowski.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Leporowski, B., Iosifidis, A. Visualising deep network time-series representations. Neural Comput & Applic 33, 16489–16498 (2021). https://doi.org/10.1007/s00521-021-06244-8
