Abstract
Despite the popularisation of machine learning models, they more often than not still operate as black boxes, offering no insight into what happens inside the model. A few methods exist that visualise and explain why a model has made a certain prediction. These methods, however, visualise the link between the model's input and output without showing how the model learns to represent the training data as a whole. In this paper, a method that addresses this issue is proposed, with a focus on visualising multi-dimensional time-series data. Experiments on a high-frequency stock market dataset show that the method produces fast and discernible visualisations. Large datasets can be visualised quickly and on a single plot, making it easy for a user to compare the learned representations of the data. The developed method successfully combines known techniques to provide insight into the inner workings of time-series classification models.
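The core idea described above (projecting a trained classifier's learned representations onto a single 2-D plot) can be sketched as follows. This is a minimal illustration, not the authors' actual implementation (available at the repository below): it assumes hidden-layer activations have already been extracted from a trained network, and it uses the t-SNE method of van der Maaten and Hinton, which the paper builds upon. The function name `embed_activations` is illustrative.

```python
import numpy as np
from sklearn.manifold import TSNE

def embed_activations(activations, perplexity=30.0):
    """Project hidden-layer activations of shape (n_samples, n_features)
    down to 2-D with t-SNE, so every sample becomes one point on a plot.

    `activations` would typically come from an intermediate layer of a
    trained time-series classifier; class labels can then be used to
    colour the resulting scatter plot and compare learned representations.
    """
    tsne = TSNE(n_components=2, perplexity=perplexity,
                init="pca", random_state=0)
    return tsne.fit_transform(np.asarray(activations))
```

Plotting the returned 2-D coordinates coloured by class label then gives the single-plot comparison of learned representations that the abstract refers to.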
Code availability
Publicly available at https://github.com/CptPirx/Time-Viz.
Acknowledgements
Alexandros Iosifidis acknowledges funding from the Project DISPA (Grant 9041-00004B) funded by the Independent Research Fund Denmark.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Cite this article
Leporowski, B., Iosifidis, A. Visualising deep network time-series representations. Neural Comput & Applic 33, 16489–16498 (2021). https://doi.org/10.1007/s00521-021-06244-8