
Effect of Sparse Representation of Time Series Data on Learning Rate of Time-Delay Neural Networks

Circuits, Systems, and Signal Processing

Abstract

In this paper, we examine how sparsifying the input to a time-delay neural network (TDNN) can significantly improve its learning time and accuracy for time series data. The input is sparsified through a sparse transform input layer. Many applications that involve prediction or forecasting of the state of a dynamic system can be formulated as time series forecasting problems. Here, the task is to forecast some state variable that is represented as a time series, as in weather forecasting, energy consumption prediction, or predicting the future state of a moving object. While there are many tools for time series forecasting, TDNNs have recently received more attention. We show that applying a sparsifying input transform layer to the TDNN considerably improves learning time and accuracy. By analyzing the learning process, we demonstrate the mathematical reasons for this improvement. Experiments with several datasets are used to show both the improvement and its cause. We use data from national weather forecast datasets, vehicle speed time series, and synthetic data. Several sparse representations are evaluated, including principal component analysis (PCA), the discrete cosine transform (DCT), and a mixture of DCT and Haar transforms. It is observed that higher sparsity leads to better performance. The relative simplicity of TDNNs compared with deep networks, together with the use of sparse transforms for quicker learning, opens up possibilities for online learning on small embedded devices that lack powerful computing capabilities.
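
To make the idea concrete, the sketch below shows one way a sparsifying input transform can precede a simple forecaster: each input window is DCT-transformed and all but a few of the largest-magnitude coefficients are zeroed before the window is fed to a small feed-forward predictor. The window length, sparsity level, synthetic signal, and the use of Python with scikit-learn's MLPRegressor in place of the paper's MATLAB TDNN are illustrative assumptions, not the configuration reported in the paper.

```python
import numpy as np
from scipy.fft import dct
from sklearn.neural_network import MLPRegressor

def sparse_dct_features(window, keep=8):
    """DCT of a time-series window with all but the `keep` largest-magnitude
    coefficients zeroed (an illustrative sparsifying input transform;
    `keep` is an assumed sparsity level, not a value from the paper)."""
    coeffs = dct(window, norm="ortho")
    small = np.argsort(np.abs(coeffs))[:-keep]  # indices of the smallest coefficients
    coeffs[small] = 0.0
    return coeffs

# Synthetic series: a noisy sinusoid stands in for the weather / vehicle-speed data.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)

# Each input is the sparsified DCT of the previous `window` samples;
# the target is the next sample (one-step-ahead forecasting).
window = 32
X = np.stack([sparse_dct_features(series[i:i + window])
              for i in range(len(series) - window)])
y = series[window:]

# A small feed-forward regressor stands in for the TDNN (an assumption;
# the paper uses the MATLAB Neural Network Toolbox).
split = 1500
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("test MSE:", np.mean((model.predict(X[split:]) - y[split:]) ** 2))
```

Swapping the DCT for a PCA projection or a mixed DCT/Haar dictionary would change only the sparse_dct_features step; the rest of the pipeline stays the same.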

Code availability

The MATLAB Neural Network Toolbox, which is commercially available, was used in this research.

References

  1. H. Abdi, L.J. Williams: Principal component analysis. Wiley Interdisciplinary Reviews: Computational Statistics 2(4), 433–459 (2010). https://doi.org/10.1002/wics.101

  2. A. Aggarwal, M.M. Tripathi: A novel hybrid approach using wavelet transform, time series time delay neural network, and error predicting algorithm for day-ahead electricity price forecasting. International Conference on Computer Applications in Electrical Engineering – Recent Advances (CERA), 199–204 (2017). https://doi.org/10.1109/CERA.2017.8343326

  3. M. Aharon, M. Elad and A. Bruckstein: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation. IEEE Trans on Signal Proc. 54 (11): 4311–4322, 2006

  4. R. Asadi, S.A. Kareem: Review of Feed Forward Neural Network classification preprocessing techniques. Proceedings of the 3rd Int. Conf. on Math. Sciences, AIP Conf. Proc. 1602, 567 (2014)

  5. A. Aussem, F. Murtagh, Combining Neural Network Forecasts on Wavelet-transformed Time Series. Connect. Sci. 9, 113–122 (1997)


  6. K. Benmahdjoub, Z. Ameur, M. Boulifa, Forecasting of rainfall using time delay neural network in Tizi-Ouzou (Algeria). Energy Procedia 36, 1138–1146 (2013)


  7. H. Cecotti, A. Gräser: Time Delay Neural Network with Fourier transform for multiple channel detection of Steady-State Visual Evoked Potentials for Brain-Computer Interfaces. 16th European Signal Processing Conf. 1–5 (2008).

  8. P. Chen, A. Niu, D. Liu, W. Jiang, B. Ma: Time Series Forecasting of Temperatures using SARIMA: An Example from Nanjing. 2018 IOP Conf. Ser.: Mater. Sci. Eng. 394: 052024 (2018)

  9. H. Demuth, M. Beale, Neural Network Toolbox User's Guide (MathWorks, Natick, 2002).


  10. T.A. Dingus, S.G. Klauer, V.L. Neale, A. Peterson, S.E. Lee, Sudweeks, J. Hankey, D. Ramsey, S. Gupta, C. Bucher, Z.R. Doerzaph, J. Jarmeland, R.R. Knipling: The 100-Car Naturalistic Driving Study, Phase II – Results of the 100-Car Field Experiment. National Highway Traffic Safety Administration, Washington, DC (2006)

  11. Y.P. Fallah, M.K. Khandani, Context and Network Aware Comm. Strategies for Connected Vehicle Safety Applications. IEEE Intell. Trans. Sys. Mag. 8(4), 92–101 (2016)


  12. K. Girish, S.K. Jha: Time-delay neural networks for time series prediction: an application to the monthly wholesale price of oilseeds in India. Neural Computing and Applications 24(3) (2014)

  13. M.T. Hagan, M. Menhaj, Training feedforward networks with the Marquardt algorithm. IEEE Trans. Neural Net. 5, 989–993 (1994)


  14. "haar". Fourier.eng.hmc.edu. (2013). http://fourier.eng.hmc.edu/e161/lectures/Haar/node1.html

  15. S. Hochreiter, J. Schmidhuber, Long short-term memory. Neural. Comput. 9(8), 1735–1780 (1997). https://doi.org/10.1162/neco.1997.9.8.1735


  16. D. Huang, X.-R. Bai: A Wavelet Neural Network Optimal Control Model for Traffic-Flow Prediction in Intelligent Transport Systems. Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence (ICIC 2007), LNCS vol. 4682, 1233–1244 (2007)

  17. C.-L. Huang, Y.P. Fallah, R. Sengupta, H. Krishnan: Adaptive Intervehicle Communication Control for Cooperative Safety Systems. IEEE Network 23(1), 6–13 (2010)

  18. NOAA Extended Reconstructed Sea Surface Temperature (ERSST) v5. https://www.ncdc.noaa.gov/data-access/marineocean-data/extended-reconstructed-sea-surface-temperature-ersst-v5

  19. M.K. Khandani, W.B. Mikhael: Using Mixed DCT and Haar Transforms for Efficient Compression of Car Trajectory Data. IEEE 61st International Midwest Symposium on Circuits and Systems (MWSCAS), 692–695 (2018)

  20. M.K. Khandani, W.B. Mikhael: Efficient Time Series Forecasting Using Time Delay Neural Networks with Domain Pre-Transforms. IEEE International Midwest Symposium on Circuits and Systems (MWSCAS), 682–685 (2019)

  21. M.K. Khandani, W. Mikhael, Y.P. Fallah, K.G. Popstojanova: Data-Based Analysis of Sampling and Estimation Methods for Vehicle Tracking Over Wireless Networks. IEEE DASC/PiCom/DataCom/CyberSciTech, 202–207 (2017)

  22. M.K. Khandani, W.B. Mikhael: A Study on Network Size Reduction Using Sparse Input Representation in Time Delay Neural Networks. 2020 IEEE 63rd International Midwest Symposium on Circuits and Systems (MWSCAS) 864–867 (2020) doi: https://doi.org/10.1109/MWSCAS48704.2020.9184438.

  23. T.-W. Kim, J.B. Valdes: Nonlinear Model for Drought Forecasting Based on a Conjunction of Wavelet Transforms and Neural Networks. ASCE Journal of Hydrologic Engineering 8(6) (2003). https://doi.org/10.1061/(ASCE)1084-0699(2003)8:6(319)

  24. B. Lee, Y.S. Tarng, Application of the discrete wavelet transform to the monitoring of tool failure in end milling using the spindle motor current. Int. J. Adv. Manuf. Tech. 15(4), 238–243 (1999). https://doi.org/10.1007/s001700050062


  25. X. Li, H. Shen, L. Zhang, H. Zhang, Q. Yuan, G. Yang: Recovering Quantitative Remote Sensing Products Contaminated by Thick Clouds and Shadows Using Multitemporal Dictionary Learning. IEEE Transactions on Geoscience and Remote Sensing 52(11), 7086–7098 (2014). https://doi.org/10.1109/TGRS.2014.2307354

  26. X. Li, H. Shen, L. Zhang, H. Li: Sparse-based reconstruction of missing information in remote sensing images from spectral/temporal complementary information. ISPRS Journal of Photogrammetry and Remote Sensing 106, 1–15 (2015)

  27. X. Li, F. Li, X. Zhang, C. Yang, W. Gui: Exponential Stability Analysis for Delayed Semi-Markovian Recurrent Neural Networks: A Homogeneous Polynomial Approach. IEEE Transactions on Neural Networks and Learning Systems 29(12), 6374–6384 (2018). https://doi.org/10.1109/TNNLS.2018.2830789

  28. B. Liu, W. Zhang, X. Xu, D. Chen: Time Delay Recurrent Neural Network for Speech Recognition. Journal of Physics: Conference Series 1229, 3rd International Conference on Machine Vision and Information Technology (2019)

  29. S.B. Mahongo, M.C. Deo, Using artificial neural networks to forecast monthly and seasonal sea surface temperature anomalies in the western Indian Ocean. Int. J. Ocean Climate Syst. 4, 133–150 (2013). https://doi.org/10.1260/1759-3131.4.2.133


  30. W.B. Mikhael, A. Ramaswamy: An Efficient Representation of Nonstationary Signals Using Mixed-Transforms with Applications to Speech. IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 42(6), 393–401

  31. NGSIM Homepage. FHWA. http://ngsim.fhwa.dot.gov.

  32. Z. Pan, H. Bolouri: High Speed Face Recognition Based on Discrete Cosine Transforms and Neural Networks. Submitted to IEEE Transactions on PAMI (1999). http://citeseer.ist.psu.edu/270448.html

  33. K. Patil, M.C. Deo, Prediction of daily sea surface temperature using efficient neural networks. Ocean Dyn. 67, 357–368 (2017). https://doi.org/10.1007/s10236-017-1032-9


  34. E. Pisoni, F. Pastor, M. Volta, Artificial Neural Networks to reconstruct incomplete satellite data: application to the Mediterranean Sea Surface Temperature. Nonlinear Proc. Geophys. 15(1), 61–70 (2008)


  35. P.J.A. Shaw: Multivariate Statistics for the Environmental Sciences. Hodder-Arnold, ISBN 0-340-80763-6 (2003)

  36. Y. Wang, C. Xu, C. Xu, D. Tao, Packing Convolutional Neural Networks in the Frequency Domain. IEEE Trans. Patt. An. Machine. Intel. 41(10), 2495–2510 (2019)


  37. J. Wright, A.Y. Yang, A. Ganesh, S.S. Sastry, Y. Ma, Robust Face Recognition via Sparse Representation. IEEE Trans. Patt. An. Machine Intell. 31(2), 210–227 (2009). https://doi.org/10.1109/TPAMI.2008.79


  38. X. Jiang, H. Adeli, Dynamic Wavelet Neural Network Model for Traffic Flow Forecasting. J. Trans. Eng. 131(10), 771–779 (2005)


  39. J. Yang, J. Wright, T.S. Huang, Y. Ma, Image Super-Resolution Via Sparse Representation. IEEE Transac. Image. Proc. 19(11), 2861–2873 (2010). https://doi.org/10.1109/TIP.2010.2050625


  40. R. Yang, Z. Zhang, P. Shi, Exponential Stability on Stochastic Neural Networks With Discrete Interval and Distributed Delays. IEEE Transac. Neural Networks. 21(1), 169–175 (2010). https://doi.org/10.1109/TNN.2009.2036610



Funding

Not Applicable.

Author information

Corresponding author

Correspondence to Masoumeh Kalantari Khandani.

Ethics declarations

Conflicts of interest

Not Applicable.

Availability of data and material

The datasets generated during the current study are available from the corresponding author on reasonable request. The public datasets analyzed during the current study are publicly available from sources in references [10, 18, 31].

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Kalantari Khandani, M., Mikhael, W.B. Effect of Sparse Representation of Time Series Data on Learning Rate of Time-Delay Neural Networks. Circuits Syst Signal Process 40, 3007–3032 (2021). https://doi.org/10.1007/s00034-020-01610-8

