Embracing Data Irregularities in Multivariate Time Series with Recurrent and Graph Neural Networks

  • Conference paper
  • Published in: Intelligent Systems (BRACIS 2023)

Abstract

Data collection in many engineering fields involves multivariate time series gathered from a sensor network. These sensors often have differing sampling rates, missing data, and other irregularities. Managing these issues requires complex preprocessing mechanisms, which become coupled with any statistical model trained on the transformed data. Modeling the motion of seabed-anchored floating platforms from sensor measurements is a typical example of such a setting. We propose and analyze a model that uses both recurrent and graph neural networks to handle irregularly sampled multivariate time series while maintaining low computational cost. In this model, each time series is represented as a node in a heterogeneous graph whose edges encode the relationships between the measured variables. Each time series is encoded by an independent recurrent neural network. A graph neural network then propagates information across the time series using attention layers. The outcome is a set of updated hidden representations, which the recurrent neural networks use to produce forecasts autoregressively. The model generates forecasts for all input time series simultaneously while remaining lightweight. We argue that this architecture opens up new possibilities, as the model can be integrated into low-capacity systems without requiring expensive GPU clusters for inference.
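To make the described architecture concrete, below is a minimal sketch, assuming PyTorch: per-sensor GRU encoders, a simplified single-head GAT-style attention layer that propagates information along graph edges, and autoregressive decoding that reuses the encoders. All names here (SensorGraphForecaster, hidden_dim, adjacency) are illustrative and not taken from the paper, and the single attention layer stands in for the authors' heterogeneous-graph formulation.

```python
# A minimal sketch, assuming PyTorch. Names are illustrative, not the
# authors' code; the message-passing step is a simplified single-head
# GAT-style attention, not the paper's exact heterogeneous-graph layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SensorGraphForecaster(nn.Module):
    """One GRU encoder per sensor, graph attention over the hidden states,
    and per-sensor heads that decode forecasts autoregressively."""

    def __init__(self, n_sensors: int, hidden_dim: int = 32):
        super().__init__()
        # Independent recurrent encoders: each series has its own GRU, so
        # sensors may have different history lengths and sampling rates.
        self.encoders = nn.ModuleList(
            [nn.GRU(input_size=1, hidden_size=hidden_dim, batch_first=True)
             for _ in range(n_sensors)]
        )
        # Additive attention parameters for message passing between nodes.
        self.attn_src = nn.Linear(hidden_dim, 1, bias=False)
        self.attn_dst = nn.Linear(hidden_dim, 1, bias=False)
        self.msg = nn.Linear(hidden_dim, hidden_dim)
        # One scalar forecasting head per sensor.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, 1) for _ in range(n_sensors)]
        )

    def forward(self, series, adjacency, horizon):
        # series[i]: (batch, T_i, 1) history of sensor i; T_i may differ.
        # adjacency: (n_sensors, n_sensors) binary edge mask with self-loops.
        h = torch.stack(
            [enc(x)[1].squeeze(0) for enc, x in zip(self.encoders, series)],
            dim=1,
        )  # (batch, n_sensors, hidden_dim)

        # Single-head graph attention: score every ordered pair of nodes,
        # mask out non-edges, then aggregate messages from neighbours.
        scores = self.attn_src(h) + self.attn_dst(h).transpose(1, 2)
        scores = F.leaky_relu(scores).masked_fill(adjacency == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)        # (batch, n, n)
        h = h + alpha @ self.msg(h)                  # updated node states

        # Autoregressive decoding: each GRU consumes its own predictions.
        forecasts = []
        for i, (enc, head) in enumerate(zip(self.encoders, self.heads)):
            state = h[:, i].unsqueeze(0)             # (1, batch, hidden)
            y = head(h[:, i]).unsqueeze(1)           # (batch, 1, 1)
            steps = [y]
            for _ in range(horizon - 1):
                _, state = enc(y, state)
                y = head(state.squeeze(0)).unsqueeze(1)
                steps.append(y)
            forecasts.append(torch.cat(steps, dim=1))  # (batch, horizon, 1)
        return forecasts


if __name__ == "__main__":
    # Toy check: three sensors with different history lengths and a fully
    # connected graph (self-loops keep every softmax row well-defined).
    model = SensorGraphForecaster(n_sensors=3)
    series = [torch.randn(4, t, 1) for t in (50, 80, 65)]
    out = model(series, torch.ones(3, 3), horizon=10)
    print([o.shape for o in out])  # three tensors of shape (4, 10, 1)
```

Because every sensor keeps its own encoder and the graph layer only mixes fixed-size hidden states, inference cost grows with the number of sensors rather than with a dense joint input, which is consistent with the lightweight deployment the abstract argues for.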

We gratefully acknowledge the support from ANP/PETROBRAS, Brazil (project N. 21721-6), CNPq (grants 310085/2020-9 and 310127/2020-3), CAPES (finance code 001) and C4AI-USP (FAPESP grant 2019/07665-4 and IBM Corporation).



Author information

Corresponding author

Correspondence to Marcel Rodrigues de Barros.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

de Barros, M.R., et al. (2023). Embracing Data Irregularities in Multivariate Time Series with Recurrent and Graph Neural Networks. In: Naldi, M.C., Bianchi, R.A.C. (eds.) Intelligent Systems. BRACIS 2023. Lecture Notes in Computer Science, vol. 14195. Springer, Cham. https://doi.org/10.1007/978-3-031-45368-7_1

  • DOI: https://doi.org/10.1007/978-3-031-45368-7_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-45367-0

  • Online ISBN: 978-3-031-45368-7

  • eBook Packages: Computer Science (R0)
