
A multi-graph convolutional network based wearable human activity recognition method using multi-sensors

Published in Applied Intelligence

Abstract

Wearable human activity recognition (WHAR) using multi-sensors is a promising research area in ubiquitous and wearable computing. Existing WHAR methods usually model the interactions among features learned from multi-sensor data with convolutional neural networks or fully connected networks, which may ignore the prior relationships among the sensors. In this paper, we propose a novel method, called MG-WHAR, which employs graphs to model the relationships among multi-sensors. Specifically, we construct three types of graphs: a body structure based graph, a sensor modality based graph, and a data pattern based graph. In each graph, the nodes represent sensors, and the edges are set according to the relationships among sensors. Using a multi-graph convolutional network, MG-WHAR conducts feature interaction by leveraging these relationships, which not only enhances model performance but also yields a model with fewer parameters. Compared to state-of-the-art WHAR methods, our method increases the weighted F1-score by 3.2% on the Opportunity dataset, 1.9% on the Realdisp dataset, and 2.6% on the DSADS dataset, while maintaining lower computational complexity.
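As a rough illustration of the idea described above, the sketch below (in PyTorch; not taken from the paper or its repository, and the layer name, the sum-based fusion, and the randomly generated example graphs are assumptions made purely for illustration) propagates per-sensor feature vectors over several sensor graphs, one per relationship type, and fuses the results into a single representation.

# A minimal sketch (assumed shapes and fusion choices, not the authors' implementation)
# of a multi-graph convolutional layer: per-sensor features are propagated over
# several sensor graphs and the per-graph results are fused by summation.
import torch
import torch.nn as nn


def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize an adjacency matrix with self-loops: D^-1/2 (A + I) D^-1/2."""
    adj = adj + torch.eye(adj.size(0))
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.pow(-0.5))
    return d_inv_sqrt @ adj @ d_inv_sqrt


class MultiGraphConv(nn.Module):
    """One graph-convolution step over K sensor graphs (hypothetical layer for illustration)."""

    def __init__(self, adjacencies, in_features: int, out_features: int):
        super().__init__()
        # One fixed, normalized adjacency per graph (e.g., body structure,
        # sensor modality, data pattern), each of shape (num_sensors, num_sensors).
        self.register_buffer(
            "adjs", torch.stack([normalize_adjacency(a) for a in adjacencies])
        )
        # One linear transform per graph.
        self.weights = nn.ModuleList(
            [nn.Linear(in_features, out_features, bias=False) for _ in adjacencies]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_sensors, in_features) per-sensor feature vectors.
        out = 0
        for adj, linear in zip(self.adjs, self.weights):
            out = out + adj @ linear(x)  # propagate features along each graph
        return torch.relu(out)


if __name__ == "__main__":
    num_sensors, in_dim, out_dim = 5, 16, 32
    # Three illustrative binary sensor graphs (random here, only for a shape check).
    graphs = [torch.randint(0, 2, (num_sensors, num_sensors)).float() for _ in range(3)]
    layer = MultiGraphConv(graphs, in_dim, out_dim)
    features = torch.randn(8, num_sensors, in_dim)  # batch of 8 sliding windows
    print(layer(features).shape)  # torch.Size([8, 5, 32])

In practice, the three adjacency matrices would be built from prior knowledge (sensor placement on the body, sensor modality, and similarity of data patterns), as the abstract describes; the random graphs above merely stand in for them.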


Notes

1. https://archive.ics.uci.edu/ml/datasets/opportunity+activity+recognition

2. https://archive.ics.uci.edu/ml/datasets/REALDISP+Activity+Recognition+Dataset

  3. https://archive.ics.uci.edu/ml/datasets/daily+and+sports+activities

  4. https://github.com/LuoYingSong/MG-WHAR


Funding

None

Author information


Corresponding author

Correspondence to Ling Chen.

Ethics declarations

Conflicts of interest

The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Chen, L., Luo, Y., Peng, L. et al. A multi-graph convolutional network based wearable human activity recognition method using multi-sensors. Appl Intell 53, 28169–28185 (2023). https://doi.org/10.1007/s10489-023-04997-4

