
Multi-model weighted voting method based on convolutional neural network for human activity recognition

  • 1232: Human-centric Multimedia Analysis
  • Published:
Multimedia Tools and Applications

Abstract

In recent years, human activity recognition (HAR) has been widely applied in medical rehabilitation, smart homes, and other fields. Recognition performance currently depends heavily on feature extraction and effective classification algorithms. On the one hand, traditional manual feature extraction and classification algorithms limit further improvement of HAR. On the other hand, recent deep learning techniques can process data and extract features automatically, but they suffer from poor feature quality and information loss. To address these problems, this paper proposes a new recognition method that uses only wearable sensor data. In the feature extraction stage, each axis of each sensor is extracted separately as one-dimensional data, and the information from all axes is integrated into a two-dimensional graph. Two deep convolutional neural network models are then designed and trained on the one-dimensional data and the two-dimensional graph, respectively. Finally, a weighted voting method produces the classification result. Experiments show that the average recognition accuracy of the proposed method is about 3% higher than that of other HAR deep neural network methods, which demonstrates its advantage in obtaining better recognition results with limited data.
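The three-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, the six-axis sensor layout, and the voting weights `w1`/`w2` are assumptions, and the two CNNs are replaced by placeholder class-probability outputs.

```python
# Minimal pure-Python sketch of the three stages described above.

# Assume one window of 8 samples from a 6-axis inertial sensor
# (3-axis accelerometer + 3-axis gyroscope); values are illustrative.
window = [[float(i + j) for j in range(6)] for i in range(8)]  # 8 samples x 6 axes

# Stage 1: extract each axis separately as a one-dimensional series.
axes_1d = [[row[a] for row in window] for a in range(6)]

# Stage 2: integrate all axes into one two-dimensional array (axes x time),
# which serves as the image-like input of the second CNN.
graph_2d = axes_1d

def weighted_vote(probs_1d, probs_2d, w1=0.5, w2=0.5):
    """Stage 3: weighted voting over the class probabilities of the two CNNs."""
    combined = [w1 * a + w2 * b for a, b in zip(probs_1d, probs_2d)]
    return combined.index(max(combined))

# Placeholder probabilities for three activity classes (a real system
# would take these from the trained 1-D and 2-D CNN models).
p1 = [0.6, 0.3, 0.1]  # 1-D (per-axis) model output
p2 = [0.2, 0.7, 0.1]  # 2-D (graph) model output
print(weighted_vote(p1, p2, w1=0.4, w2=0.6))  # -> 1
```

With the weights above, the combined scores are [0.36, 0.54, 0.10], so class 1 wins; shifting the weights toward either model changes how much each representation influences the final decision.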


Figures 1–8 appear in the full article.

Data availability

The datasets analysed during the current study are available from the following sources: (1) “USC-HAD: A daily activity dataset for ubiquitous activity recognition using wearable sensors,” https://dl.acm.org/doi/pdf/10.1145/2370216.2370438; (2) “A public domain dataset for human activity recognition using smartphones,” https://www.researchgate.net/publication/298412360_A_Public_Domain_Dataset_for_Human_Activity_Recognition_using_Smartphones; (3) “UTD-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor,” https://ieeexplore.ieee.org/document/7350781.

References

  1. Gupta S (2021) Deep learning based human activity recognition (HAR) using wearable sensor data. International Journal of Information Management Data Insights 1(2):100046

  2. Raziani S, Azimbagirad M (2022) Deep CNN hyperparameter optimization algorithms for sensor-based human activity recognition. Neuroscience Informatics 2(3):100078

  3. Zhang X (2021) Application of human motion recognition utilizing deep learning and smart wearable device in sports. Int J Syst Assur Eng Manag 12:835–843

  4. Host K, Ivašić-Kos M (2022) An overview of human action recognition in sports based on computer vision. Heliyon 8(6):e09633

  5. Raeis H, Kazemi M, Shirmohammadi S (2021) Human activity recognition with device-free sensors for well-being assessment in smart homes. IEEE Instrumentation & Measurement Magazine 24(6):46–57

  6. Azar SM, Atigh MG, Nickabadi A, Alahi A (2019) Convolutional relational machine for group activity recognition, in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 7884-7893

  7. Javed AR, Faheem R, Asim M, Baker T, Beg MO (2021) A smartphone sensors-based personalized human activity recognition system for sustainable smart cities. Sustainable Cities and Society 71:102970

  8. Bulling A, Blanke U, Schiele B (2014) A tutorial on human activity recognition using body-worn inertial sensors. ACM Computing Surveys 46(3):1–33

  9. Silva DF, Souza VMA, Batista GEAPA (2013) Time series classification using compression distance of recurrence plots, in IEEE 13th International Conference on Data Mining, pp. 687-696

  10. Zhang M, Sawchuk AA (2012) USC-HAD: A daily activity dataset for ubiquitous activity recognition using wearable sensors, in Proceedings of ACM International Conference on Ubiquitous Computing Workshop on Situation, Activity and Goal Awareness, 1036-1043

  11. Anguita D, Ghio A, Oneto L, Parra X, Reyes-Ortiz JL (2013) A public domain dataset for human activity recognition using smartphones, in 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 437-442

  12. Chen C, Jafari R, Kehtarnavaz N (2015) UTD-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor, in IEEE International Conference on Image Processing (ICIP), 168-172

  13. Wang J, Chen Y, Hao S et al (2019) Deep learning for sensor-based activity recognition: A survey. Pattern Recognition Letters 119:3–11

  14. Zappi P, Lombriser C, Stiefmeier T et al. (2008) Activity recognition from on-body sensors: Accuracy-power trade-off by dynamic sensor selection, in Lecture Notes in Computer Science, 17–33

  15. Maurer U, Smailagic A, Siewiorek DP et al. (2006) Activity recognition and monitoring using multiple sensors on different body positions, in International Workshop on Wearable and Implantable Body Sensor Networks, 113-116

  16. Catal C, Tufekci S, Pirmit E et al (2015) On the use of ensemble of classifiers for accelerometer-based activity recognition. Applied Soft Computing 37:1018–1022

  17. Lara ÓD, Pérez AJ, Labrador MA et al (2012) Centinela: A human activity recognition system based on acceleration and vital sign data. Pervasive and Mobile Computing 8(5):717–729

  18. Feng Z, Mo L, Li M (2015) A random forest-based ensemble method for activity recognition, in 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 5074-5077

  19. Wang A, Chen G, Yang J et al (2016) A comparative study on human activity recognition using inertial sensors in a smartphone. IEEE Sensors Journal 16(11):4566–4578

  20. Prossegger M, Bouchachia A (2014) Multi-resident activity recognition using incremental decision trees, in Adaptive and Intelligent Systems, 182–191

  21. Ronao CA, Cho S-B (2014) Human activity recognition using smartphone sensors with two-stage continuous hidden Markov models, in 10th International Conference on Natural Computation (ICNC), 681-686

  22. Chen Z, Zhang L, Cao Z, Guo J (2018) Distilling the knowledge from handcrafted features for human activity recognition. IEEE Transactions on Industrial Informatics 14(10):4334–4342

  23. Hammerla NY, Halloran S, Plötz T (2016) Deep, convolutional, and recurrent models for human activity recognition using wearables. Journal of Scientific Computing 61(2):454–476

  24. Jiang W, Yin Z (2015) Human activity recognition using wearable sensors by deep convolutional neural networks, in ACM International Conference on Multimedia (ACM), 1307-1310

  25. Yang P, Yang C, Lanfranchi V, Ciravegna F (2022) Activity graph based convolutional neural network for human activity recognition using acceleration and gyroscope data. IEEE Transactions on Industrial Informatics 18(10):6619–6630

  26. Tao D, Wen Y, Hong R (2016) Multicolumn bidirectional long short-term memory for mobile devices-based human activity recognition. IEEE Internet of Things Journal 3(6):1124–1134

  27. Ordóñez F, Roggen D (2016) Deep convolutional and LSTM recurrent neural networks for multimodal wearable activity recognition. Sensors 16(1):115

  28. Xu C, Chai D, He J et al (2019) Innohar: A deep neural network for complex human activity recognition. IEEE Access 7:9893–9902

  29. Chen Z, Jiang C, Xie L (2019) A novel ensemble elm for human activity recognition using smartphone sensors. IEEE Transactions on Industrial Informatics 15(5):2691–2699

  30. Chen Y, Xue Y (2015) A deep learning approach to human activity recognition based on single accelerometer, IEEE Intl Conf Syst Man Cybern, 1488-1492

  31. Jordao A, Nazare AC, Sena J et al. (2018) Human activity recognition based on wearable sensor data: A standardization of the state-of-the-art, CoRR, abs/1806.05226

  32. Reiss A, Hendeby G, Stricker D (2013) A competitive approach for human activity recognition on smartphones, in European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, 455-460

  33. Arshad M, Sabri MA, Ashraf F et al (2022) Hybrid machine learning techniques to detect real time human activity using UCI Dataset. EAI Endorsed Transactions on Internet of Things 7(26):170006

  34. Hsu Y-L, Lin S-L, Chou P-H et al. (2017) Application of nonparametric weighted feature extraction for an inertial-signal-based Human Activity Recognition System, International Conference on Applied System Innovation (ICASI), 1718-1720

  35. Xia K, Huang J, Wang H (2020) LSTM-CNN architecture for human activity recognition. IEEE Access, pp. 56855-56866

  36. Lyu L, He X, Law YW et al. (2017) Privacy-preserving collaborative deep learning with application to human activity recognition, Proceedings of the 2017 ACM on Conference on Information and Knowledge Management, 1219-1228

  37. Kwapisz JR, Weiss GM, Moore SA (2011) Activity recognition using cell phone accelerometers. ACM SIGKDD Explorations Newsletter 12(2):74–82

  38. Casale P, Pujol O, Radeva P (2011) Human activity recognition from accelerometer data using a wearable device, in Pattern Recognition and Image Analysis, 289–296

  39. Kim H, Kim M, Lee S, Choi YS (2012) An analysis of eating activities for automatic food type recognition, in Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, 1-5

Acknowledgments

This work was supported by the Guangzhou Science and Technology Project (Grant No. 201904010107) and the Guangdong Provincial Natural Science Foundation of China (Grant No. 2019A1515010793).

Author information

Corresponding author

Correspondence to Zhongliang Pan.

Ethics declarations

Conflicts of interest

The authors have no relevant financial interests in the manuscript and no other potential conflicts of interest to disclose.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Ouyang, K., Pan, Z. Multi-model weighted voting method based on convolutional neural network for human activity recognition. Multimed Tools Appl 83, 73305–73328 (2024). https://doi.org/10.1007/s11042-023-17500-5

  • Received:

  • Revised:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s11042-023-17500-5
