ABSTRACT
Sensor-based human activity monitoring and detection has become a field of intense research and development in recent years, owing to its wide-ranging applications across human endeavors. Human activity recognition integrates diverse sensors with machine learning algorithms to provide contextual information about activities, supporting health-related feedback and lifestyle changes. However, the sensors available for human activity recognition differ in their capabilities and in the types of activities for which they perform best. In addition, the diversity of human activities, and the varying ways in which individuals perform them, makes activities challenging to recognize. Determining the impact of these sensors on machine-learning-based human activity recognition is therefore of considerable value for activity monitoring and detection. The objective of this paper is to comprehensively evaluate the performance of single sensors and multi-sensor fusion for human activity recognition using accelerometer and gyroscope sensors. First, the performance of each sensor was extensively analyzed individually using seven classification algorithms. Second, we conducted a comprehensive experimental evaluation of sensor fusion, both for sensors attached at the same body location and for sensors attached at different locations. Extensive evaluation with 10-fold cross-validation shows that the highest average F-measures for a single sensor and for sensor fusion are 0.908 and 0.938, obtained with the Random Forest and Voting ensemble algorithms respectively. Furthermore, among fusions of heterogeneous sensors attached to different body locations, the fusion of chest and hip sensors achieves an average F-measure of 0.942 and a classification accuracy of 94.23% with the Random Forest algorithm. The outcome of our experimental evaluation demonstrates the significant impact of multi-sensor fusion on human activity monitoring and detection.
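The evaluation pipeline the abstract describes — single-sensor versus feature-level fusion, 10-fold cross-validation, F-measure scoring, Random Forest and a Voting ensemble — can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data; the feature dimensions, base classifiers in the ensemble, and data generation are placeholder assumptions, not the paper's actual dataset or classifier set.

```python
# Sketch: compare a single-sensor Random Forest against a Voting ensemble on
# fused accelerometer + gyroscope features, scored by macro F-measure with
# 10-fold cross-validation. Synthetic data stands in for real windowed features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
y = rng.integers(0, 3, size=n)                      # three activity classes
acc = rng.normal(y[:, None], 1.0, size=(n, 6))      # accelerometer features
gyr = rng.normal(y[:, None], 1.5, size=(n, 6))      # gyroscope features
fused = np.hstack([acc, gyr])                       # feature-level fusion

rf = RandomForestClassifier(n_estimators=100, random_state=0)
voting = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier())],
    voting="soft")

f1_single = cross_val_score(rf, acc, y, cv=10, scoring="f1_macro").mean()
f1_fused = cross_val_score(voting, fused, y, cv=10, scoring="f1_macro").mean()
print(f"single-sensor RF F-measure: {f1_single:.3f}")
print(f"fused Voting F-measure:     {f1_fused:.3f}")
```

In this sketch fusion simply concatenates the per-sensor feature vectors before classification; the paper additionally compares sensors worn at the same versus different body locations, which here would correspond to different synthetic feature groups.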
Index Terms: Analysis of Multi-Sensor Fusion for Mobile and Wearable Sensor Based Human Activity Recognition