
GT-WHAR: A Generic Graph-Based Temporal Framework for Wearable Human Activity Recognition With Multiple Sensors


Abstract:

Using wearable sensors to identify human activities has elicited significant interest in ubiquitous computing, where it facilitates a wide range of everyday applications. Recent research has employed hybrid models to better leverage sensor modal information together with temporal information, improving performance for wearable human activity recognition. Nevertheless, the lack of effective exploitation of human structural information and the limited capacity for cross-channel fusion remain major challenges. This study proposes a generic design, called GT-WHAR, that accommodates varying application scenarios and datasets while performing effective feature extraction and fusion. First, a novel and unified representation paradigm, Body-Sensing Graph Representation, is proposed to represent body movement as a graph set, incorporating structural information by considering the intrinsic connectivity of the skeletal structure. Second, the newly designed Body-Node Attention Graph Network employs graph neural networks to extract and fuse cross-channel information within the graph set. Finally, the graph network is embedded in the proposed Bidirectional Temporal Learning Network, which extracts temporal information in conjunction with the learned structural features. GT-WHAR outperformed state-of-the-art methods in extensive experiments on benchmark datasets, demonstrating its validity and efficacy. Moreover, we demonstrate the generality of the framework through multiple research questions and provide an in-depth investigation of various influential factors.
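The abstract describes a pipeline in which body-worn sensor channels are represented as nodes of a graph linked by skeletal connectivity, a graph attention network fuses cross-channel information at each time step, and the resulting features feed a bidirectional temporal network. The paper's implementation is not reproduced here; the following is only a minimal illustrative sketch of that kind of pipeline, assuming a PyTorch formulation. The class names, feature dimensions, chain-shaped adjacency, and the use of a bidirectional LSTM are all assumptions for illustration and are not taken from GT-WHAR itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """Attention-weighted message passing over body-sensor nodes (illustrative)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)
        self.attn = nn.Linear(2 * out_dim, 1)

    def forward(self, x, adj):
        # x: (batch, nodes, in_dim); adj: (nodes, nodes) skeletal connectivity mask
        h = self.proj(x)                                  # (B, N, D)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)         # (B, N, N, D)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)         # (B, N, N, D)
        scores = self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1)  # (B, N, N)
        scores = scores.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(scores, dim=-1)             # attend only to connected nodes
        return F.relu(torch.einsum('bij,bjd->bid', alpha, h))

class GraphTemporalSketch(nn.Module):
    """Per-step graph feature extraction followed by bidirectional temporal learning."""
    def __init__(self, num_nodes, in_dim, graph_dim, hidden_dim, num_classes, adj):
        super().__init__()
        self.register_buffer('adj', adj)
        self.gnn = GraphAttentionLayer(in_dim, graph_dim)
        self.bilstm = nn.LSTM(num_nodes * graph_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, time, nodes, in_dim) -- windowed sensor readings per body node
        b, t, n, d = x.shape
        h = self.gnn(x.reshape(b * t, n, d), self.adj)    # fuse channels at each step
        h = h.reshape(b, t, -1)                           # flatten node embeddings per step
        out, _ = self.bilstm(h)                           # (B, T, 2*hidden_dim)
        return self.classifier(out[:, -1])                # classify from the final step

# Hypothetical example: 5 body-worn sensor nodes connected along a chain-like skeleton.
adj = torch.eye(5)
for i in range(4):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
model = GraphTemporalSketch(num_nodes=5, in_dim=6, graph_dim=16,
                            hidden_dim=32, num_classes=8, adj=adj)
logits = model(torch.randn(2, 50, 5, 6))  # batch of 2 windows, 50 time steps each
print(logits.shape)                       # torch.Size([2, 8])
```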
Page(s): 3912 - 3924
Date of Publication: 02 April 2024
Electronic ISSN: 2471-285X
