Abstract:
Hand gesture recognition with wearables typically focuses on the characteristics of a single point on the hand, ignoring the diversity of motion information across the hand skeleton. As a result, current methods face two key challenges in managing multiple hand joints: displacement detection and motion representation. This leads us to define a spatio-temporal framework, named STGauntlet, that explicitly characterizes the hand motion context of spatio-temporal relations among multiple joints and detects hand gestures in real time. The framework introduces the Lie algebra to capture the inherent structural varieties of hand motions with spatio-temporal dependencies among multiple joints. In addition, we developed a hand-worn prototype with multiple motion sensors attached to various joints on the hand and collected 7,000 samples of seven gestures from nine subjects. Our in-lab study shows that STGauntlet detects gesture types together with their 3D tracking trajectories with 97.35% and 95.17% accuracy for subject-dependent and subject-independent recognition, respectively.
Date of Conference: 11-14 October 2020
Date Added to IEEE Xplore: 14 December 2020