
Egocentric articulated pose tracking for action recognition



Abstract:

Many studies on action recognition from the third-person viewpoint have shown that articulated human pose directly describes human motion and is invariant to view change. However, conventional algorithms for articulated pose estimation cannot handle egocentric images: they assume the whole figure appears in the image, whereas only a few body parts are visible in egocentric views. In this paper, we propose a novel method to estimate human pose for action recognition from egocentric RGB-D images. Our method extracts the pose by integrating hand detection, camera pose estimation, and time-series filtering under a body-shape constraint. Experiments show that joint positions are estimated well when the detection error of hands and arms is small. We also demonstrate that the skeleton feature improves action-recognition accuracy when the action contains unintended view changes.
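The abstract describes combining per-frame joint detections with time-series filtering under a body-shape constraint. A minimal sketch of that last stage is below; it is an illustration, not the paper's actual algorithm. The joint names, bone lengths, and the blending weight `alpha` are all hypothetical, and the simple exponential blend stands in for whatever time-series filter the authors use.

```python
import numpy as np

# Hypothetical two-bone chain: shoulder -> elbow -> wrist.
# BONE_LENGTHS encodes the body-shape constraint (assumed values, meters).
BONE_LENGTHS = {("shoulder", "elbow"): 0.30, ("elbow", "wrist"): 0.25}

def enforce_bone_lengths(joints):
    """Project each child joint onto a sphere around its parent so that
    every bone keeps its fixed length (the body-shape constraint)."""
    out = dict(joints)
    for (parent, child), length in BONE_LENGTHS.items():
        v = out[child] - out[parent]
        norm = np.linalg.norm(v)
        if norm > 1e-9:
            out[child] = out[parent] + v * (length / norm)
    return out

def filter_pose(prev, observed, alpha=0.6):
    """One time-series filtering step: blend the previous estimate with
    the noisy per-frame detection, then re-impose bone lengths."""
    blended = {j: alpha * observed[j] + (1 - alpha) * prev[j] for j in observed}
    return enforce_bone_lengths(blended)

# Example: a noisy detection is smoothed, and the result still satisfies
# the bone-length constraint exactly.
prev = {"shoulder": np.zeros(3),
        "elbow": np.array([0.30, 0.0, 0.0]),
        "wrist": np.array([0.55, 0.0, 0.0])}
observed = {"shoulder": np.zeros(3),
            "elbow": np.array([0.32, 0.05, 0.0]),
            "wrist": np.array([0.60, 0.10, 0.02])}
estimate = filter_pose(prev, observed)
```

The projection step is what distinguishes this from plain smoothing: even when hand or arm detections drift, the filtered joints remain a geometrically valid skeleton.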
Date of Conference: 18-22 May 2015
Date Added to IEEE Xplore: 13 July 2015
Electronic ISBN: 978-4-9011-2214-6
Conference Location: Tokyo, Japan

