
Transfer Learning of Deep Neural Network Human Pose Estimator by Domain-Specific Data for Video Motion Capturing


Abstract:

In this study, we construct sport-specific datasets with keypoint and bounding box annotations, and present a transfer learning approach for 2D pose estimation using these datasets. Recent progress in deep neural networks has advanced the technology for 2D human pose estimation in RGB camera images, including approaches such as OpenPose and HRNet. Video-based motion capture systems using such a 2D pose estimator reconstruct 3D joint positions from the estimates of multiple cameras. This type of video motion capture does not require markers and is suitable for measuring sport motions. In the datasets widely used to train 2D pose estimators, however, the samples of sport-specific images are insufficient for representing players’ skills in dynamic sport motions. A 2D pose estimator trained on such a dataset sometimes fails to detect a sport-specific human pose. The transfer learning in this study provides robust 2D pose estimation and 3D reconstruction of sport-specific motions such as a taekwondo kick motion.
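The abstract does not include implementation details, so the following is only a minimal sketch of the multi-camera reconstruction step it describes: triangulating a 3D joint position from per-camera 2D keypoint estimates (e.g., produced by OpenPose or HRNet) with the direct linear transform. The function names, array shapes, and the assumption of known calibrated projection matrices are illustrative and are not taken from the authors' system.

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Triangulate one 3D joint from its 2D detections in multiple
    calibrated cameras using the direct linear transform (DLT).

    proj_mats : list of (3, 4) camera projection matrices P = K [R | t]
    points_2d : list of (x, y) pixel coordinates, one per camera
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the homogeneous 3D point.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to (X, Y, Z)

def reconstruct_pose_3d(proj_mats, keypoints_2d):
    """Reconstruct all joints of one pose.

    keypoints_2d : (num_cameras, num_joints, 2) array of 2D estimates,
                   one row per camera view.
    """
    num_joints = keypoints_2d.shape[1]
    return np.array([
        triangulate_point(proj_mats, keypoints_2d[:, j, :])
        for j in range(num_joints)
    ])
```

In practice, systems of this kind typically weight or discard low-confidence 2D detections before triangulation, which is where a domain-adapted 2D estimator matters most for dynamic motions such as kicks.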
Date of Conference: 28-30 May 2022
Date Added to IEEE Xplore: 27 June 2022
Conference Location: Long Beach, CA, USA

