Abstract:
Over the past decades, fitness activities and extreme endurance events have been expanding throughout the world. The number of publicly available skeletal repositories and recognition/evaluation benchmarks has grown rapidly since Microsoft introduced the Kinect motion-sensing device. Kinect RGBD data has become a very useful representation of indoor scenes for solving activity/fitness recognition problems. Another sensor widely utilized in this area is the wearable inertial measurement unit (IMU). With numerous advanced sensors reaching mass adoption, this technology represents a possible approach to surpass current activity recognition and evaluation research solutions. Nevertheless, only a limited number of publicly available datasets capture depth camera, inertial sensor, and RGB image data at the same time. In this paper, we introduce NCTU-MFD (National Chiao Tung University Multisensor Fitness Dataset), a comprehensive, diverse multisensor dataset collected using a Kinect RGBD sensor, wearable inertial sensors, and web cameras. The dataset contains 47131 RGB images, 47131 depth images, and 100 CSV files comprising 47131 skeletal frames (25 joints each) collected from the Kinect sensor. In addition, our dataset contains acceleration and gyroscope data from the IMU sensors, as well as 94262 RGB images (47131 from each web camera). To demonstrate a possible use of our dataset, we conduct an experiment on the evaluation of depth maps.
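A minimal sketch of how the skeletal and IMU portions of such a dataset might be loaded is given below. The directory layout, file names, and CSV column names are illustrative assumptions, not taken from the paper; only the counts (25 Kinect joints, CSV storage, acceleration and gyroscope channels) come from the abstract.

import numpy as np
import pandas as pd

NUM_JOINTS = 25  # the Kinect sensor provides 25 skeletal joints per frame

def load_skeleton_csv(path):
    """Load one skeletal CSV into an (n_frames, 25, 3) array.

    Assumes each row is a frame and the first 75 columns hold x/y/z
    per joint, e.g. joint0_x, joint0_y, joint0_z, ... (hypothetical layout).
    """
    df = pd.read_csv(path)
    coords = df.to_numpy(dtype=np.float32)
    coords = coords[:, : NUM_JOINTS * 3]  # drop any trailing metadata columns
    return coords.reshape(-1, NUM_JOINTS, 3)

def load_imu_csv(path):
    """Load IMU acceleration and gyroscope readings from a CSV.

    Assumes columns named acc_x/acc_y/acc_z and gyro_x/gyro_y/gyro_z
    (hypothetical names; the actual schema is defined by the dataset).
    """
    df = pd.read_csv(path)
    acc = df[["acc_x", "acc_y", "acc_z"]].to_numpy(dtype=np.float32)
    gyro = df[["gyro_x", "gyro_y", "gyro_z"]].to_numpy(dtype=np.float32)
    return acc, gyro

if __name__ == "__main__":
    # Hypothetical paths used only to show the intended call pattern.
    skeleton = load_skeleton_csv("NCTU-MFD/skeleton/session_001.csv")
    acc, gyro = load_imu_csv("NCTU-MFD/imu/session_001.csv")
    print(skeleton.shape, acc.shape, gyro.shape)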
Date of Conference: 18-20 September 2019
Date Added to IEEE Xplore: 07 November 2019
Print on Demand (PoD) ISSN: 2576-8565