ABSTRACT
We present a new recording tool to capture synchronized video and skeletal data streams from low-cost sensors such as the Microsoft Kinect2 and LeapMotion. While other recording tools act as virtual playback devices for testing on-line real-time applications, we target multimedia data collection for off-line processing. Images are encoded in common video formats, and skeletal data as flat text tables. This approach enables long-duration recordings (e.g. over 30 minutes), and supports post-hoc mapping of the Kinect2 depth video to the color space if needed. By using common file formats, the data can be played back and analyzed on any other computer, without requiring sensor-specific SDKs to be installed. The project is released under a 3-clause BSD license, and consists of an extensible C++11 framework with recording support for the official Microsoft Kinect2 and LeapMotion APIs, a command-line interface, and a Matlab GUI to initiate, inspect, and load Kinect2 recordings.
Supplemental Material
SenseCap: Synchronized Data Collection with Microsoft Kinect2 and LeapMotion (see the accompanying README file).