Abstract
Gesture recognition for human-robot interaction is a prerequisite for many social robotic tasks. One of the main technical difficulties is hand tracking in crowded and dynamic environments; many existing methods have only been demonstrated in clutter-free settings.
This paper proposes a sensor-fusion-based hand tracking algorithm for crowded environments. By fusing depth and RGB information, it is shown to significantly improve the accuracy of existing hand detectors. The main novelties of the proposed method are: a) a Monte-Carlo RGB update process that reduces false positives; b) online skin-colour learning to cope with varying skin colour, clothing and illumination conditions; and c) an asynchronous update method that integrates depth and RGB information for real-time applications. Tracking performance is evaluated in a number of controlled scenarios as well as crowded environments. All datasets used in this work have been made publicly available.
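The abstract describes a Monte-Carlo (particle-filter style) fusion of asynchronously arriving depth and RGB measurements. As a rough illustration of that idea only, the following Python sketch shows one plausible structure for such an asynchronous update loop; the class and function names (HandParticleFilter, update_depth, update_rgb, skin_likelihood) and all parameter values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of an asynchronous particle-filter fusion loop.
# Each particle is a candidate 3-D hand position. Depth detections and RGB
# (skin-colour) evidence each re-weight the particles whenever their own
# sensor frame arrives, independently of the other sensor's frame rate.
class HandParticleFilter:
    def __init__(self, n_particles=200, init_pos=(0.0, 0.0, 1.0), spread=0.05):
        self.particles = np.random.normal(init_pos, spread, size=(n_particles, 3))
        self.weights = np.full(n_particles, 1.0 / n_particles)

    def predict(self, motion_std=0.02):
        # Random-walk motion model applied between sensor updates.
        self.particles += np.random.normal(0.0, motion_std, self.particles.shape)

    def update_depth(self, detection, meas_std=0.03):
        # Re-weight particles by their distance to a depth-based hand detection.
        d2 = np.sum((self.particles - np.asarray(detection)) ** 2, axis=1)
        self.weights *= np.exp(-0.5 * d2 / meas_std ** 2)
        self._normalise_and_resample()

    def update_rgb(self, skin_likelihood):
        # skin_likelihood(p) -> probability that the image region projected from
        # particle p looks skin-coloured under the current (online-learned) model.
        self.weights *= np.array([skin_likelihood(p) for p in self.particles])
        self._normalise_and_resample()

    def _normalise_and_resample(self):
        self.weights += 1e-12
        self.weights /= self.weights.sum()
        # Resample when the effective sample size collapses.
        if 1.0 / np.sum(self.weights ** 2) < 0.5 * len(self.weights):
            idx = np.random.choice(len(self.weights), len(self.weights), p=self.weights)
            self.particles = self.particles[idx]
            self.weights = np.full(len(self.weights), 1.0 / len(self.weights))

    def estimate(self):
        # Weighted mean of the particle set as the current hand position estimate.
        return np.average(self.particles, axis=0, weights=self.weights)
```

In a real system the depth and RGB callbacks would fire at their own sensor rates, each calling predict() followed by the corresponding update; this decoupling of the two measurement streams is what "asynchronous" refers to in the abstract.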
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
McKeague, S., Liu, J., Yang, G.-Z. (2013). An Asynchronous RGB-D Sensor Fusion Framework Using Monte-Carlo Methods for Hand Tracking on a Mobile Robot in Crowded Environments. In: Herrmann, G., Pearson, M.J., Lenz, A., Bremner, P., Spiers, A., Leonards, U. (eds) Social Robotics. ICSR 2013. Lecture Notes in Computer Science, vol. 8239. Springer, Cham. https://doi.org/10.1007/978-3-319-02675-6_49
DOI: https://doi.org/10.1007/978-3-319-02675-6_49
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-02674-9
Online ISBN: 978-3-319-02675-6