A Novel Real-Time Gesture Recognition Algorithm for Human-Robot Interaction on the UAV

  • Conference paper
Computer Vision Systems (ICVS 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10528)


Abstract

This paper presents a new real-time gesture recognition technology for Unmanned Aerial Vehicle (UAV) control. Unlike traditional robot control systems, which rely on pre-defined programs to steer the UAV, this system lets users design gestures online and direct the UAV to handle abrupt, urgent tasks with different gestures. The system is composed of three parts: an online personal feature training system, a gesture recognition system, and a UAV motion control system. In the first part, we collect and analyze user gestures, extract feature data, and train the recognition program in real time. In the second part, a multi-feature hierarchical filtering algorithm is applied to guarantee both the accuracy and the real-time processing speed of our gesture recognition method. In the last part, the recognition result is transmitted to the UAV through a data transmitter based on the MAVLink protocol, giving the user online control of the vehicle. Two extensive experiments confirm the effectiveness and efficiency of our method.
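
To make the pipeline concrete, the sketch below illustrates the general idea of a cheap-to-expensive hierarchical filter followed by MAVLink transmission of the mapped command. It is a minimal illustration under our own assumptions, not the authors' implementation: the shape features (aspect ratio, circularity, area), the per-stage tolerances, the gesture names, and the gesture-to-command table are all hypothetical, and the transmission step uses the pymavlink library in place of the paper's data transmitter.

import math
from pymavlink import mavutil  # MAVLink bindings for Python

# Hypothetical gesture set and gesture-to-command mapping (the paper does not
# list its gestures here, so these entries are placeholders).
GESTURE_TO_MAV_CMD = {
    "fist": mavutil.mavlink.MAV_CMD_NAV_LAND,               # land
    "open_palm": mavutil.mavlink.MAV_CMD_NAV_LOITER_UNLIM,  # hover in place
    "point_up": mavutil.mavlink.MAV_CMD_NAV_TAKEOFF,        # take off
}

def extract_features(area, bbox_w, bbox_h, perimeter):
    """Cheap shape features from a segmented hand region (illustrative choice)."""
    aspect = bbox_w / max(bbox_h, 1e-6)
    circularity = 4.0 * math.pi * area / max(perimeter ** 2, 1e-6)
    return {"aspect": aspect, "circularity": circularity, "area": area}

def hierarchical_filter(observed, templates):
    """Filter gesture templates stage by stage, one feature at a time, so the
    later comparisons only run on the candidates that survived earlier stages."""
    candidates = dict(templates)
    for name, tol in (("aspect", 0.25), ("circularity", 0.20), ("area", 0.30)):
        candidates = {
            gesture: feats for gesture, feats in candidates.items()
            if abs(feats[name] - observed[name]) / max(abs(feats[name]), 1e-6) <= tol
        }
        if len(candidates) <= 1:
            break
    return next(iter(candidates), None)  # surviving gesture label, or None

def send_gesture_command(conn, gesture):
    """Translate a recognized gesture into a MAVLink COMMAND_LONG message."""
    cmd = GESTURE_TO_MAV_CMD.get(gesture)
    if cmd is not None:
        conn.mav.command_long_send(
            conn.target_system, conn.target_component,
            cmd, 0,               # confirmation = 0
            0, 0, 0, 0, 0, 0, 0)  # params 1-7 unused in this sketch

if __name__ == "__main__":
    # Connection string is an assumption (e.g. a local SITL or telemetry radio).
    conn = mavutil.mavlink_connection("udpout:127.0.0.1:14550")
    conn.wait_heartbeat()  # learn target_system / target_component

    # Fake per-frame measurements of the segmented hand region, plus two
    # previously trained templates, just to exercise the filter.
    observed = extract_features(area=5200.0, bbox_w=80.0, bbox_h=95.0, perimeter=300.0)
    templates = {
        "fist": extract_features(5000.0, 78.0, 92.0, 290.0),
        "open_palm": extract_features(9000.0, 120.0, 130.0, 520.0),
    }
    gesture = hierarchical_filter(observed, templates)
    if gesture:
        send_gesture_command(conn, gesture)

The point of the staged filtering is that most non-matching templates are discarded by the cheapest comparisons, so the per-frame cost stays low enough for real-time use while the later, more selective checks preserve accuracy.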



Acknowledgements

This work is supported by the Program of “One Hundred Talented People” of the Chinese Academy of Sciences under Award No. Y3F11001 and by the National Natural Science Foundation of China under Award Nos. 61573338 and U1609210.

Author information

Corresponding author

Correspondence to Chunsheng Hua.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Chen, B., Hua, C., Han, J., He, Y. (2017). A Novel Real-Time Gesture Recognition Algorithm for Human-Robot Interaction on the UAV. In: Liu, M., Chen, H., Vincze, M. (eds) Computer Vision Systems. ICVS 2017. Lecture Notes in Computer Science, vol. 10528. Springer, Cham. https://doi.org/10.1007/978-3-319-68345-4_46


  • DOI: https://doi.org/10.1007/978-3-319-68345-4_46


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-68344-7

  • Online ISBN: 978-3-319-68345-4

  • eBook Packages: Computer Science; Computer Science (R0)
