DOI: 10.1145/2994310.2994348

How would you gesture navigate a drone?: a user-centered approach to control a drone

Published: 17 October 2016

Abstract

Gestural interaction with flying drones is on the rise; however, little work has been done to elicit gestural preferences directly from users. In this paper, we present an elicitation study to help realize user-defined gestures for drone navigation. We apply a user-centered approach in which we collected data from 25 participants performing gestural interactions for twelve drone actions, ten of which are navigational. Analysis of the 300 gestures collected from our participants reveals a user-defined set of suitable gestures for controlling a drone. We report results that can be used by software developers, engineers, or designers, including a taxonomy of the user-defined gestures, gestural agreement scores, time performances, and subjective ratings for each action. Finally, we discuss the gestural set with implementation insights and conclude with future directions.
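The gestural agreement scores the abstract mentions are typically computed with the standard metric from surface-gesture elicitation work (Wobbrock, Morris, and Wilson, CHI 2009): for each referent (drone action), participants' proposals are grouped by identical gesture, and the score is the sum of the squared fractions each group represents. The sketch below is a minimal illustration of that metric, not the paper's own code; the proposal strings are hypothetical.

```python
from collections import Counter

def agreement_score(gestures):
    """Agreement score for one referent:
    A = sum over identical-gesture groups of (group size / total proposals)^2.
    Ranges from 1/len(gestures) (all distinct) to 1.0 (full consensus)."""
    counts = Counter(gestures)
    total = len(gestures)
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical proposals from five participants for an "ascend" action:
proposals = ["palm-up raise", "palm-up raise", "palm-up raise",
             "point up", "lift arm"]
print(round(agreement_score(proposals), 2))  # 0.44
```

A score near 1.0 for an action suggests a single gesture can be assigned with confidence; low scores flag actions where a recognizer may need to accept several gesture variants.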



    Published In

    AcademicMindtrek '16: Proceedings of the 20th International Academic Mindtrek Conference
    October 2016
    483 pages
    ISBN:9781450343671
    DOI:10.1145/2994310

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. drone
    2. gesture
    3. interaction
    4. quadcopter
    5. study
    6. user-defined

    Qualifiers

    • Research-article

    Conference

    AcademicMindtrek'16
    AcademicMindtrek'16: Academic Mindtrek Conference 2016
    October 17 - 18, 2016
    Tampere, Finland

    Acceptance Rates

    Overall Acceptance Rate 110 of 207 submissions, 53%

    Article Metrics

    • Downloads (last 12 months): 97
    • Downloads (last 6 weeks): 7

    Reflects downloads up to 27 Jan 2025

    Cited By
    • (2024) Controlling the Rooms: How People Prefer Using Gestures to Control Their Smart Homes. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 10.1145/3613904.3642687, pages 1-18. Online publication date: 11-May-2024
    • (2024) Continuous Hand Gestures Detection and Recognition in Emergency Human-Robot Interaction Based on the Inertial Measurement Unit. IEEE Transactions on Instrumentation and Measurement, 10.1109/TIM.2024.3440381, vol. 73, pages 1-15. Online publication date: 2024
    • (2024) Using gesture and speech communication modalities for safe human-drone interaction in construction. Advanced Engineering Informatics, 10.1016/j.aei.2024.102827, vol. 62, article 102827. Online publication date: Oct-2024
    • (2024) Emotion Appropriateness in Human–Drone Interaction. International Journal of Social Robotics, 10.1007/s12369-023-01094-x, vol. 16, no. 3, pages 579-597. Online publication date: 22-Jan-2024
    • (2023) Age-Based Differences in Drone Control Gestures: An Exploratory Study. Proceedings of the 35th Australian Computer-Human Interaction Conference, 10.1145/3638380.3638401, pages 49-58. Online publication date: 2-Dec-2023
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys, 10.1145/3636458, vol. 56, no. 5, pages 1-55. Online publication date: 7-Dec-2023
    • (2023) Abacus Gestures. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 10.1145/3610898, vol. 7, no. 3, pages 1-30. Online publication date: 27-Sep-2023
    • (2023) Towards a Consensus Gesture Set: A Survey of Mid-Air Gestures in HCI for Maximized Agreement Across Domains. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 10.1145/3544548.3581420, pages 1-24. Online publication date: 19-Apr-2023
    • (2023) I Need a Third Arm! Eliciting Body-based Interactions with a Wearable Robotic Arm. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 10.1145/3544548.3581184, pages 1-15. Online publication date: 19-Apr-2023
    • (2023) Hybrid Face and Eye Gesture Tracking Algorithm for Tello EDU RoboMaster TT Quadrotor Drone. 2023 Innovations in Power and Advanced Computing Technologies (i-PACT), 10.1109/i-PACT58649.2023.10434449, pages 1-6. Online publication date: 8-Dec-2023
