
The use of gaze to control drones

Published: 26 March 2014

Abstract

This paper presents an experimental investigation of gaze-based control modes for unmanned aerial vehicles (UAVs or "drones"). Ten participants performed a simple flying task. We gathered empirical measures, including task completion time, and examined the user experience for difficulty, reliability, and fun. Four control modes were tested, with each mode applying a combination of x-y gaze movement and manual (keyboard) input to control speed (pitch), altitude, rotation (yaw), and drafting (roll). Participants had similar task completion times for all four control modes, but one combination was considered significantly more reliable than the others. We discuss design and performance issues for the gaze-plus-manual split of controls when drones are operated using gaze in conjunction with tablets, near-eye displays (glasses), or monitors.
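The gaze-plus-manual split described above can be illustrated with a small sketch. This is not the authors' implementation; the key names, dead-zone size, and axis mappings are assumptions chosen for illustration (one of the four tested combinations of gaze and keyboard input):

```python
# Hypothetical sketch of a gaze-plus-keyboard control split (NOT the paper's
# code): gaze x-y drives rotation (yaw) and speed (pitch), while keyboard
# keys drive altitude. All mappings and parameters are illustrative.

def gaze_to_command(gaze_x, gaze_y, keys, dead_zone=0.1):
    """Map normalized gaze offsets in [-1, 1] plus pressed keys to commands.

    gaze_x, gaze_y: gaze position relative to screen centre.
    keys: set of currently pressed keys (assumed 'up'/'down' for altitude).
    A dead zone around the centre suppresses jitter from fixation noise.
    """
    def apply_dead_zone(v):
        if abs(v) < dead_zone:
            return 0.0
        # Rescale so the output is continuous at the dead-zone boundary.
        sign = 1.0 if v > 0 else -1.0
        return sign * (abs(v) - dead_zone) / (1.0 - dead_zone)

    yaw = apply_dead_zone(gaze_x)    # look left/right -> rotate (yaw)
    pitch = apply_dead_zone(gaze_y)  # look up/down -> speed (pitch)
    altitude = (1.0 if 'up' in keys else 0.0) - (1.0 if 'down' in keys else 0.0)
    return {'yaw': yaw, 'pitch': pitch, 'altitude': altitude}
```

A dead zone of this kind is a common ingredient in gaze-based steering, since raw fixations are never perfectly still; the continuous rescaling avoids a sudden jump in command magnitude when the gaze leaves the dead zone.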



Published In

ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2014
394 pages
ISBN:9781450327510
DOI:10.1145/2578153

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. UAV
  2. augmented or mixed reality systems
  3. drones
  4. gaze input
  5. gaze interaction
  6. head-mounted displays
  7. mobility
  8. multimodality
  9. robotics
  10. video gaming

Qualifiers

  • Research-article

Funding Sources

  • Danish National Advanced Technology Foundation

Conference

ETRA '14
ETRA '14: Eye Tracking Research and Applications
March 26 - 28, 2014
Safety Harbor, Florida

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

Article Metrics

  • Downloads (last 12 months): 83
  • Downloads (last 6 weeks): 4
Reflects downloads up to 17 Feb 2025

Cited By

  • (2024) Drone System Remotely Controlled by Human Eyes: A Consideration of its Effectiveness When Remotely Controlling a Robot. Journal of Robotics and Mechatronics, 36(5), 1055-1064. DOI: 10.20965/jrm.2024.p1055
  • (2024) Continuous Hand Gestures Detection and Recognition in Emergency Human-Robot Interaction Based on the Inertial Measurement Unit. IEEE Transactions on Instrumentation and Measurement, 73, 1-15. DOI: 10.1109/TIM.2024.3440381
  • (2024) GazeRace: Revolutionizing Remote Piloting with Eye-Gaze Control. 2024 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 410-415. DOI: 10.1109/SMC54092.2024.10831987
  • (2024) Distributed Connectivity-Maintenance Control of a Team of Unmanned Aerial Vehicles Using Supervised Deep Learning. 2024 10th International Conference on Control, Automation and Robotics (ICCAR), 232-239. DOI: 10.1109/ICCAR61844.2024.10569813
  • (2024) Control of a quadrotor on a mobile device using machine learning-based monocular gaze tracking. Physica Scripta, 99(4), 045409. DOI: 10.1088/1402-4896/ad32f8
  • (2023) Evaluation of a Remote-Controlled Drone System for Bedridden Patients Using Their Eyes Based on Clinical Experiment. Technologies, 11(1), 15. DOI: 10.3390/technologies11010015
  • (2023) Gaze-Augmented Drone Navigation. Proceedings of the Augmented Humans International Conference 2023, 363-366. DOI: 10.1145/3582700.3583702
  • (2023) UAV Control Using Eye Gestures: Exploring the Skies Through Your Eyes. Proceedings of the Twenty-fourth International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, 447-452. DOI: 10.1145/3565287.3617619
  • (2023) A Gaze-based Bilateral Teleoperation Framework for a Team of Mobile Robots. 2023 IEEE International Conference on Robotics and Biomimetics (ROBIO), 1-6. DOI: 10.1109/ROBIO58561.2023.10354979
  • (2023) Gaze Controlled Underwater Remotely Operated Vehicle (ROV) to Improve Accessibility in Maritime Robotics. OCEANS 2023 - Limerick, 1-6. DOI: 10.1109/OCEANSLimerick52467.2023.10244706
