Orthographic Vision-based Interface with Motion-tracking System for Robot Arm Teleoperation: A Comparative Study

Published: 01 April 2020

Abstract

Robot teleoperation is crucial in many hazardous situations, such as handling radioactive materials, undersea exploration, and firefighting. Visual feedback is essential for increasing the operator's situation awareness and thus for teleoperating a robot accurately. In addition, the control interface is as important as the visual feedback for effective teleoperation. In this paper, we propose a simple and cost-effective orthographic visual interface for teleoperation that visualizes the remote environment and conveys depth information using only a single inexpensive webcam. We further describe a simple modification to the control interface (Leap Motion) that achieves a wider workspace and makes it more convenient for the user. To assess the merits of the proposed system, we compare the modified Leap Motion interface with traditional control modalities (i.e., joystick and keyboard) under both the proposed orthographic vision system and a traditional binocular vision system. We conduct a user study (N = 10) to evaluate the effectiveness of this approach for teleoperating a 6-DoF robot arm in a pick-and-place task. The results show that the combination of the Leap Motion interface with the orthographic vision system outperforms all other combinations.
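The abstract does not spell out the control mapping, but as an illustration of the kind of pipeline it describes, the Python sketch below maps a tracked palm position from a motion-tracking sensor to scaled end-effector targets for a 6-DoF arm, with a workspace gain of the sort the modified Leap Motion interface is meant to provide. This is a minimal sketch only: the helper callbacks read_palm_position and send_target_pose, the axis ordering, the gain, and the home pose are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch (not the authors' method): map a hand-tracking reading
    # to a scaled end-effector target for a 6-DoF arm. read_palm_position() and
    # send_target_pose() are hypothetical placeholders for the sensor read-out
    # and the robot command API.
    import time

    WORKSPACE_SCALE = 2.5          # assumed gain: small hand motions cover a larger robot workspace
    HOME = (0.40, 0.00, 0.30)      # assumed end-effector home position in metres (x, y, z)

    def hand_to_robot(palm_xyz_mm):
        """Convert a palm position in sensor millimetres to a robot target in metres."""
        x_mm, y_mm, z_mm = palm_xyz_mm
        # Re-order axes so forward/backward hand motion maps to robot x,
        # and apply the workspace gain around the home pose.
        dx = (-z_mm / 1000.0) * WORKSPACE_SCALE
        dy = (-x_mm / 1000.0) * WORKSPACE_SCALE
        dz = ((y_mm - 200.0) / 1000.0) * WORKSPACE_SCALE   # 200 mm assumed as neutral hover height
        return (HOME[0] + dx, HOME[1] + dy, HOME[2] + dz)

    def teleop_loop(read_palm_position, send_target_pose, rate_hz=30):
        """Stream scaled targets to the arm at a fixed rate."""
        period = 1.0 / rate_hz
        while True:
            palm = read_palm_position()   # latest palm position, or None if no hand is tracked
            if palm is not None:
                send_target_pose(hand_to_robot(palm))
            time.sleep(period)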

References

[1]
Jessie YC Chen, Ellen C Haas, and Michael J Barnes. 2007. Human performance issues and user interface design for teleoperated robots. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), Vol. 37, 6 (2007), 1231--1245.
[2]
David W Hainsworth. 2001. Teleoperation user interfaces for mining robotics. Autonomous Robots, Vol. 11, 1 (2001), 19--28.
[3]
Ayoung Hong, Heinrich H Bülthoff, and Hyoung Il Son. 2013. A visual and force feedback for multi-robot teleoperation in outdoor environments: A preliminary result. In Robotics and Automation (ICRA), 2013 IEEE International Conference on. IEEE, 1471--1478.
[4]
David Kent, Carl Saldanha, and Sonia Chernova. 2019. Leveraging depth data in remote robot teleoperation interfaces for general object manipulation. The International Journal of Robotics Research (2019), 0278364919888565.
[5]
Tomávs Kot and Petr Novák. 2018. Application of virtual reality in teleoperation of the military mobile robotic system TAROS. International Journal of Advanced Robotic Systems, Vol. 15, 1 (2018), 1729881417751545.
[6]
Corinna E Lathan and Michael Tracey. 2002. The effects of operator spatial perception and sensory feedback on human-robot teleoperation performance. Presence: Teleoperators & virtual environments, Vol. 11, 4 (2002), 368--377.
[7]
Guanyang Liu, Xuda Geng, Lingzhi Liu, and Yan Wang. 2018. Haptic Based Teleoperation with Master-slave Motion Mapping and Haptic Rendering for Space Exploration. Chinese Journal of Aeronautics (2018).
[8]
Jacques Marescaux, Joel Leroy, Michel Gagner, Francesco Rubino, Didier Mutter, Michel Vix, Steven E Butner, and Michelle K Smith. 2001. Transatlantic robot-assisted telesurgery. Nature, Vol. 413, 6854 (2001), 379.
[9]
Leap Motion. 2015. Leap motion controller. URl: https://www. leapmotion. com (2015).
[10]
Daniel Rakita, Bilge Mutlu, and Michael Gleicher. 2019. Remote Telemanipulation with Adapting Viewpoints in Visually Complex Environments. In Proceedings of the Robotics: Science and Systems .
[11]
A Reiter, V Nitsch, G Reinhart, and B F"arber. 2009. Effects of visual and haptic feedback on telepresent micro assembly tasks. In 3rd International Conference on Changeable, Agile, Reconfigurable and Virtual Production CARV .
[12]
Roque Saltaren, Rafael Aracil, Cesar Alvarez, Eugenio Yime, and Jose Maria Sabater. 2007. Field and service applications-Exploring deep sea by teleoperated robot-An Underwater Parallel Robot with High Navigation Capabilities. IEEE Robotics & Automation Magazine, Vol. 14, 3 (2007), 65--75.
[13]
Aaron Steinfeld, Terrence Fong, David Kaber, Michael Lewis, Jean Scholtz, Alan Schultz, and Michael Goodrich. 2006. Common metrics for human-robot interaction. In Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction. ACM, 33--40.
[14]
James S Tittle, Axel Roesler, and David D Woods. 2002. The remote perception problem. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 46. SAGE Publications Sage CA: Los Angeles, CA, 260--264.



Published In

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020
702 pages
ISBN:9781450370578
DOI:10.1145/3371382
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 April 2020


Author Tags

  1. control modalities
  2. interface design
  3. robot teleoperation

Qualifiers

  • Abstract

Conference

HRI '20

Acceptance Rates

Overall Acceptance Rate 192 of 519 submissions, 37%



