Legible Light Communications for Factory Robots

Published: 01 April 2020

Abstract

This work focuses on methods for improving the legibility of mobile robots in factories using lights. Implementation and evaluation took place at a robotics company that manufactures factory robots operating in shared human spaces. Three new sets of communicative lights were designed, integrated into the company's software stack, and tested on the robots against the industry-default lights that currently ship with them. All three newly designed light sets outperformed the industry default, and insights from this work have been integrated into software releases across North America.


Published In

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020
702 pages
ISBN: 9781450370578
DOI: 10.1145/3371382
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. expressive lighting
  2. factory robots
  3. human-robot interaction

Qualifiers

  • Abstract

Conference

HRI '20

Acceptance Rates

Overall Acceptance Rate 192 of 519 submissions, 37%



Cited By

  • (2024) Tutorial on Movement Notation: An Interdisciplinary Methodology for HRI to Reveal the Bodily Expression of Human Counterparts via Collecting Annotations from Dancers in a Shared Data Repository. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 1332-1334. DOI: 10.1145/3610978.3638169. Online publication date: 11-Mar-2024.
  • (2023) Investigating the Integration of Human-Like and Machine-Like Robot Behaviors in a Shared Elevator Scenario. Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 192-201. DOI: 10.1145/3568162.3576974. Online publication date: 13-Mar-2023.
  • (2023) How to Communicate Robot Motion Intent: A Scoping Review. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580857. Online publication date: 19-Apr-2023.
  • (2022) MoTiS Parameters for Expressive Multi-Robot Systems: Relative Motion, Timing, and Spacing. International Journal of Social Robotics, Vol. 14, 9 (2022), 1965-1993. DOI: 10.1007/s12369-022-00936-4. Online publication date: 17-Oct-2022.
  • (2021) Generating Legible and Glanceable Swarm Robot Motion through Trajectory, Collective Behavior, and Pre-attentive Processing Features. ACM Transactions on Human-Robot Interaction, Vol. 10, 3 (2021), 1-25. DOI: 10.1145/3442681. Online publication date: 11-Jul-2021.
