Mobile Service Robot State Revealing Through Expressive Lights: Formalism, Design, and Evaluation

Published in the International Journal of Social Robotics

Abstract

We consider mobile service robots that carry out tasks with, for, and around humans in their environments. Speech and on-screen displays are common mechanisms for autonomous robots to communicate with humans, but these modalities may fail for mobile robots because of spatio-temporal limitations. To help humans better understand a robot that moves about and performs tasks autonomously, we introduce the use of lights to reveal the robot's dynamic state. We contribute expressive lights as a primary modality for the robot to communicate useful state information to humans. Unlike other existing modalities, such lights are persistent, non-invasive, and visible at a distance. Programmable light arrays offer a very large animation space, which we make tractable by introducing a finite set of parametrized signal shapes that still preserves the needed design flexibility. We present a formalism for light animation control and an architecture that maps the robot state representation to the parametrized light animation space; the mapping generalizes to multiple light strips and even to other expression modalities. We demonstrate our approach on CoBot, a mobile multi-floor service robot, and evaluate its validity through several user studies. Our results show that carefully designed expressive lights on a mobile robot help humans better understand robot states and actions, and can have a desirable impact on collaborative human–robot behavior.
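To make the abstract's core idea concrete, here is a minimal sketch, assuming nothing beyond what is described above: a small set of parametrized signal shapes, plus a mapping from robot states to points in the resulting animation parameter space. All names, states, signal shapes, and parameter values below are hypothetical illustrations, not the paper's actual formalism or code.

```python
# A reader's sketch (not the authors' code) of a parametrized light
# animation space and a robot-state-to-animation mapping. All class
# names, signal shapes, states, and parameter values are hypothetical.

import math
from dataclasses import dataclass


@dataclass
class Animation:
    """One parametrized signal shape driving a light strip."""
    color: tuple       # (R, G, B), each channel in 0..255
    period_s: float    # cycle length in seconds
    shape: str         # "steady", "blink", or "fade"
    duty: float = 0.5  # on-fraction of each cycle (used by "blink")

    def intensity(self, t: float) -> float:
        """Brightness in [0, 1] at time t (seconds)."""
        phase = (t % self.period_s) / self.period_s
        if self.shape == "steady":
            return 1.0
        if self.shape == "blink":
            return 1.0 if phase < self.duty else 0.0
        # "fade": smooth sinusoidal rise and fall over one period
        return 0.5 * (1.0 - math.cos(2.0 * math.pi * phase))


# Each revealable robot state maps to one point in the animation
# parameter space (illustrative values only).
STATE_ANIMATIONS = {
    "navigating":       Animation((0, 80, 255), period_s=3.0, shape="fade"),
    "blocked":          Animation((255, 30, 0), period_s=0.6, shape="blink"),
    "waiting_for_help": Animation((255, 180, 0), period_s=1.5, shape="blink", duty=0.3),
    "task_done":        Animation((0, 255, 60), period_s=1.0, shape="steady"),
}


def frame(state: str, t: float) -> tuple:
    """Compute the (R, G, B) frame to push to the strip at time t."""
    anim = STATE_ANIMATIONS[state]
    k = anim.intensity(t)
    return tuple(int(k * c) for c in anim.color)
```

Under this sketch, a controller polls frame(state, t) each tick and pushes the resulting color to the strip; for example, frame("blocked", 0.2) returns full red while frame("blocked", 0.4) returns black, producing the fast blink that signals a blocked robot.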

Acknowledgements

This research was partially supported by the FCT INSIDE ERI grant, FLT Grant Number 2015-143894, NSF Grant Number IIS-1012733, and ONR Grant N00014-09-1-1031. The views and conclusions contained in this document are those of the authors only. The authors declare that they have no conflict of interest. We would like to thank Ana Paiva and Stephanie Rosenthal for their guidance on the user studies, as well as Joydeep Biswas and Richard Wang for their development and maintenance of the autonomous CoBot robots.

Author information

Correspondence to Kim Baraka.

Cite this article

Baraka, K., Veloso, M.M. Mobile Service Robot State Revealing Through Expressive Lights: Formalism, Design, and Evaluation. Int J of Soc Robotics 10, 65–92 (2018). https://doi.org/10.1007/s12369-017-0431-x
