
Does the Safety Demand Characteristic Influence Human-Robot Interaction?

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9979)

Abstract

While robots are increasingly common in real-world environments, many Human-Robot Interaction studies are conducted in laboratory settings. Evidence shows that laboratory settings can skew participants’ feelings of safety. This paper probes the consequences of this Safety Demand Characteristic for the field of Human-Robot Interaction. We collected survey and video data from 19 participants, each of whom received one of several consent forms describing a different level of risk for participating in the study. Participants were given a distractor task to prevent them from inferring the purpose of the study. We hypothesized that participants would feel less safe with the modified consent form and that their views of the robot would change depending on which consent form they received. The results showed that participants viewed features of the robot differently depending on the perceived risk of participating in the study, warranting further investigation.
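
The study compares participants’ ratings of the robot across consent-form conditions. As background only (the abstract does not state which statistical test the authors used), the following minimal Python sketch shows one common way to compare small-sample, ordinal survey ratings between two groups using a Mann-Whitney U test; all data, variable names, and group sizes below are invented for illustration.

    # Hypothetical example (not from the paper): comparing perceived-safety
    # ratings between two consent-form conditions.
    from scipy.stats import mannwhitneyu

    # Invented 1-5 Likert responses to a statement like "I felt safe around the robot".
    standard_consent = [5, 4, 5, 4, 5, 4, 4, 5, 5]     # lower-risk consent form
    modified_consent = [3, 4, 2, 3, 4, 3, 2, 4, 3, 3]  # higher-risk consent form

    # Mann-Whitney U: a rank-based test suited to small, ordinal samples.
    stat, p_value = mannwhitneyu(standard_consent, modified_consent,
                                 alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p_value:.3f}")

A rank-based test is sketched here because Likert-style safety ratings are ordinal and 19 participants split across conditions gives small per-group samples; a parametric t-test would assume more about the data than such ratings typically support.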



Acknowledgments

This material is based upon work supported by the National Aeronautics and Space Administration under Grant #NNX10AN23H issued through the Nevada Space Grant, the Office of Naval Research DURIP award #N00014-14-1-0776, the National Science Foundation #IIS-1528137, and the UNR NSF EPSCoR UROP Program #IIA-1301726. We appreciate the assistance of Mercedes Anderson, Gaetano Evangelista, and Nathan Yocum in administering the study.

Author information


Corresponding author

Correspondence to Jamie Poston.



Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Poston, J., Lucas, H., Carlson, Z., Feil-Seifer, D. (2016). Does the Safety Demand Characteristic Influence Human-Robot Interaction? In: Agah, A., Cabibihan, JJ., Howard, A., Salichs, M., He, H. (eds) Social Robotics. ICSR 2016. Lecture Notes in Computer Science, vol 9979. Springer, Cham. https://doi.org/10.1007/978-3-319-47437-3_83


  • DOI: https://doi.org/10.1007/978-3-319-47437-3_83

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-47436-6

  • Online ISBN: 978-3-319-47437-3

  • eBook Packages: Computer Science, Computer Science (R0)
