
How do People Expect Humanoids to Respond to Touch?

International Journal of Social Robotics

Abstract

With close interaction between humans and robots expected to become increasingly frequent in the near future, tactile interaction is receiving growing interest. Many advances have been made in the fields of tactile sensing and touch classification. A robot's reactions to being touched, however, are usually decided by its designers and tailored to a particular purpose, and very little investigation has been directed at the movements that ordinary people expect from a robot that is touched. This paper provides an initial step in this direction. Responses that people expect from a humanoid being touched were collected and then classified by automatically grouping similar responses, allowing distinct types of responses to be identified. How well this grouping matches common sense was then evaluated. The results showed a strong correlation between the automatic grouping and common sense, supporting the idea that the automatically identified types of responses correspond to a plausible classification of a robot's responses to touch.
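As a rough, purely illustrative sketch of the kind of procedure the abstract describes (automatically grouping similar responses), the Python snippet below clusters stand-in response data with an off-the-shelf hierarchical clustering routine. The feature representation, distance measure, and clustering settings are assumptions made for the example, not the authors' actual pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

# Hypothetical stand-in data: each collected response is treated as a short
# joint-angle trajectory, flattened into one feature vector per response.
rng = np.random.default_rng(0)
n_responses, n_joints, n_frames = 120, 20, 30
responses = rng.normal(size=(n_responses, n_joints * n_frames))

# Pairwise Euclidean distances between responses, followed by agglomerative
# (Ward) clustering; the resulting tree groups similar responses together,
# which is one way "distinct types of responses" could be identified.
distances = pdist(responses, metric="euclidean")
tree = linkage(distances, method="ward")
```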




Notes

  1. For the sake of completeness, we note that some touch sensors have their own plates (for example, sensor number 36 in Fig. 2), while others are placed at the corners of larger plates (for instance, sensors 70, 71 and 72).

  2. A clustering analysis based on the Cartesian representation can be found in the supplementary material.

  3. We note that this higher level of the grouping hierarchy was defined in a somewhat arbitrary manner by the authors, not by the algorithm itself; for the algorithm, the classification ends at the division into 17 clusters (see the sketch following these notes).

  4. Or the expert motion developer, as explained later.

  5. Here and in the following, we give the English translation of the actual wording, which was in Japanese.
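As a schematic illustration of note 3, the snippet below (continuing the earlier sketch's `tree` variable) cuts the hierarchy into 17 clusters and then applies a hand-written higher-level grouping. The specific assignment and group names are arbitrary placeholders chosen for illustration, not the grouping reported in the paper.

```python
from scipy.cluster.hierarchy import fcluster

# Cut the hierarchy produced above into (at most) 17 clusters, the level at
# which, per note 3, the algorithm's classification stops.
labels = fcluster(tree, t=17, criterion="maxclust")

# Hypothetical manual step: the higher-level grouping is defined by hand, so
# here each of the 17 cluster ids is simply mapped to a placeholder category.
higher_level = {cid: ("group A" if cid <= 9 else "group B") for cid in range(1, 18)}
groups = [higher_level[c] for c in labels]
print(groups[:10])
```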


Acknowledgments

The first author was supported by a JSPS Research Fellowship for Young Scientists. The second author was supported by a JSPS Grant-in-Aid for Scientific Research (Research Activity Start-up), Project Number 26880014.

Author information

Correspondence to Fransiska Basoeki.

Electronic supplementary material

Below are the links to the electronic supplementary material.

Supplementary material 1 (pdf 12571 KB)

Supplementary material 2 (mpg 17392 KB)

Supplementary material 3 (mpg 6302 KB)

About this article

Cite this article

Basoeki, F., DallaLibera, F. & Ishiguro, H. How do People Expect Humanoids to Respond to Touch?. Int J of Soc Robotics 7, 743–765 (2015). https://doi.org/10.1007/s12369-015-0318-7

