Walk the Talk: Gestures in Mobile Interaction

  • Conference paper
  • In: Social Robotics (ICSR 2017)
  • Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10652)

Abstract

This study aims at deriving navigation guidelines, and the corresponding analytic motion models, for a mobile interaction robot that moves together with a human partner. In particular, we address the impact of gestures on the coupled motion of this human-robot pair.

We posit that the robot needs to adjust its navigation in accordance with its gestures in a natural manner, mimicking human-human locomotion. To justify this suggestion, we first examine the motion patterns of real-world pedestrian dyads with respect to four affective components of interaction (i.e. gestures). Three benchmark variables are derived from the pedestrian trajectories, and their behavior is investigated under three conditions: (i) presence or absence of isolated gestures, (ii) varying number of simultaneously performed (i.e. concurring) gestures, and (iii) varying size of the environment.
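As an illustration of how such benchmark variables can be obtained from tracked trajectories, the sketch below computes three candidate quantities for a dyad: interpersonal distance, group speed, and the angle \(\phi \) between the relative-position vector and the walking direction (\(\phi \approx \pi /2\) corresponds to side-by-side walking, cf. the notes below). The choice of variables, the array layout, and the function name are illustrative assumptions, not the exact definitions used in the paper.

    import numpy as np

    def dyad_benchmarks(p_a, p_b, dt=0.1):
        """Illustrative benchmark variables for a pedestrian dyad.

        p_a, p_b : (T, 2) arrays of planar positions of the two pedestrians,
                   sampled every dt seconds (this layout is an assumption).
        Returns per-frame interpersonal distance, group speed, and the angle
        phi between the relative-position vector and the walking direction.
        """
        rel = p_b - p_a                        # relative position of B w.r.t. A
        dist = np.linalg.norm(rel, axis=1)     # interpersonal distance

        centre = 0.5 * (p_a + p_b)             # group centre
        vel = np.gradient(centre, dt, axis=0)  # group velocity (finite differences)
        speed = np.linalg.norm(vel, axis=1)

        # Angle between relative position and walking direction, folded to [0, pi];
        # values near pi/2 indicate side-by-side (abreast) walking.
        heading = np.arctan2(vel[:, 1], vel[:, 0])
        rel_ang = np.arctan2(rel[:, 1], rel[:, 0])
        phi = np.abs(np.angle(np.exp(1j * (rel_ang - heading))))
        return dist, speed, phi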

We observe empirically, and confirm quantitatively, that the benchmark variables differ significantly between the presence and absence of gestures, whereas no prominent variation exists with regard to the type of gesture or the number of concurring gestures. Moreover, the size of the environment is shown to be a crucial factor in the sustainability of the group structure.

Subsequently, we propose analytic models to represent these behavioral variations and show that the models reflect the observed distinctions with significant accuracy. Finally, we propose an implementation scheme for integrating the analytic models into practical applications. Our results have the potential of serving as navigation guidelines for the robot, providing a more natural interaction experience for the human counterpart of a robot-pedestrian group on the move.
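As a rough illustration of how gesture-conditioned models could feed into robot navigation (the paper's own implementation scheme may differ), the snippet below looks up target values of interpersonal distance and \(\phi \) for the current gesture state and converts them into a desired robot position relative to the human partner. The parameter table and its numeric values are placeholders, not results from the paper.

    import numpy as np

    # Hypothetical gesture-conditioned targets; the actual fitted values and
    # distribution families are reported in the paper, these are placeholders.
    TARGETS = {
        False: {"dist": 0.75, "phi": np.pi / 2},  # no gesture being performed
        True:  {"dist": 0.85, "phi": np.pi / 2},  # at least one gesture observed
    }

    def desired_robot_position(partner_pos, partner_heading, gesturing):
        """Target position for the robot, relative to the walking partner.

        Keeps the model's preferred interpersonal distance and places the
        robot at angle phi from the partner's walking direction; phi near
        pi/2 keeps the pair walking side by side.
        """
        t = TARGETS[bool(gesturing)]
        ang = partner_heading + t["phi"]
        return partner_pos + t["dist"] * np.array([np.cos(ang), np.sin(ang)])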


Notes

  1. If a dyad performs more than one gesture (concurring gestures), it is assigned multiple labels.

  2. For the ANOVA relating \(\phi \), we use only the values \(\phi \in [0, \pi /2]\) so as to highlight differences in spread between distributions that share the same average value \(\approx \pi /2\) (see the sketch after these notes).

  3. There was no dyad which performed all four gestures at once.

  4. Note that in modeling \(\phi \), the distinguishing effect is carried only by \(\kappa \), whereas \(\mu \) is always around \(\pi /2\), as expected.
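A minimal sketch of the statistical steps mentioned in notes 2 and 4, assuming the samples of \(\phi \) are grouped by condition and that the circular model is a von Mises distribution with mean \(\mu \) and concentration \(\kappa \) (an assumption consistent with the notation, not stated here): restrict \(\phi \) to \([0, \pi /2]\) before the one-way ANOVA, then fit the model to recover \(\kappa \) and \(\mu \). scipy is used here as a stand-in for whatever tooling was actually employed, and the group labels are placeholders.

    import numpy as np
    from scipy import stats

    def compare_phi(groups):
        """One-way ANOVA on phi restricted to [0, pi/2] (cf. note 2).

        groups : dict mapping a condition label (e.g. gesture present/absent)
                 to an array of phi samples in radians.
        Returns the F statistic and p-value.
        """
        restricted = [g[(g >= 0) & (g <= np.pi / 2)] for g in groups.values()]
        return stats.f_oneway(*restricted)

    def fit_phi_model(phi):
        """Fit a von Mises distribution to phi; returns (kappa, mu) (cf. note 4).

        mu is expected to stay near pi/2, while kappa (the concentration)
        carries the condition-dependent differences.
        """
        kappa, mu, _ = stats.vonmises.fit(phi, fscale=1)
        return kappa, mu

For example, compare_phi({"gesture": phi_g, "no gesture": phi_n}) tests whether the two conditions differ, while fit_phi_model(phi_g) summarizes one condition by its fitted parameters.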


Acknowledgments

This study was supported by JSPS KAKENHI Grant Numbers 15H05322 and 16K12505.

Author information


Corresponding author

Correspondence to Zeynep Yücel.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Yücel, Z., Zanlungo, F., Shiomi, M. (2017). Walk the Talk: Gestures in Mobile Interaction. In: Kheddar, A., et al. Social Robotics. ICSR 2017. Lecture Notes in Computer Science, vol. 10652. Springer, Cham. https://doi.org/10.1007/978-3-319-70022-9_22

  • DOI: https://doi.org/10.1007/978-3-319-70022-9_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-70021-2

  • Online ISBN: 978-3-319-70022-9

  • eBook Packages: Computer Science, Computer Science (R0)
