
Effects of Gaze and Arm Motion Kinesics on a Humanoid's Perceived Confidence, Eagerness to Learn, and Attention to the Task in a Teaching Scenario

Published: 08 March 2021

Abstract

When human students practise new skills with a teacher, they often display nonverbal behaviours (e.g., head and limb movements, gaze, etc.) to communicate their level of understanding and to express their interest in the task. Similarly, a student robot's capability to provide human teachers with social signals expressing its internal state might improve learning outcomes, and could also lead to more successful social interactions between intelligent robots and human teachers. However, to design successful nonverbal communication for a robot, we first need to understand how human teachers interpret such nonverbal cues when watching a trainee robot practising a task. Therefore, in this paper, we study the effects of different gaze behaviours, as well as of manipulating the speed and smoothness of arm movement, on human teachers' perception of a robot's (a) confidence, (b) eagerness to learn, and (c) attention to the task. In an online experiment, we asked 167 participants (as teachers) to rate the behaviours of a trainee robot in the context of learning a physical task. The results suggest that splitting the robot's gaze between the teacher and the task not only affects the perceived attention, but can also make the robot appear more eager to learn. Furthermore, perceptions of all three attributes were systematically affected by varying parameters of the robot's arm movement trajectory while performing task actions.
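The abstract describes manipulating the speed and smoothness of the robot's arm movement trajectory, but the paper's actual trajectory generator is not given in this excerpt. As a purely illustrative sketch, the Python snippet below shows one way such conditions could be parameterized: movement duration stands in for the speed condition, and added per-sample jitter degrades smoothness. The function names and the `jitter_std` parameter are assumptions introduced here, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' implementation): one way to
# parameterize the speed and smoothness of a point-to-point arm movement.
import numpy as np

def minimum_jerk(x0, xf, duration, dt=0.01):
    """Smooth joint-space reach; a longer `duration` yields a slower movement."""
    t = np.arange(0.0, duration + dt, dt)
    tau = np.clip(t / duration, 0.0, 1.0)          # normalized time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5     # classic minimum-jerk profile
    return x0 + (xf - x0) * s

def degrade_smoothness(trajectory, jitter_std, seed=None):
    """Add per-sample perturbations to create a visibly 'jerkier' condition.

    `jitter_std` is an assumed knob, not a parameter taken from the paper.
    """
    rng = np.random.default_rng(seed)
    return trajectory + rng.normal(0.0, jitter_std, size=trajectory.shape)

# Example conditions for a 0.0 -> 0.5 rad joint reach:
fast_smooth = minimum_jerk(0.0, 0.5, duration=1.0)                    # fast, smooth
slow_jerky = degrade_smoothness(minimum_jerk(0.0, 0.5, duration=3.0),
                                jitter_std=0.01, seed=0)              # slow, jerky
```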





        Published In

        HRI '21: Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction
        March 2021
        425 pages
ISBN: 9781450382892
DOI: 10.1145/3434073
General Chairs: Cindy Bethel, Ana Paiva
Program Chairs: Elizabeth Broadbent, David Feil-Seifer, Daniel Szafir

        Publisher

        Association for Computing Machinery

        New York, NY, United States



        Author Tags

        1. gaze
        2. kinesics
        3. nonverbal behaviour
        4. perceived robot attributes

        Qualifiers

        • Research-article

        Funding Sources

        • Canada 150 Research Chairs Program

        Conference

        HRI '21

        Acceptance Rates

        Overall Acceptance Rate 268 of 1,124 submissions, 24%



        Cited By

        • (2025) Continual Learning Through Human-Robot Interaction: Human Perceptions of a Continual Learning Robot in Repeated Interactions. International Journal of Social Robotics. https://doi.org/10.1007/s12369-025-01214-9. Online publication date: 5-Feb-2025.
        • (2024) "Text + Eye" on Autonomous Taxi to Provide Geospatial Instructions to Passenger. Proceedings of the 12th International Conference on Human-Agent Interaction, 429-431. https://doi.org/10.1145/3687272.3690906. Online publication date: 24-Nov-2024.
        • (2024) FARPLS: A Feature-Augmented Robot Trajectory Preference Labeling System to Assist Human Labelers' Preference Elicitation. Proceedings of the 29th International Conference on Intelligent User Interfaces, 344-369. https://doi.org/10.1145/3640543.3645145. Online publication date: 18-Mar-2024.
        • (2023) How Do We Perceive Our Trainee Robots? Exploring the Impact of Robot Errors and Appearance When Performing Domestic Physical Tasks on Teachers' Trust and Evaluations. ACM Transactions on Human-Robot Interaction, Vol. 12, 3, 1-41. https://doi.org/10.1145/3582516. Online publication date: 5-May-2023.
        • (2022) A Novel Architectural Method for Producing Dynamic Gaze Behavior in Human-Robot Interactions. Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, 383-392. https://doi.org/10.5555/3523760.3523812. Online publication date: 7-Mar-2022.
        • (2022) Comparing Two Safe Distance Maintenance Algorithms for a Gaze-Controlled HRI Involving Users with SSMI. ACM Transactions on Accessible Computing, Vol. 15, 3, 1-23. https://doi.org/10.1145/3530822. Online publication date: 8-Jul-2022.
        • (2022) A Support Worker Perspective on Use of New Technologies by People with Intellectual Disabilities. ACM Transactions on Accessible Computing, Vol. 15, 3, 1-21. https://doi.org/10.1145/3523058. Online publication date: 8-Jul-2022.
        • (2022) "Every Website Is a Puzzle!": Facilitating Access to Common Website Features for People with Visual Impairments. ACM Transactions on Accessible Computing, Vol. 15, 3, 1-35. https://doi.org/10.1145/3519032. Online publication date: 8-Jul-2022.
        • (2022) A Participatory Design Approach to Creating Echolocation-Enabled Virtual Environments. ACM Transactions on Accessible Computing, Vol. 15, 3, 1-28. https://doi.org/10.1145/3516448. Online publication date: 8-Jul-2022.
        • (2022) Tactile Materials in Practice: Understanding the Experiences of Teachers of the Visually Impaired. ACM Transactions on Accessible Computing, Vol. 15, 3, 1-34. https://doi.org/10.1145/3508364. Online publication date: 8-Jul-2022.
