Research Article
DOI: 10.1145/3483845.3483852

Interactive Intention Prediction Model for Humanoid Robot Based on Visual Features

Published: 22 October 2021

Abstract

Humanoid robots need human-like perception of the social environment in order to interact with people intelligently. To improve the interaction capability of humanoid robots in complex and changing social environments, an interactive intention prediction model (IIPM) is proposed that quantitatively predicts the intensity of a person's interactive intention from three visual features observed in the actual social environment: face orientation, social distance, and facial expression. Based on this model, a humanoid robot can make autonomous decisions and reasonably select which person to engage for an interactive task. Finally, the prediction accuracy of the IIPM is validated in single-person and multi-person experiments, providing an effective and accurate solution for natural human-robot interaction (HRI).
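As a rough illustration of how such a model might fuse the three cues into a single intention-intensity score, here is a minimal sketch in Python. The weighted-sum fusion, the per-cue normalizations, and the specific weights are all assumptions for demonstration (the paper's author tags mention weight distribution, and it cites the analytic hierarchy process, which suggests some weighted combination); the authors' actual formulation may differ.

```python
import math
from dataclasses import dataclass

@dataclass
class VisualCues:
    """Per-person visual features, as the abstract describes them."""
    face_yaw_deg: float      # face orientation relative to the robot (0 = facing it)
    distance_m: float        # social distance between person and robot, in meters
    expression_score: float  # positivity of the facial expression, in [0, 1]

# Hypothetical weights for fusing the three cues; the paper derives its own
# weight distribution (possibly via an AHP-style method, which it cites).
W_FACE, W_DIST, W_EXPR = 0.4, 0.35, 0.25

def intention_intensity(c: VisualCues) -> float:
    """Fuse the three normalized cues into an intensity score in [0, 1]."""
    # A face turned toward the robot scores high; 90 degrees or more scores 0.
    face = math.cos(math.radians(min(abs(c.face_yaw_deg), 90.0)))
    # Closer people score higher; exponential decay over roughly the
    # personal/social distance range of Hall's proxemics.
    dist = math.exp(-c.distance_m / 1.2)
    # Clamp the expression score into [0, 1].
    expr = min(max(c.expression_score, 0.0), 1.0)
    return W_FACE * face + W_DIST * dist + W_EXPR * expr

# Multi-person case: the robot would engage the person with the highest score.
people = [VisualCues(10.0, 1.0, 0.8), VisualCues(75.0, 3.5, 0.4)]
best = max(people, key=intention_intensity)
print(f"engage person with score {intention_intensity(best):.2f}")
```

In a multi-person scene, ranking candidates by this scalar score is what lets the robot select one interaction partner autonomously, as the abstract describes.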


Cited By

  • A method based on interpretable machine learning for recognizing the intensity of human engagement intention. Scientific Reports 13:1 (2023). Published online 13 Feb 2023. https://doi.org/10.1038/s41598-023-29661-2
  • Human engagement intention intensity recognition method based on two states fusion fuzzy inference system. Intelligent Service Robotics 16:3 (2023), 307-322. Published online 26 May 2023. https://doi.org/10.1007/s11370-023-00464-8


Published In

CCRIS '21: Proceedings of the 2021 2nd International Conference on Control, Robotics and Intelligent System
August 2021, 278 pages
ISBN: 9781450390453
DOI: 10.1145/3483845

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Human-robot interaction
  2. Interactive intention prediction
  3. Robot vision
  4. Weight distribution

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

CCRIS'21

Article Metrics

  • Downloads (last 12 months): 21
  • Downloads (last 6 weeks): 1

Reflects downloads up to 16 Feb 2025.
