DOI: 10.1145/2522848.2522891

Video analysis of approach-avoidance behaviors of teenagers speaking with virtual agents

Published: 09 December 2013

Abstract

Analysis of non-verbal behaviors in HCI helps us understand how individuals apprehend and adapt to different interaction situations. This is particularly relevant for tasks such as speaking in a foreign language, which is known to elicit anxiety. It is even more relevant for young users, for whom negative pedagogical feedback may strongly undermine their motivation to learn.
In this paper, we consider the approach-avoidance behaviors of teenagers speaking with virtual agents while using an e-learning platform for learning English. We designed an algorithm for processing webcam video of these teenagers outside laboratory conditions (e.g., an ordinary collective classroom in a secondary school). The algorithm processes the user's video stream and computes the inter-ocular distance. Users' anxiety is also collected with questionnaires.
Results show that the inter-ocular distance discriminates between approach and avoidance behaviors of teenagers reacting to a positive or negative stimulus. This simple metric, obtained via video processing, detects approach behavior in response to a positive stimulus and avoidance behavior in response to a negative stimulus. Furthermore, we observed that these automatically detected approach-avoidance behaviors are correlated with anxiety.
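
The abstract does not describe the detection pipeline in detail, so the following is only a minimal sketch of how such a measure could be computed, not the authors' implementation. It assumes OpenCV Haar cascades for face and eye detection; the cascade files, the smoothing factor, and the ±10% thresholds are illustrative assumptions. The underlying cue is the one the paper exploits: the inter-ocular distance in pixels grows as the face approaches the webcam and shrinks as it retreats.

```python
# Minimal sketch (not the authors' implementation): estimate the inter-ocular
# distance from a webcam stream with OpenCV Haar cascades and flag approach /
# avoidance relative to a neutral baseline. Detector choice, smoothing factor,
# and thresholds are assumptions made for illustration only.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")


def inter_ocular_distance(gray):
    """Return the pixel distance between the two detected eye centres, or None."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])      # largest face
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w],
                                        scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None
    e1, e2 = sorted(eyes, key=lambda e: e[2] * e[3], reverse=True)[:2]
    c1 = (e1[0] + e1[2] / 2.0, e1[1] + e1[3] / 2.0)          # eye centres
    c2 = (e2[0] + e2[2] / 2.0, e2[1] + e2[3] / 2.0)
    return ((c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2) ** 0.5


cap = cv2.VideoCapture(0)          # default webcam
baseline = smoothed = None
alpha = 0.2                        # exponential smoothing factor (assumption)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    d = inter_ocular_distance(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if d is None:
        continue
    smoothed = d if smoothed is None else alpha * d + (1 - alpha) * smoothed
    if baseline is None:
        baseline = smoothed        # first stable reading taken as neutral pose
    # The distance grows when the head moves toward the camera (approach) and
    # shrinks when it moves away (avoidance); the 10% margin is arbitrary.
    if smoothed > 1.10 * baseline:
        print("approach")
    elif smoothed < 0.90 * baseline:
        print("avoidance")

cap.release()
```

In a real classroom setting, a facial-landmark detector or a dedicated eye tracker would likely be more robust to head pose and lighting than Haar cascades, but the distance-based decision logic would be the same.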


Cited By

  • (2023) AMBY: A development environment for youth to create conversational agents. International Journal of Child-Computer Interaction, 38, 100618. https://doi.org/10.1016/j.ijcci.2023.100618 (Dec 2023)
  • (2022) The Last Decade of HCI Research on Children and Voice-based Conversational Agents. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-19. https://doi.org/10.1145/3491102.3502016 (29 Apr 2022)
  • (2015) Gestural and Postural Reactions to Stressful Event. Proceedings of the 2015 International Conference on Affective Computing and Intelligent Interaction (ACII), 988-992. https://doi.org/10.1109/ACII.2015.7344696 (21 Sep 2015)
  • (2013) Evaluation of vision-based real-time measures for emotions discrimination under uncontrolled conditions. Proceedings of the 2013 Emotion Recognition in the Wild Challenge and Workshop, 17-22. https://doi.org/10.1145/2531923.2531925 (9 Dec 2013)


Published In

ICMI '13: Proceedings of the 15th ACM on International Conference on Multimodal Interaction
December 2013
630 pages
ISBN: 9781450321297
DOI: 10.1145/2522848
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 December 2013


Author Tags

  1. anxiety
  2. approach-avoidance behavior
  3. e-learning platform
  4. inter-ocular distance
  5. video processing
  6. virtual agent interaction

Qualifiers

  • Poster

Conference

ICMI '13

Acceptance Rates

ICMI '13 Paper Acceptance Rate: 49 of 133 submissions, 37%
Overall Acceptance Rate: 453 of 1,080 submissions, 42%
