DOI: 10.1145/3640471.3680234
MobileHCI Conference Proceedings · Extended Abstract

EmoFoot: Can Your Foot Tell How You Feel when Playing Virtual Reality Games?

Published: 21 September 2024

Abstract

Understanding players’ feelings in Virtual Reality (VR) games is vital for enhancing engagement in human-computer interaction. Traditional methods for assessing emotions and player experience, both subjective and objective, often fall short of capturing players’ comprehensive and nuanced experiences. This preliminary study introduces EmoFoot, a novel approach that leverages foot pressure sensors to decode player feelings during VR gameplay. We characterise diverse patterns of VR game experience using subjective reports covering immersion, competence, negative and positive affect, flow, tension, challenge, and engagement. By integrating smart insoles, our research investigates the potential of foot pressure data to identify valence and arousal levels, using machine learning models to discover how players’ feet can reveal their emotions. EmoFoot aims to introduce a seamless and unobtrusive method for monitoring player experience, contributing to immersive technology by deepening our understanding of objective indicators of player emotions and improving the overall gaming experience.
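The abstract describes classifying valence/arousal from foot-pressure features with machine learning, but does not give the pipeline. As a minimal illustration of the general idea only, the sketch below extracts two toy features (mean pressure and variability) from a window of insole readings and assigns a high/low arousal label with a nearest-centroid rule; the feature choices, centroid values, and sample data are hypothetical assumptions, not the paper’s actual method.

```python
# Hypothetical sketch: labelling one window of insole pressure samples as
# high vs. low arousal via a nearest-centroid rule. All numbers below are
# illustrative assumptions, NOT the EmoFoot pipeline from the paper.
from statistics import mean, stdev

def extract_features(window):
    """Summarise a window of pressure samples (arbitrary units)."""
    return (mean(window), stdev(window))

def nearest_centroid(features, centroids):
    """Return the label whose centroid is closest in feature space."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))

# Toy "trained" centroids: calmer play assumed steadier and lighter.
centroids = {
    "low_arousal":  (40.0, 2.0),   # (mean pressure, variability)
    "high_arousal": (55.0, 8.0),
}

window = [54, 57, 49, 62, 58, 51, 60, 56]   # one synthetic pressure window
label = nearest_centroid(extract_features(window), centroids)
print(label)  # -> high_arousal
```

In practice a study like this would train the classifier (e.g. on SMOTE-balanced windows, per the cited methods literature) rather than hand-pick centroids; the fixed centroids here only make the decision rule visible.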



Published In

MobileHCI '24 Adjunct: Adjunct Proceedings of the 26th International Conference on Mobile Human-Computer Interaction
September 2024 · 252 pages
ISBN: 9798400705069
DOI: 10.1145/3640471

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. Data-driven Methods
    2. Emotions
    3. Foot Wearable
    4. Virtual Reality Games

    Qualifiers

    • Extended-abstract
    • Research
    • Refereed limited

    Conference

    MobileHCI '24
MobileHCI '24: 26th International Conference on Mobile Human-Computer Interaction
September 30 - October 3, 2024
Melbourne, VIC, Australia

    Acceptance Rates

    Overall Acceptance Rate 202 of 906 submissions, 22%
