
Design of Emotion-Driven Game Interaction Using Biosignals

  • Conference paper
  • HCI in Games (HCII 2022)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13334)

Abstract

Video games can evoke a wide range of emotions in players through multiple modalities. On a broader scale, however, human emotions are arguably an important missing element of the current generation of Human-Computer Interaction (HCI). The main goal of this project is to begin investigating how to design video games in which the game mechanics and interactions are based on the player's emotions. We designed a two-dimensional (2D) storytelling game prototype in Unity. Game designers and creators shape the user's experience and emotions throughout play with visual effects, sound effects, controls and narration. For this project, we chose to create emotion-driven interactions for two specific aspects: sound (audio effects, music) and narration (storytelling). Our prototype uses the Ovomind smart band and the biosignal analysis technology developed by the first author. While the smart band is worn, physiological signals from the human body are extracted and classified, using signal processing methods, into groups of emotions mapped onto the arousal-valence (AV) plane. This 2D AV emotion representation is used directly as an interactive input to the game interaction system. For music, we propose a system that automatically arranges background music by feeding the emotions analysed by the smart band into an AI model. We evaluated the results using video recordings of the experience and collected feedback from a total of 30 participants. The results show that participants are favorable to narrative and music adaptations based on real-time analysis of the player's emotions. Some issues were also highlighted, e.g. around the coherence of game progression. Participants also felt that the background music arrangements matched the player's emotions well. Further experiments are required and planned to assess whether the prospects expressed by participants match their personal experience when playing the emotion-driven game.
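As a concrete illustration of the pipeline summarised above, the sketch below shows how a single arousal-valence reading might be bucketed into an emotion group and used to select a background-music preset. It is a minimal, assumption-laden sketch and not the authors' implementation: the quadrant thresholds, group labels, preset names and function names are all hypothetical.

# Minimal sketch (assumed, not the authors' code): map one arousal-valence (AV)
# reading onto the AV plane, classify it into an emotion group, and use that
# group to drive a game-side adaptation such as the background-music mood.
from dataclasses import dataclass


@dataclass
class AVReading:
    """One emotion estimate in the arousal-valence plane, both values in [-1, 1]."""
    arousal: float
    valence: float


def emotion_group(reading: AVReading) -> str:
    """Classify a reading into one of four AV quadrants (a common simplification)."""
    if reading.arousal >= 0:
        return "excited" if reading.valence >= 0 else "tense"
    return "calm" if reading.valence >= 0 else "sad"


# Hypothetical mapping from emotion group to a music arrangement preset.
MUSIC_PRESET = {
    "excited": "bright_fast",
    "tense": "dissonant_driving",
    "calm": "soft_slow",
    "sad": "minor_sparse",
}


def update_game_state(reading: AVReading) -> dict:
    """Return the adaptation the game loop would apply for this reading."""
    group = emotion_group(reading)
    return {"emotion": group, "music": MUSIC_PRESET[group]}


if __name__ == "__main__":
    # Example: a high-arousal, negative-valence reading selects a tense arrangement.
    print(update_game_state(AVReading(arousal=0.7, valence=-0.4)))

In the actual prototype the AV estimate comes from the smart band's biosignal classifier and would be updated continuously during play; the quadrant split used here is only one simple way to discretise the AV plane.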


Notes

  1. www.ovomind.com.
  2. https://www.youtube.com/watch?v=V4iARp0OZIo.
  3. https://playdead.com/games/limbo/.
  4. https://www.youtube.com/watch?v=61DZC-60x20.
  5. https://unity.com/.
  6. https://www.freesound.org/.
  7. https://www.mixamo.com/.
  8. https://www.last.fm/.
  9. https://www.allmusic.com/.
  10. https://bit.ly/3sziY41.


Acknowledgments

This work was partly supported by Ovomind and the EPSRC and AHRC Centre for Doctoral Training in Media and Arts Technology (EP/L01632X/1). We would also like to thank the user evaluation participants.

Author information


Corresponding author

Correspondence to Yann Frachi.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Frachi, Y., Takahashi, T., Wang, F., Barthet, M. (2022). Design of Emotion-Driven Game Interaction Using Biosignals. In: Fang, X. (eds) HCI in Games. HCII 2022. Lecture Notes in Computer Science, vol 13334. Springer, Cham. https://doi.org/10.1007/978-3-031-05637-6_10


  • DOI: https://doi.org/10.1007/978-3-031-05637-6_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05636-9

  • Online ISBN: 978-3-031-05637-6

  • eBook Packages: Computer Science, Computer Science (R0)
