ABSTRACT
This paper introduces a biosensing prototype that transforms emotions into music, helping people recognize and understand their own feelings and actions as well as those of others. We present a series of three experiments with 20 participants in four emotional states: happiness, sadness, anger, and a neutral state. Participants' real-time emotions were captured through a wearable probe, Audiolize Emotion, which detects users' EEG signals and composes the data into audio files that are played back to the users themselves and to others. Finally, we conducted observations and interviews with participants to explore factors linked with social interaction, users' perceptions of the music, and their reflections on using audio for self-expression and communication. We found that the Audiolize Emotion prototype triggers communication and self-expression in two ways: by building curiosity and by supporting communication through an extended form of expression. Based on these results, we outline future directions for exploring the field of emotion and communication further, and we plan to apply this knowledge to additional areas such as VR games and accessibility.
"It sounds like she is sad": Introducing a Biosensing Prototype that Transforms Emotions into Real-time Music and Facilitates Social Interaction