DOI: 10.1145/3411109.3411147

An auditory interface for realtime brainwave similarity in dyads

Published: 16 September 2020

Abstract

We present a case study in the development of a "hyperscanning" auditory interface that transforms realtime brainwave similarity between interacting dyads into music. Our instrument extends reality in face-to-face communication with a musical stream reflecting an otherwise invisible socio-neurophysiological signal. It contributes to the historical context of brain-computer interfaces (BCIs) applied to art and music, but is unique in that it is contingent on the correlation between the dyad's brainwaves, and in that it conveys this information using entirely auditory feedback. We designed the instrument to be i) easy to understand, ii) relatable, and iii) pleasant for members of the general public in an exhibition context. We describe how this context and user group shaped our choice of EEG hardware, inter-brain similarity metric, and auditory mapping strategy. We discuss our experience across four public exhibitions, as well as future improvements to the instrument design and user experience.

Supplementary Material

ZIP File (p261-winters.zip)
Supplemental material.


Cited By

  • (2022) PAMPAS: A PsychoAcoustical Method for the Perceptual Analysis of multidimensional Sonification. Frontiers in Neuroscience, vol. 16. DOI: 10.3389/fnins.2022.930944. Online publication date: 6 Oct 2022


Published In

AM '20: Proceedings of the 15th International Audio Mostly Conference
September 2020
281 pages
ISBN:9781450375634
DOI:10.1145/3411109

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. audio
  2. augmented reality
  3. brain-computer interfaces
  4. neuroscience
  5. social
  6. sonification
  7. sound art
  8. sound interaction design

Qualifiers

  • Short-paper

Conference

AM '20: Audio Mostly 2020
September 15 - 17, 2020
Graz, Austria

Acceptance Rates

AM '20 Paper Acceptance Rate 29 of 47 submissions, 62%;
Overall Acceptance Rate 177 of 275 submissions, 64%

Article Metrics

  • Downloads (last 12 months): 21
  • Downloads (last 6 weeks): 1

Reflects downloads up to 20 Feb 2025
