A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data

Published: 19 October 2019

Abstract

In this demonstration, we present a prototype of an avatar-mediated social interaction interface that replicates head and eye movements in distributed virtual environments. In addition to retargeting these natural behaviors, the system can augment the interaction through the visual presentation of affective states. We derive these states from neuronal data captured with electroencephalographic (EEG) sensors, combined with a machine-learning-based classification of emotional states.
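
The abstract describes a pipeline from EEG sensing to a classified affective state that drives the avatar's presentation. The paper does not disclose its features or classifier, so the following Python sketch is illustrative only: it assumes windowed multi-channel EEG, alpha/beta band-power features, and an SVM classifier, all of which are assumptions rather than the authors' implementation.

```python
# Illustrative sketch only: EEG window -> band-power features -> classifier
# -> affective state label for the avatar. Sampling rate, frequency bands,
# and the SVM are assumed; the paper does not specify its actual pipeline.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

FS = 256  # assumed EEG sampling rate in Hz

def band_power(window, band, fs=FS):
    """Mean power of each channel inside a frequency band."""
    freqs, psd = welch(window, fs=fs, nperseg=fs)  # window: (channels, samples)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

def features(window):
    """Alpha (8-13 Hz) and beta (13-30 Hz) band power per channel."""
    return np.concatenate([band_power(window, (8.0, 13.0)),
                           band_power(window, (13.0, 30.0))])

def train(windows, labels):
    """Fit a classifier on labelled calibration windows
    (labels e.g. 'positive' / 'neutral' / 'negative')."""
    X = np.stack([features(w) for w in windows])
    return SVC(kernel="rbf").fit(X, labels)

def classify(model, window):
    """Map one live EEG window to an affective state label."""
    return model.predict(features(window)[None, :])[0]
```

In a running system of this kind, classify would be applied to each incoming EEG window, and the resulting label would be transmitted alongside the retargeted head and eye movements so the remote client can render the corresponding affective visualization on the avatar.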


Cited By

  • (2021) Study of Full-Body Virtual Embodiment Using Noninvasive Brain Stimulation and Imaging. International Journal of Human–Computer Interaction (1-14). https://doi.org/10.1080/10447318.2020.1870827. Online publication date: 1-Feb-2021.


    Published In

    SUI '19: Symposium on Spatial User Interaction
    October 2019
    164 pages
    ISBN: 978-1-4503-6975-6
    DOI: 10.1145/3357251
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 19 October 2019

    Author Tags

    1. Communication interfaces
    2. affective computing
    3. avatars
    4. brain-computer interfaces
    5. embodiment

    Qualifiers

    • Abstract
    • Research
    • Refereed limited

    Conference

    SUI '19: Symposium on Spatial User Interaction
    October 19-20, 2019
    New Orleans, LA, USA

    Acceptance Rates

    Overall Acceptance Rate 86 of 279 submissions, 31%
