DOI: 10.1145/3110292.3110302

FACETEQ: A novel platform for measuring emotion in VR

Published: 22 March 2017

Abstract

Faceteq prototype v.05 is a wearable technology for measuring facial expressions and biometric responses in experimental Virtual Reality studies. Developed by the Emteq Ltd laboratory, Faceteq opens new avenues for virtual reality research by combining high-performance patented dry-sensor technology, proprietary algorithms, and real-time data acquisition and streaming. The Faceteq project was founded with the aim of providing an additional human-centred tool for emotion expression, affective human-computer interaction and social virtual environments. The proposed poster will exhibit the hardware and its functionality.
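As background for how signals of this kind are typically interpreted, the sketch below computes a simple valence index from two facial-EMG channels: zygomaticus major (smiling) versus corrugator supercilii (frowning), a contrast widely used in facial-EMG research. Everything in it, including the channel layout, sampling rate, window length and the simulated data standing in for a live stream, is an illustrative assumption; it is not the Faceteq hardware interface or Emteq's proprietary algorithms.

```python
# Hypothetical sketch (not the Faceteq API): a minimal valence index from
# two facial-EMG channels, based on the common zygomaticus-vs-corrugator
# contrast. Sampling rate, window length and the simulated signals are
# assumptions for illustration only.
import numpy as np

SAMPLE_RATE_HZ = 1000   # assumed EMG sampling rate
WINDOW_S = 0.5          # assumed analysis window length in seconds


def rms(window: np.ndarray) -> float:
    """Root-mean-square amplitude of an EMG window."""
    return float(np.sqrt(np.mean(np.square(window))))


def valence_index(zygomaticus: np.ndarray, corrugator: np.ndarray) -> float:
    """Positive when smiling-muscle activity dominates, negative when frowning dominates."""
    return rms(zygomaticus) - rms(corrugator)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = int(SAMPLE_RATE_HZ * WINDOW_S)
    # Simulated half-second windows stand in for a real-time stream.
    zyg = rng.normal(0.0, 0.8, n)   # stronger activity over zygomaticus major
    cor = rng.normal(0.0, 0.2, n)   # weaker activity over corrugator supercilii
    print(f"valence index: {valence_index(zyg, cor):+.3f}")
```

A real pipeline would band-pass filter and rectify the raw EMG and calibrate per wearer before any such comparison; the sketch only shows the final contrast step.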





    Published In

    VRIC '17: Proceedings of the Virtual Reality International Conference - Laval Virtual 2017
    March 2017
    96 pages
    ISBN:9781450348584
    DOI:10.1145/3110292
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    In-Cooperation

• Laval Virtual

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. Affective Computing
    2. EMG
    3. Emotion
    4. Facial Expression
    5. Virtual Reality

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    VRIC '17


    Cited By

    • (2023) "Virtual Reality for Emotion Elicitation – A Review". IEEE Transactions on Affective Computing, 14(4), 2626-2645. DOI: 10.1109/TAFFC.2022.3181053. Online publication date: 1 Oct 2023.
    • (2023) "Analyzing the Design of Online VR Platforms for Accessing Cultural Heritage Resources and Services: Multiple Case Studies in European and American Countries". Distributed, Ambient and Pervasive Interactions, 287-299. DOI: 10.1007/978-3-031-34609-5_21. Online publication date: 9 Jul 2023.
    • (2022) "emteqPRO—Fully Integrated Biometric Sensing Array for Non-Invasive Biomedical Research in Virtual Reality". Frontiers in Virtual Reality, 3. DOI: 10.3389/frvir.2022.781218. Online publication date: 11 Mar 2022.
    • (2021) "ExGSense". Proceedings of the 20th International Conference on Information Processing in Sensor Networks (co-located with CPS-IoT Week 2021), 222-237. DOI: 10.1145/3412382.3458268. Online publication date: 18 May 2021.
    • (2021) "Embodied online dance learning objectives of CAROUSEL+". 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 309-313. DOI: 10.1109/VRW52623.2021.00062. Online publication date: Mar 2021.
    • (2020) "Affective analysis of patients in homecare video-assisted telemedicine using computational intelligence". Neural Computing and Applications. DOI: 10.1007/s00521-020-05203-z. Online publication date: 22 Jul 2020.
