DOI: 10.1145/3526114.3558723

Gustav: Cross-device Cross-computer Synchronization of Sensory Signals

Published: 28 October 2022

Abstract

Temporal synchronization of behavioral and physiological signals collected through different devices (and sometimes through different computers) is a longstanding challenge in HCI, neuroscience, psychology, and related areas. Previous research has proposed synchronizing sensory signals using (1) dedicated hardware, (2) dedicated software, or (3) alignment algorithms. All of these approaches are either vendor-locked, non-generalizable, or difficult to adopt in practice. We propose a simple but highly efficient alternative: instrument the stimulus presentation software by injecting supervisory event-related timestamps, followed by a post-processing step over the recorded log files. Building on this idea, we introduce Gustav, our approach to orchestrating the recording of sensory signals across devices and computers. Gustav ensures that all signals coincide exactly with the duration of each experimental condition, with millisecond precision. Gustav is publicly available as open-source software.
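The timestamp-injection idea can be sketched as follows. This is a minimal illustration, not Gustav's actual API: the function names, the marker format, and the assumption that the recording computers' clocks are already synchronized (e.g., via NTP) are all hypothetical. The stimulus software appends event markers with millisecond timestamps to its log; a post-processing step then trims each recorded signal to the window between two markers.

```python
import time

def log_marker(log, name):
    """Append a supervisory event marker with a millisecond Unix timestamp."""
    log.append(f"MARKER {name} {int(time.time() * 1000)}")

def parse_markers(log):
    """Map each marker name to its millisecond timestamp."""
    markers = {}
    for line in log:
        if line.startswith("MARKER "):
            _, name, ts = line.split()
            markers[name] = int(ts)
    return markers

def trim_to_condition(samples, markers, start, end):
    """Keep only (timestamp_ms, value) samples recorded between two markers."""
    t0, t1 = markers[start], markers[end]
    return [(t, v) for t, v in samples if t0 <= t <= t1]

# During the experiment, the stimulus software brackets each condition:
log = []
log_marker(log, "cond_A_start")
# ... present stimuli for condition A ...
log_marker(log, "cond_A_end")
```

After the session, `trim_to_condition` is applied to each device's signal (assumed here to be timestamped samples on the same clock), so every signal covers exactly the duration of each condition.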


Cited By

View all
  • (2024) Efficient Decoding of Affective States from Video-elicited EEG Signals: An Empirical Investigation. ACM Transactions on Multimedia Computing, Communications, and Applications 20(10), 1–24. DOI: 10.1145/3663669. Online publication date: 12 September 2024.

Published In

UIST '22 Adjunct: Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology
October 2022
413 pages
ISBN:9781450393218
DOI:10.1145/3526114
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Behavioral/Physiological Sensing
  2. Orchestration
  3. Synchronization

Qualifiers

  • Poster
  • Research
  • Refereed limited

Conference

UIST '22

Acceptance Rates

Overall acceptance rate: 355 of 1,733 submissions (20%)


