
Comparing four technologies for measuring postural micromovements during monitor engagement

Published: 28 August 2012
DOI: 10.1145/2448136.2448178

Abstract

Objective metrics of engagement are valuable for estimating user experience or progression through interactional narratives. Postural micromovements of seated individuals during computer engagement have previously been measured with magnetic field sensors and chair-mounted force-matrix detection mats. Here we compare readings from a head-mounted accelerometer, single-camera sagittal motion tracking, and force-distribution changes from floor-mounted force plates against an eight-camera Vicon motion capture system. Measurements were recorded from five participants while they watched or interacted with a computer monitor. Our results show that sagittal- and coronal-plane measurements from the Vicon, the accelerometer, and the single camera produced nearly identical data: they were precisely synchronized in time and, in many cases, proportional in amplitude. None of the systems tested matched the Vicon's measurement of yaw.
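
To illustrate the kind of agreement checks the abstract describes, below is a minimal, hypothetical sketch in Python with NumPy. The signal names, sampling rate, and synthetic data are assumptions for illustration, not the authors' pipeline: it estimates the time offset between two displacement traces (e.g. a Vicon marker trajectory versus an accelerometer-derived trace) by cross-correlation, then checks amplitude proportionality with a least-squares gain and a correlation coefficient.

# Hypothetical comparison of two postural-displacement traces.
# All names and values below are invented for illustration.
import numpy as np

np.random.seed(0)
fs = 100.0                      # assumed common sampling rate in Hz
t = np.arange(0, 10, 1 / fs)

# Synthetic "reference" (Vicon-like) and "test" (accelerometer-like) traces:
# the same underlying sway, but the test trace is scaled and slightly delayed.
reference = np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)
test = 0.8 * np.roll(reference, 12) + 0.05 * np.random.randn(t.size)

# 1) Temporal synchrony: lag at the peak of the cross-correlation.
ref_c = reference - reference.mean()
tst_c = test - test.mean()
xcorr = np.correlate(tst_c, ref_c, mode="full")
lag_samples = xcorr.argmax() - (ref_c.size - 1)
print(f"estimated lag: {lag_samples / fs * 1000:.1f} ms")

# 2) Amplitude proportionality: least-squares gain between aligned traces.
aligned_test = np.roll(test, -lag_samples)
gain = np.dot(aligned_test, reference) / np.dot(reference, reference)
print(f"estimated amplitude ratio (test / reference): {gain:.2f}")

# 3) Waveform similarity: Pearson correlation after alignment.
r = np.corrcoef(reference, aligned_test)[0, 1]
print(f"Pearson r after alignment: {r:.3f}")

With real recordings the two traces would first be resampled to a common rate; a near-zero lag indicates synchronization, and a stable gain with high correlation indicates the proportional amplitudes reported above.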




Published In

ECCE '12: Proceedings of the 30th European Conference on Cognitive Ergonomics
August 2012
224 pages
ISBN:9781450317863
DOI:10.1145/2448136
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Sponsors

  • EACE: European Association for Cognitive Ergonomics
  • Edinburgh Napier University, UK

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. engagement measurement
  2. human-computer interaction (HCI)
  3. monitor engagement
  4. motion capture
  5. postural micromovements
  6. usability
  7. user experience (UX)

Qualifiers

  • Research-article

Conference

ECCE '12
Sponsor:
  • EACE
  • Edinburgh Napier University, UK
ECCE '12: European Conference on Cognitive Ergonomics
August 28 - 31, 2012
Edinburgh, United Kingdom

Acceptance Rates

Overall acceptance rate: 56 of 91 submissions (62%)


Cited By

  • (2019) Toward emotional recognition during HCI using marker-based automated video tracking. Proceedings of the 31st European Conference on Cognitive Ergonomics, 49-52. https://doi.org/10.1145/3335082.3335103
  • (2016) Non-Instrumental Movement Inhibition (NIMI) Differentially Suppresses Head and Thigh Movements during Screenic Engagement: Dependence on Interaction. Frontiers in Psychology, 7. https://doi.org/10.3389/fpsyg.2016.00157
  • (2013) Engagement: the inputs and the outputs. Proceedings of the 2013 Inputs-Outputs Conference: An Interdisciplinary Conference on Engagement in HCI and Performance, 1-4. https://doi.org/10.1145/2557595.2557596
  • (2013) Mean head and shoulder heights when seated. Proceedings of the 31st European Conference on Cognitive Ergonomics, 1-6. https://doi.org/10.1145/2501907.2501957
