
Measuring Experiences


Part of the book series: Human–Computer Interaction Series (HCIS)

Abstract

The science of HCI in the third wave aims to understand user experiences as filtered through the values and contexts of the individuals using systems and, moreover, through the values and contexts of the individual researchers studying them. This does not diminish the importance of measurement to science, nor the challenges of measuring user experience (UX). This chapter discusses how HCI can draw on the methods of modern psychometrics to provide tools for measuring user experiences. In particular, we introduce bifactor analysis as a way to examine both the conceptual coherence of a questionnaire for measuring UX and the distinct influences of different facets of the core concept. Further, by looking at modern methods of analysis, in particular the treatment of outliers, we also consider how modern statistics should not be treated as black boxes but require researchers to think more deeply about the people behind the data. Drawing on our work on player experiences, we make the case that psychometrics, used well as a tool for UX, has an important role to play in HCI as a successor science.
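As an informal illustration of the bifactor idea mentioned in the abstract, the sketch below assumes a purely hypothetical nine-item UX questionnaire with three facets and made-up standardised loadings. It shows how a bifactor decomposition separates a general factor from facet-specific factors, and how that split can be summarised with omega-hierarchical and omega-total reliability. This is a minimal Python sketch under those assumptions, not the analysis reported in the chapter.

import numpy as np

# Hypothetical standardised bifactor loadings for a nine-item questionnaire:
# every item loads on one general factor and on exactly one of three facet
# (group) factors. All numbers are invented for illustration only.
general = np.array([0.70, 0.65, 0.60, 0.72, 0.68, 0.55, 0.63, 0.58, 0.66])
specific = np.array([0.35, 0.40, 0.30, 0.25, 0.30, 0.45, 0.38, 0.33, 0.28])
facet = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])  # facet membership of each item

# Item uniqueness in the standardised model: variance not explained by
# either the general factor or the item's facet factor.
uniqueness = 1.0 - general**2 - specific**2

# Variance of the unweighted sum score, assuming uncorrelated factors:
# (sum of general loadings)^2 + sum over facets of (sum of facet loadings)^2
# + sum of uniquenesses.
facet_var = sum(specific[facet == k].sum() ** 2 for k in np.unique(facet))
total_var = general.sum() ** 2 + facet_var + uniqueness.sum()

# Omega-hierarchical: share of sum-score variance due to the general factor
# alone. Omega-total: share due to all common factors (general + facets).
omega_h = general.sum() ** 2 / total_var
omega_t = (general.sum() ** 2 + facet_var) / total_var

print(f"omega_h = {omega_h:.2f}, omega_t = {omega_t:.2f}")

With these made-up loadings, omega-hierarchical comes out around 0.82 and omega-total around 0.90, i.e. most of the reliable variance in the total score reflects the general factor, while the facets still add distinct common variance over and above it.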



Author information

Corresponding author

Correspondence to Paul Cairns.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Cairns, P., Power, C. (2018). Measuring Experiences. In: Filimowicz, M., Tzankova, V. (eds) New Directions in Third Wave Human-Computer Interaction: Volume 2 - Methodologies. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-73374-6_5


  • DOI: https://doi.org/10.1007/978-3-319-73374-6_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-73373-9

  • Online ISBN: 978-3-319-73374-6

  • eBook Packages: Computer Science (R0)
