Detect and Interpret: Towards Operationalization of Automated User Experience Evaluation

  • Conference paper
  • In: Design, User Experience, and Usability (HCII 2023)

Abstract

The evaluation of user experience (UX) with software products is widely recognized as a critical aspect of supporting a product lifecycle. However, existing UX evaluation methods tend to require high levels of human involvement in data collection and analysis. This makes ongoing UX monitoring particularly challenging, especially given the increasing number of products, the growing user base, and the associated volume of data. There is therefore strong demand for UX evaluation systems that can automatically track UX and provide insights into required design improvements. The few existing frameworks for such automated systems can help identify user-centric metrics for UX evaluation, but they mostly offer best-practice recommendations for selecting metrics and tend to reflect only parts of the UX. Moreover, these frameworks predominantly rely on high-level UX concepts and do not necessarily allow measurements to reveal the underlying causes of UX challenges. In this paper, we demonstrate how these challenges can be addressed by combining the data gathering and analysis paths employed by traditional UX evaluation methods. Our paper contributes to the field by reviewing existing automated UX evaluation approaches and common UX evaluation data collection methods, and by offering a two-tier measurement approach for developing an automated UX evaluation system that augments the reflective power of traditional UX evaluation methods.
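
As an illustration of the two-tier idea sketched in the abstract (first detect a UX shift from coarse behavioral signals, then interpret it through finer diagnostic measures), the following minimal Python sketch shows one possible shape of such a pipeline. It is a reading of the approach under stated assumptions, not the authors' implementation; the signal names (rage_clicks, backtracks) and the completion threshold are hypothetical placeholders.

    # Minimal sketch of a two-tier "detect and interpret" loop.
    # All signals, names, and thresholds are hypothetical placeholders,
    # not the paper's implementation.
    from dataclasses import dataclass

    @dataclass
    class Session:
        task_completed: bool   # tier 1: outcome signal
        time_on_task_s: float  # tier 1: efficiency signal
        rage_clicks: int       # tier 2: diagnostic interaction signal
        backtracks: int        # tier 2: diagnostic navigation signal

    def detect(sessions, min_completion=0.85):
        """Tier 1: flag a UX problem when completion falls below target."""
        rate = sum(s.task_completed for s in sessions) / len(sessions)
        return rate < min_completion

    def interpret(sessions):
        """Tier 2: summarize diagnostic signals over failed sessions."""
        failed = [s for s in sessions if not s.task_completed] or sessions
        n = len(failed)
        return {
            "avg_rage_clicks": sum(s.rage_clicks for s in failed) / n,
            "avg_backtracks": sum(s.backtracks for s in failed) / n,
            "avg_time_on_task_s": sum(s.time_on_task_s for s in failed) / n,
        }

    sessions = [Session(True, 42.0, 0, 1),
                Session(False, 180.0, 6, 4),
                Session(False, 150.0, 4, 3)]
    if detect(sessions):
        print("UX issue detected; diagnostics:", interpret(sessions))

In this toy run, the tier-1 completion rate (1/3) falls below the 0.85 target, so tier-2 diagnostics are surfaced to hint at why users fail, not merely that they fail.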

Notes

  1. According to ISO 9241‑210:2010, 2.13 [1], usability is the “extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use”. In other words, usability refers to how well the product/system/UI helps users complete specific goals easily and satisfactorily. In this paper, we do not provide an overview of the distinction between usability and UX, as this is not our primary research interest. Instead, we adopt the perspective of other researchers (e.g., ISO 9241‑210:2010, 2.15 [1], [92]) that usability is both a part of UX and a measure of UX, because UX additionally includes (but is not limited to) users’ perceptions and feelings when they interact with products or systems. Thus, we review usability inspection methods as a way to collect expert data as a proxy for user data in measuring and evaluating UX.
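
    To make the triad in this definition concrete, the short sketch below shows one common way to operationalize effectiveness, efficiency, and satisfaction as measures. It is illustrative only: the formulas and the SUS normalization follow common conventions (cf. [17, 25]) and are not prescribed by this paper.

        # Hypothetical operationalization of the ISO usability triad;
        # formulas and SUS normalization are illustrative conventions.

        def effectiveness(goals_achieved, goals_attempted):
            """Share of specified goals achieved (task completion rate)."""
            return goals_achieved / goals_attempted

        def efficiency(goals_achieved, total_time_s):
            """Goals achieved per minute of interaction time."""
            return goals_achieved / (total_time_s / 60.0)

        def satisfaction(sus_score):
            """Normalize a System Usability Scale score (0-100) to 0-1."""
            return sus_score / 100.0

        print(effectiveness(8, 10), efficiency(8, 600.0), satisfaction(72.5))
        # -> 0.8 0.8 0.725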

References

  1. ISO 9241-11:2018(en), Ergonomics of human-system interaction — Part 11: Usability: Definitions and concepts. https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en

  2. Inan Nur, A., Santoso, H.B., Hadi Putra, P.O.: The method and metric of user experience evaluation: a systematic literature review. In: 2021 10th International Conference on Software and Computer Applications, Kuala Lumpur, Malaysia, pp. 307–317. ACM (2021)

  3. Hussain, J., et al.: A multimodal deep log-based user experience (UX) PLATFORM for UX evaluation. Sensors 18, 1622 (2018). https://doi.org/10.3390/s18051622

  4. McClure, D.: Startup Metrics for Pirates

  5. Rodden, K., Hutchinson, H., Fu, X.: Measuring the user experience on a large scale: user-centered metrics for web applications. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, Georgia, USA, pp. 2395–2398. ACM (2010)

  6. Kohavi, R., Deng, A., Frasca, B., Walker, T., Xu, Y., Pohlmann, N.: Online controlled experiments at large scale. In: Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Chicago, Illinois, USA, pp. 1168–1176. ACM (2013)

  7. Fabijan, A., Dmitriev, P., Olsson, H.H., Bosch, J.: The benefits of controlled experimentation at scale. In: 2017 43rd Euromicro Conference on Software Engineering and Advanced Applications (SEAA), Vienna, Austria, pp. 18–26. IEEE (2017)

  8. Gupta, S., Ulanova, L., Bhardwaj, S., Dmitriev, P., Raff, P., Fabijan, A.: The anatomy of a large-scale experimentation platform. In: 2018 IEEE International Conference on Software Architecture (ICSA), Seattle, WA, pp. 1–109. IEEE (2018)

  9. Gupta, S., et al.: Top challenges from the first practical online controlled experiments summit. SIGKDD Explor. Newsl. 21, 20–35 (2019). https://doi.org/10.1145/3331651.3331655

  10. Deng, A., Shi, X.: Data-driven metric development for online controlled experiments: seven lessons learned. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, California, USA, pp. 77–86. ACM (2016)

  11. Dmitriev, P., Wu, X.: Measuring metrics. In: Proceedings of the 25th ACM International on Conference on Information and Knowledge Management, Indianapolis, Indiana, USA, pp. 429–437. ACM (2016)

  12. Dmitriev, P., Gupta, S., Kim, D.W., Vaz, G.: A dirty dozen: twelve common metric interpretation pitfalls in online controlled experiments. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Halifax, NS, Canada, pp. 1427–1436. ACM (2017)

  13. Robinson, J., Lanius, C., Weber, R.: The past, present, and future of UX empirical research. Commun. Des. Q. Rev. 5, 10–23 (2018). https://doi.org/10.1145/3188173.3188175

  14. Shyr, C., Kushniruk, A., Wasserman, W.W.: Usability study of clinical exome analysis software: top lessons learned and recommendations. J. Biomed. Inform. 51, 129–136 (2014). https://doi.org/10.1016/j.jbi.2014.05.004

  15. Joachim, V., Spieth, P., Heidenreich, S.: Active innovation resistance: an empirical study on functional and psychological barriers to innovation adoption in different contexts. Ind. Mark. Manag. 71, 95–107 (2018). https://doi.org/10.1016/j.indmarman.2017.12.011

  16. Likert, R.: A technique for the measurement of attitudes. Arch. Psychol. 140, 5–55 (1932)

  17. Brooke, J.: SUS: a retrospective. J. Usability Stud. 8, 29–40 (2013)

  18. Lewis, J.R.: Critical review of “the usability metric for user experience.” Interact. Comput. 25, 320–324 (2013). https://doi.org/10.1093/iwc/iwt013

  19. Moshagen, M., Thielsch, M.T.: Facets of visual aesthetics. Int. J. Hum. Comput. Stud. 68, 689–709 (2010). https://doi.org/10.1016/j.ijhcs.2010.05.006

  20. Lavie, T., Tractinsky, N.: Assessing dimensions of perceived visual aesthetics of web sites. Int. J. Hum. Comput. Stud. 60, 269–298 (2004). https://doi.org/10.1016/j.ijhcs.2003.09.002

  21. Sauro, J., Dumas, J.S.: Comparison of three one-question, post-task usability questionnaires. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, pp. 1599–1608. ACM (2009)

  22. Paas, F.G.W.C., Van Merriënboer, J.J.G.: The efficiency of instructional conditions: an approach to combine mental effort and performance measures. Hum. Factors 35, 737–743 (1993). https://doi.org/10.1177/001872089303500412

  23. Hassenzahl, M., Platz, A., Burmester, M., Lehner, K.: Hedonic and ergonomic quality aspects determine a software’s appeal. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, The Hague, The Netherlands, pp. 201–208. ACM (2000)

  24. Laugwitz, B., Held, T., Schrepp, M.: Construction and evaluation of a user experience questionnaire. In: Holzinger, A. (ed.) USAB 2008. LNCS, vol. 5298, pp. 63–76. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-89350-9_6

  25. Sauro, J., Lewis, J.R.: Quantifying the User Experience: Practical Statistics for User Research. Elsevier, Morgan Kaufmann, Amsterdam (2016)

  26. Schankin, A., Budde, M., Riedel, T., Beigl, M.: Psychometric properties of the user experience questionnaire (UEQ). In: CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, pp. 1–11. ACM (2022)

  27. Abrahams, A.S., Fan, W., Wang, G.A., Zhang, Z.J., Jiao, J.: An integrated text analytic framework for product defect discovery. Prod. Oper. Manag. 24, 975–990 (2015). https://doi.org/10.1111/poms.12303

  28. Qi, J., Zhang, Z., Jeon, S., Zhou, Y.: Mining customer requirements from online reviews: a product improvement perspective. Inf. Manag. 53, 951–963 (2016). https://doi.org/10.1016/j.im.2016.06.002

  29. Ding, X., Liu, B., Zhang, L.: Entity discovery and assignment for opinion mining applications. In: Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Paris, France, pp. 1125–1134. ACM (2009)

  30. Park, E., Kang, J., Choi, D., Han, J.: Understanding customers’ hotel revisiting behaviour: a sentiment analysis of online feedback reviews. Curr. Issue Tour. 23, 605–611 (2020). https://doi.org/10.1080/13683500.2018.1549025

  31. Cheng, M., Jin, X.: What do Airbnb users care about? An analysis of online review comments. Int. J. Hosp. Manag. 76, 58–70 (2019). https://doi.org/10.1016/j.ijhm.2018.04.004

  32. Guo, Y., Barnes, S.J., Jia, Q.: Mining meaning from online ratings and reviews: tourist satisfaction analysis using latent Dirichlet allocation. Tour. Manag. 59, 467–483 (2017). https://doi.org/10.1016/j.tourman.2016.09.009

  33. Vu, H.Q., Li, G., Law, R., Zhang, Y.: Exploring tourist dining preferences based on restaurant reviews. J. Travel Res. 58, 149–167 (2019). https://doi.org/10.1177/0047287517744672

  34. Xu, X., Wang, X., Li, Y., Haghighi, M.: Business intelligence in online customer textual reviews: understanding consumer perceptions and influential factors. Int. J. Inf. Manag. 37, 673–683 (2017). https://doi.org/10.1016/j.ijinfomgt.2017.06.004

  35. Yang, B., Liu, Y., Liang, Y., Tang, M.: Exploiting user experience from online customer reviews for product design. Int. J. Inf. Manag. 46, 173–186 (2019). https://doi.org/10.1016/j.ijinfomgt.2018.12.006

  36. Hussain, J., Azhar, Z., Ahmad, H.F., Afzal, M., Raza, M., Lee, S.: User experience quantification model from online user reviews. Appl. Sci. 12, 6700 (2022). https://doi.org/10.3390/app12136700

  37. Podsakoff, P.M., MacKenzie, S.B., Lee, J.-Y., Podsakoff, N.P.: Common method biases in behavioral research: a critical review of the literature and recommended remedies. J. Appl. Psychol. 88, 879–903 (2003). https://doi.org/10.1037/0021-9010.88.5.879

  38. Holtom, B., Baruch, Y., Aguinis, H., Ballinger, G.A.: Survey response rates: trends and a validity assessment framework. Hum. Relat. 75, 1560–1584 (2022). https://doi.org/10.1177/00187267211070769

  39. Sivo, S., Saunders, C., Chang, Q., Jiang, J.: How low should you go? Low response rates and the validity of inference in IS questionnaire research. JAIS 7, 351–414 (2006). https://doi.org/10.17705/1jais.00093

  40. Singh, A.S., Masuku, M.B.: Sampling techniques and determination of sample size in applied statistics research: an overview. Int. J. Econ. Commer. Manag. 2, 1–22 (2014)

  41. Analytics Tools & Solutions for Your Business - Google Analytics. https://marketingplatform.google.com/about/analytics/

  42. Hotjar: Website Heatmaps & Behavior Analytics Tools. https://www.hotjar.com/

  43. Katerina, T., Nicolaos, P.: Mouse behavioral patterns and keystroke dynamics in end-user development: what can they tell us about users’ behavioral attributes? Comput. Hum. Behav. 83, 288–305 (2018). https://doi.org/10.1016/j.chb.2018.02.012

  44. Meidenbauer, K.L., Niu, T., Choe, K.W., Stier, A.J., Berman, M.G.: Mouse movements reflect personality traits and task attentiveness in online experiments. J. Personal. (2022). https://doi.org/10.1111/jopy.12736

  45. Griffiths, L., Chen, Z.: Investigating the differences in web browsing behaviour of Chinese and European users using mouse tracking. In: Aykin, N. (ed.) UI-HCII 2007. LNCS, vol. 4559, pp. 502–512. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-73287-7_59

  46. Mueller, F., Lockerd, A.: Cheese: tracking mouse movement activity on websites, a tool for user modeling. In: CHI 2001 Extended Abstracts on Human Factors in Computing Systems, Seattle, Washington, pp. 279–280. ACM (2001)

  47. Arapakis, I., Leiva, L.A.: Predicting user engagement with direct displays using mouse cursor information. In: Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval, Pisa, Italy, pp. 599–608. ACM (2016)

  48. Yamauchi, T., Xiao, K.: Reading emotion from mouse cursor motions: affective computing approach. Cognit. Sci. 42, 771–819 (2018). https://doi.org/10.1111/cogs.12557

  49. SadighZadeh, S., Kaedi, M.: Modeling user preferences in online stores based on user mouse behavior on page elements. J. Syst. Inf. Technol. 24, 112–130 (2022). https://doi.org/10.1108/JSIT-12-2019-0264

  50. Smith, J.R., Terry, D.J., Manstead, A.S.R., Louis, W.R., Kotterman, D., Wolfs, J.: The attitude-behavior relationship in consumer conduct: the role of norms, past behavior, and self-identity. J. Soc. Psychol. 148, 311–334 (2008). https://doi.org/10.3200/SOCP.148.3.311-334

  51. Sauro, J.: Linking UX Attitudes to Future Website Purchases – MeasuringU. https://measuringu.com/ux-purchases/

  52. Bechler, C.J., Tormala, Z.L., Rucker, D.D.: The attitude-behavior relationship revisited. Psychol. Sci. 32, 1285–1297 (2021). https://doi.org/10.1177/0956797621995206

  53. Kohavi, R., Deng, A., Longbotham, R., Xu, Y.: Seven rules of thumb for web site experimenters. In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, USA, pp. 1857–1866. ACM (2014)

  54. Fu, B., Noy, N.F., Storey, M.-A.: Eye tracking the user experience – an evaluation of ontology visualization techniques. SW 8, 23–41 (2016). https://doi.org/10.3233/SW-140163

  55. Zaman, B., Shrimpton-Smith, T.: The FaceReader: measuring instant fun of use. In: Proceedings of the 4th Nordic Conference on Human-Computer Interaction: Changing Roles, Oslo, Norway, pp. 457–460. ACM (2006)

  56. Lane, R., McRae, K., Reiman, E., Chen, K., Ahern, G., Thayer, J.: Neural correlates of heart rate variability during emotion. Neuroimage 44, 213–222 (2009). https://doi.org/10.1016/j.neuroimage.2008.07.056

  57. Zheng, W.-L., Zhu, J.-Y., Lu, B.-L.: Identifying stable patterns over time for emotion recognition from EEG. IEEE Trans. Affect. Comput. 10, 417–429 (2019). https://doi.org/10.1109/TAFFC.2017.2712143

  58. Dawson, M.E., Schell, A.M., Filion, D.L., Berntson, G.G.: The electrodermal system. In: Cacioppo, J.T., Tassinary, L.G., Berntson, G. (eds.) Handbook of Psychophysiology, pp. 157–181. Cambridge University Press, Cambridge (2007)

  59. Calvo, R.A., D’Mello, S.: Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans. Affect. Comput. 1, 18–37 (2010). https://doi.org/10.1109/T-AFFC.2010.1

  60. Nacke, L.E.: Games user research and physiological game evaluation. In: Bernhaupt, R. (ed.) Game User Experience Evaluation. HIS, pp. 63–86. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-15985-0_4

  61. Bernhaupt, R.: User experience evaluation methods in the games development life cycle. In: Bernhaupt, R. (ed.) Game User Experience Evaluation. HIS, pp. 1–8. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-15985-0_1

  62. Courtemanche, F., Léger, P.-M., Dufresne, A., Fredette, M., Labonté-LeMoyne, É., Sénécal, S.: Physiological heatmaps: a tool for visualizing users’ emotional reactions. Multimedia Tools Appl. 77(9), 11547–11574 (2017). https://doi.org/10.1007/s11042-017-5091-1

  63. Wehbe, R.R., Kappen, D.L., Rojas, D., Klauser, M., Kapralos, B., Nacke, L.E.: EEG-based assessment of video and in-game learning. In: CHI 2013 Extended Abstracts on Human Factors in Computing Systems, Paris, France, pp. 667–672. ACM (2013)

  64. Wehbe, R.R., Nacke, L.E.: Towards understanding the importance of co-located gameplay. In: Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play, London, United Kingdom, pp. 733–738. ACM (2015)

  65. Nacke, L.E., Stellmach, S., Sasse, D., Niesenhaus, J., Dachselt, R.: LAIF: a logging and interaction framework for gaze-based interfaces in virtual entertainment environments. Entertain. Comput. 2, 265–273 (2011). https://doi.org/10.1016/j.entcom.2010.09.004

  66. Halbig, A., Latoschik, M.E.: A systematic review of physiological measurements, factors, methods, and applications in virtual reality. Front. Virtual Real. 2, 694567 (2021). https://doi.org/10.3389/frvir.2021.694567

  67. Witmer, B.G., Singer, M.J.: Measuring presence in virtual environments: a presence questionnaire. Presence 7, 225–240 (1998). https://doi.org/10.1162/105474698565686

  68. Deniaud, C., Honnet, V., Jeanne, B., Mestre, D.: The concept of “presence” as a measure of ecological validity in driving simulators. J. Interact. Sci. 3(1), 1–13 (2015). https://doi.org/10.1186/s40166-015-0005-z

  69. Lemmens, J.S., Simon, M., Sumter, S.R.: Fear and loathing in VR: the emotional and physiological effects of immersive games. Virtual Real. 26, 223–234 (2021). https://doi.org/10.1007/s10055-021-00555-w

  70. Dey, A., Phoon, J., Saha, S., Dobbins, C., Billinghurst, M.: A neurophysiological approach for measuring presence in immersive virtual environments. In: 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Porto de Galinhas, Brazil, pp. 474–485. IEEE (2020)

  71. Athif, M., et al.: Using biosignals for objective measurement of presence in virtual reality environments. In: 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, pp. 3035–3039. IEEE (2020)

  72. Arake, M., et al.: Measuring task-related brain activity with event-related potentials in dynamic task scenario with immersive virtual reality environment. Front. Behav. Neurosci. 16, 779926 (2022). https://doi.org/10.3389/fnbeh.2022.779926

  73. Michaelis, J.R., et al.: Describing the user experience of wearable fitness technology through online product reviews. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 60, 1073–1077 (2016). https://doi.org/10.1177/1541931213601248

  74. Cano, S., Araujo, N., Guzman, C., Rusu, C., Albiol-Perez, S.: Low-cost assessment of user experience through EEG signals. IEEE Access 8, 158475–158487 (2020). https://doi.org/10.1109/ACCESS.2020.3017685

  75. Nielsen, J.: Usability inspection methods. Presented at the Conference Companion on Human Factors in Computing Systems, April 1994

  76. Hollingsed, T., Novick, D.G.: Usability inspection methods after 15 years of research and practice. In: Proceedings of the 25th Annual ACM International Conference on Design of Communication, El Paso, Texas, USA, pp. 249–255. ACM (2007)

  77. Nielsen, J., Molich, R.: Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems Empowering People - CHI 1990, Seattle, Washington, United States, pp. 249–256. ACM Press (1990)

  78. Lewis, C., Polson, P.G., Wharton, C., Rieman, J.: Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems Empowering People - CHI 1990, Seattle, Washington, United States, pp. 235–242. ACM Press (1990)

  79. Bias, R.: The pluralistic usability walkthrough: coordinated empathies. In: Nielsen, J., Mack, R.L. (eds.) Usability Inspection Methods, pp. 63–76. Wiley, New York (1994)

  80. Sobiesiak, R., O’Keefe, T.: Complexity analysis: a quantitative approach to usability engineering. In: CASCON 2011: Proceedings of the 2011 Conference of the Center for Advanced Studies on Collaborative Research, pp. 242–256 (2011)

  81. Polson, P.G., Lewis, C., Rieman, J., Wharton, C.: Cognitive walkthroughs: a method for theory-based evaluation of user interfaces. Int. J. Man Mach. Stud. 36, 741–773 (1992). https://doi.org/10.1016/0020-7373(92)90039-N

  82. Lewis, C., Wharton, C.: Cognitive walkthroughs. In: Handbook of Human-Computer Interaction, pp. 717–732. Elsevier (1997)

  83. Rohrer, C.P., Wendt, J., Sauro, J., Boyle, F., Cole, S.: Practical usability rating by experts (PURE): a pragmatic approach for scoring product usability. In: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, California, USA, pp. 786–795. ACM (2016)

  84. Joyce, G., Lilley, M.: Towards the development of usability heuristics for native smartphone mobile applications. In: Marcus, A. (ed.) DUXU 2014. LNCS, vol. 8517, pp. 465–474. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-07668-3_45

  85. Quiñones, D., Rusu, C., Rusu, V.: A methodology to develop usability/user experience heuristics. Comput. Stand. Interfaces 59, 109–129 (2018). https://doi.org/10.1016/j.csi.2018.03.002

  86. Hermawati, S., Lawson, G.: Establishing usability heuristics for heuristics evaluation in a specific domain: is there a consensus? Appl. Ergon. 56, 34–51 (2016). https://doi.org/10.1016/j.apergo.2015.11.016

  87. Nielsen, J.: Finding usability problems through heuristic evaluation. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI 1992, Monterey, California, United States, pp. 373–380. ACM Press (1992)

  88. de Lima Salgado, A., de Mattos Fortes, R.P.: Heuristic evaluation for novice evaluators. In: Marcus, A. (ed.) DUXU 2016. LNCS, vol. 9746, pp. 387–398. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-40409-7_37

  89. Botella, F., Alarcon, E., Peñalver, A.: How to classify to experts in usability evaluation. In: Proceedings of the XV International Conference on Human Computer Interaction - Interacción 2014, Puerto de la Cruz, Tenerife, Spain, pp. 1–4. ACM Press (2014)

  90. Solano, A., Collazos, C.A., Rusu, C., Fardoun, H.M.: Combinations of methods for collaborative evaluation of the usability of interactive software systems. Adv. Hum. Comput. Interact. 2016, 1–16 (2016). https://doi.org/10.1155/2016/4089520

  91. Nasir, M., Ikram, N., Jalil, Z.: Usability inspection: novice crowd inspectors versus expert. J. Syst. Softw. 183, 111122 (2022). https://doi.org/10.1016/j.jss.2021.111122

  92. Hassan, H.M., Galal-Edeen, G.H.: From usability to user experience. In: 2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS), Okinawa, pp. 216–222. IEEE (2017)


Acknowledgment

We would like to thank Dr. Hrag Pailian for his comments and discussion at the earlier stages of the development of the two-tier UX measurement approach.

Author information

Corresponding author

Correspondence to Angeline Sin Mei Tsui.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Tsui, A.S.M., Kuzminykh, A. (2023). Detect and Interpret: Towards Operationalization of Automated User Experience Evaluation. In: Marcus, A., Rosenzweig, E., Soares, M.M. (eds) Design, User Experience, and Usability. HCII 2023. Lecture Notes in Computer Science, vol 14032. Springer, Cham. https://doi.org/10.1007/978-3-031-35702-2_6

  • DOI: https://doi.org/10.1007/978-3-031-35702-2_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35701-5

  • Online ISBN: 978-3-031-35702-2
