
A QoE evaluation of augmented reality for the informational phase of procedure assistance

Research Article · Published in Quality and User Experience

Abstract

Augmented reality (AR) is an emerging technology with significant potential as a solution for novel procedure assistance in mass customisation. Procedure assistance is a series of steps or instructions that helps a person complete a task. The informational phase of a procedure is the period during which a user tries to understand the instructions, before implementing them. With AR as a potential means of communicating these steps, it is important to understand the factors that influence user acceptability and experience. In this context, this paper reports the results of a Quality of Experience (QoE) evaluation of two approaches to informational phase assistance: AR procedure assistance and paper-based procedure assistance (control group). Each approach presented a procedure to solve a Rubik’s Cube® in the minimum number of steps. As part of the evaluation methodology, different metrics were captured, including the users’ physiological ratings, facial expression features and self-reported measures of affect, task load and QoE. The results show that AR-based assistance yielded significantly shorter procedure completion times and higher success rates than paper-based instructions. Several correlations were discovered between physiological and self-reported measures; for example, the frustration and mental task load components correlated with both electrodermal activity and interbeat interval ratings. The findings from this work will stimulate experimentation and theoretical discussion on the use of physiological ratings and facial expressions as indicators of task load and QoE.




Acknowledgements

The authors acknowledge the financial support of Science Foundation Ireland (SFI) under Grant Number SFI/16/RC/3918 and the Athlone Institute of Technology President’s Seed Fund.

Author information


Corresponding author

Correspondence to Eoghan Hynes.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Hynes, E., Flynn, R., Lee, B. et al. A QoE evaluation of augmented reality for the informational phase of procedure assistance. Qual User Exp 8, 1 (2023). https://doi.org/10.1007/s41233-023-00054-7

