Abstract
Critical thinking is essential in health disciplines yet is reportedly underdeveloped in student health professionals. Immersive mobile extended reality (mXR) may facilitate critical thinking in health education but has not yet been fully explored. The main aim of this study was to evaluate the impact of co-designing a virtual environment on the facilitation of critical thinking in health education students. Second-year graduate-entry Doctor of Physiotherapy students (n = 25) co-designed health-related case scenarios over six weeks in a web-based 360-degree immersive environment. This included embedding exercise prescription videos that incorporated prompts for critical thinking about a target population. The evaluation included pre- and post-evaluation surveys, the Health Science Reasoning Test (HSRT-N) and the System Usability Scale (SUS). The results demonstrated a positive effect on critical thinking skills, particularly in analysis, interpretation, inference, deduction, numeracy and overall score (p < .05). Participants reported favourable perceptions of mXR usability and the learning experience, although challenges such as cybersickness and technical complexity were noted. Peer feedback suggested that the virtual environment promoted engagement and authenticity in learning. Recommendations for future iterations include enhancing population representation, addressing challenges in system usability, and refining instructional design elements. Overall, the study demonstrates the potential of mobile immersive extended reality to enhance critical thinking and foster authentic learning experiences in health education. Further design principles and implications for research design are also proposed.
1 Background
1.1 Critical thinking
The development of critical thinking is essential for health professional students to analyse a clinical scenario and make well-informed, safe and effective judgements in complex environments (Carbogim et al. 2018; Chan 2013). The term “critical thinking” has been used synonymously with decision-making, problem-solving and, in the case of health professional education (HPE), clinical reasoning and clinical judgement (Carter et al. 2022; Dissen 2023). Although critical thinking is essential in health disciplines, there has been increasing concern about the development of critical thinking skills in health professional students (Audétat et al. 2013; Koivisto et al. 2018).
Barriers to developing critical thinking include the institutional de-prioritisation of infrastructure to support educators’ modelling of critical thinking; the use of a fixed, didactic, teacher-directed curriculum delivery rather than a heutagogical, socially constructed and student-determined approach to learning; limited development of critical thinking dispositions such as truth-seeking, open-mindedness, inquisitive thinking and reflective judgement; and access to real-world learning experiences with discipline-specific context (Blaschke and Hase 2015; Dwyer 2023; Facione 1990).
While developing critical thinking remains one of the major unsolved problems in pedagogy (Kuhn and Dean 2004; Larsson 2017), two principles are consistently represented in the literature. Firstly, critical thinking is a transformational process, facilitated by active, experiential learning and reflective thinking (Dewey 1910) that creates opportunities for the learner to engage in self-directed, self-disciplined, self-monitored and self-corrective thinking (Paul and Elder 2006). Secondly, critical thinking can be developed independently or with others. Vygotsky (1978) identified that critical thinking is constructed through social interactions with ‘more capable peers’. A Zone of Proximal Development is created by collaboratively working on a task that the learner could not perform independently, over time shifting to performing the same task without assistance.
1.2 Theoretical framework
Social constructivism, situated learning, and heutagogy are three theoretical frameworks that have been identified to facilitate the development of critical thinking in HPE. Social constructivism highlights the interdependence of the social context and the learner’s knowledge of the problem to be solved (Thomas et al. 2014; Vygotsky 1978). The learner actively participates in meaningful, authentic experiences with others who may complement or challenge worldviews and contextual frameworks (Thomas et al. 2014). One approach to enable social constructivism in learning is ‘co-design’, which facilitates collaborative engagement, creativity and designing a meaningful solution to make sense of the problem (Treasure-Jones and Joynes 2018). Intentional co-design in HPE develops knowledge and skills including critical thinking, confidence, building health professional-client relationships and enjoyment in the learning activity (Abbonizio et al. 2024; O’Connor et al. 2021).
Situated learning complements social constructivism, advocating for continuous, active (legitimate) learning, within a social process (peripheral participation) while familiarising a way of doing and knowing in an authentic learning environment (Lave and Wenger 1991; Nicolini et al. 2016). This can be achieved through Communities of Practice which represent the social learning spaces where people develop a communal repertoire of knowledge and practices (Lave and Wenger 1991; Wenger 1999, 2004) or, through communities of inquiry in more formal educational settings.
Heutagogy has been referred to as both a form (Blaschke 2012) and a study (Hase and Kenyon 2000) of self-determined learning. It extends from pedagogy and andragogy as learners progress in maturity and autonomy (Canning 2010). Firstly, a heutagogical approach embraces learner agency, the ability of learners to choose their own pathway to learning, which may be non-linear and out of sync with that of the facilitator or curriculum (Blaschke 2021; Blaschke and Hase 2019; Hase 2009; Hase and Kenyon 2000). Secondly, heutagogy promotes self-reflection, where the learner reflects on the problem-solving process, its actions and outcomes, and how these influence their beliefs and actions (Blaschke 2012). Thirdly, heutagogy considers the learner’s confidence in their competency (Hase and Kenyon 2000, 2007) and their ability to take action (Blaschke and Hase 2016; Cochrane et al. 2018).
1.3 Immersive extended reality and 360-degree virtual environments
Alongside theoretical frameworks that support the development of critical thinking, it is important to consider recent advances in technology-enhanced learning and the pedagogical implications of facilitating critical thinking. Immersive extended reality (XR), including virtual reality and virtual environments, is increasingly being used to facilitate skills including critical thinking (Jans et al. 2023). Affordances of mobile immersive extended reality (mXR) align with key concepts outlined in social constructivist, situated learning and heutagogical frameworks. These include improved accessibility (compared to tethered, high-fidelity methods), authentic learning, collaborative practice, confidence and self-efficacy in clinical skills, feedback on student performance, information literacy, self-reflection, motivation, engagement, repetitive practice for skill improvement, safe application of skills, and scalability (Stretton et al. 2024) (Fig. 1).
mXR can be experienced in 360-degree virtual environments. Omnidirectional panoramic images or videos allow the learner to pan and tilt in an uninterrupted circle by shifting the position of either a phone, a low-cost phone-enabled VR headset (e.g., Google© Cardboard or Merge) or a head-mounted display (e.g., Oculus© Rift or Apple© Vision Pro). Because the learner has the agency to look around and explore, these environments are more immersive than traditional 2D media, though less so than a high-fidelity VR learning experience (Rupp et al. 2019). 360-degree virtual environments are an emerging tool in HPE as they enable learners to affordably conceptualise, produce and edit clinical environments that represent authentic clinical experiences (Baysan et al. 2023; Evens et al. 2023). 360-degree scenarios are typically between two and 15 minutes long (Baysan et al. 2023; Evens et al. 2023) and have beneficial affordances in learning performance, problem-solving, self-confidence, motivation, satisfaction, attention, situational awareness, and reflective and skills-based knowledge (Baysan et al. 2023; Blair et al. 2021; Snelson and Hsu 2020). Interaction with the virtual environment can be enhanced with the inclusion of hotspots, behaviour-triggered images, audio, or advancement between scenes based on actions or responses to questions (Evens et al. 2023; Snelson and Hsu 2020). Limitations, however, include viewers’ interactive movements being limited to the head and neck, an inability to view objects in 3D, difficulty incorporating multiple users in the same environment, and some reports of cybersickness related to low resolution and refresh rates (Baysan et al. 2023).
While it has been demonstrated that mXR can positively facilitate critical thinking in health profession education programmes, a recent systematic review highlighted that investigations were limited to five health disciplines, focused on emergency or critical response, and only a small number directly measured critical thinking (Stretton et al. 2024).
This paper reports upon the first iteration of a larger educational design research (EDR) project investigating how mobile immersive extended reality (mXR) facilitates critical thinking in health professional education (HPE).
2 Methods (design and construction phase)
2.1 Population
The source population included 123 second-year graduate entry Doctorate of Physiotherapy (DPT) students enrolled in a strength and conditioning subject in a large Australian metropolitan university. Human ethics was approved by the university ethics committee in August 2022 (Reference Number 2022-23676-30979-3).
2.2 Procedure
Educational design research (EDR) is an iterative approach that explores and analyses current literature, theoretical frameworks and stakeholder involvement to inform the design and construction of an intervention that is evaluated and reflected on for the maturation of subsequent iterations (McKenney and Reeves 2019) (Fig. 2). An initial literature review (Analysis Phase) (Stretton et al. 2024), and findings from focus groups (Exploration Phase) (Stretton and Cochrane 2023) informed the Design and Construction Phases along with co-designing the learning activity with the subject coordinators to align with the subject learning outcomes (February to August 2023). Subsequent correspondence and meetings informed the development of an overview recruitment trailer, the initial session with the participants, a video template, and evaluation session planning. Learning management system (Canvas) announcements directed potential student participants to the recruitment trailer, plain language statement, and consent form.
In the Evaluation Phase, all students enrolled in the strength and conditioning subject attended an Initial Session that outlined instructions for the group learning task to be developed over six weeks (September–October 2023). In week one, groups of four to five students self-selected one of 25 target populations (case scenarios) and co-designed and recorded a two-sentence case scenario audio biography for a browser-based virtual environment (https://www.seekbeak.com). Students were provided with a templated PowerPoint video demonstrating the most appropriate exercise (Table 1), which the researcher uploaded to the virtual environment on their behalf. Students could include additional “hotspots” that provided further scenario-based information in the virtual environment as they (a) met the client in the reception, (b) determined the best exercise in the gym, and (c) highlighted key elements in a co-designed video (see Fig. 3). Aligned with the course learning outcomes, students were encouraged to include the aims and benefits of exercise, prerequisites, precautions and contraindications, exercise setup, trick movements and exercise principles (frequency, intensity, time, type, volume, and progression).
While the content and technical assistance were available to all students by the subject coordinators and primary researcher respectively, only those who had accessed the plain language statement and signed the consent form were invited to participate in the research component of the Initial Session (additional one hour). This included the completion of a pre-evaluation survey and a pre-test critical thinking survey (Health Sciences Reasoning Test- HSRT-N).
In week six, a one-hour Evaluation Session was scheduled for all students to exchange target group scenarios and provide peer feedback. Students could review these in a browser on their laptop or using personal mobile phones with low-fidelity Merge headsets (https://mergeedu.com/headset). Research participants were then asked to complete a post-evaluation survey, the System Usability Scale (SUS), and a post-test HSRT-N (additional one hour).
2.3 Measures
2.3.1 Pre-evaluation survey
At the Initial Session (week one), participants were asked to complete an online form (Qualtrics©) with demographic data for baseline analysis including age, gender, ethnicity, and previous use of mobile phones, 360-degree virtual environments and augmented and virtual reality.
2.3.2 Health science reasoning test (HSRT-N)
Participants completed the Health Science Reasoning Test (HSRT-N) at both the Initial Session (week one) and the Evaluation Session (week six). The HSRT uses the same critical thinking subscales as the more generic California Critical Thinking Skills Test (CCTST) (Facione and Facione 2023) originally described in the Delphi study (Facione 1990).
The HSRT-N is a 33-item multiple-choice assessment that takes approximately 50 min to complete and is specifically designed to evaluate the critical thinking skills of health professional students and professionals (Huhn et al. 2011; Saghafi et al. 2024). The vignettes and associated questions are developed by Insight Assessment© with a health science context, but no prior knowledge is required to correctly answer the questions (Flowers et al. 2020). Participants were encouraged to complete the test on a laptop, though the test was formatted so that it could be completed on a mobile phone as an alternative.
The HSRT has reported internal consistency (Kuder Richardson-20) ranging from 0.77 to 0.84 (Cazzell and Anderson 2016; Forneris 2015) with an overall internal consistency of 0.81 (Facione and Facione 2023). Construct validity has been confirmed in a physiotherapy population by Huhn et al. (2011) who were able to discriminate between experts and novices (p =.008).
An overall HSRT-N score, as well as scale scores in the areas of analysis, interpretation, inference, evaluation, explanation, induction, deduction, and numeracy are provided on completion (Huhn et al. 2011). The overall and sub-scale scores are rated out of 100 and categorised as either not manifested (50–62), weak (62–71), moderate (72–80), strong (81–88) or superior (89–100) (Facione and Facione 2023). Currently, there is no published data on what constitutes an important change score on the HSRT-N (Huhn et al. 2013).
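The banding above can be sketched as a simple lookup. This is an illustrative sketch only: the published ranges overlap at 62, so the function below makes the assumption that a boundary value falls in the higher band.

```python
def hsrt_band(score):
    """Map an HSRT-N overall or sub-scale score (out of 100) to its
    reported strength category (Facione and Facione 2023). Assumes
    boundary values belong to the higher band where ranges overlap."""
    if score >= 89:
        return "superior"
    if score >= 81:
        return "strong"
    if score >= 72:
        return "moderate"
    if score >= 62:
        return "weak"
    return "not manifested"
```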
2.3.3 Post-evaluation survey
At the Evaluation Session (week six), all students (including those not in the study) were asked to provide peer feedback on each other’s target population virtual environments. Research participants also completed a post-evaluation survey, the System Usability Scale (SUS) and a post-test HSRT-N. The online post-evaluation survey (Qualtrics©) focused on the use of the headset, the co-design of the virtual environment, and the overall learning experience. Twelve questions were answered on a five-point Likert scale.
2.3.4 System Usability Scale (SUS)
The System Usability Scale (SUS) is a 10-item questionnaire rated on a five-point Likert scale and takes very little time to administer (Brooke 1986). The SUS has been used to measure the usability of commercial products, clinical interventions and more recently, HPE (Escalada-Hernandez et al. 2024; Yoo et al. 2024). The SUS has reported reliability and validity, including concurrent validity (Bangor et al. 2008). While the Technology Acceptance Model (TAM) has also been utilised to guide perceived usability (Yoo et al. 2024; Zlamal et al. 2022), the SUS was selected as the project investigates variables beyond just usability (i.e. effectiveness, efficiency and satisfaction), and could be used with a small sample size (Cheah et al. 2023; Lewis and Sauro 2009).
2.3.5 Peer feedback
Feedback has a positive impact on student learning and achievement, especially cognitive and motor skills compared to motivational and behavioural outcomes (Hattie and Timperley 2007; Wisniewski et al. 2020). All students were asked to provide peer feedback on the virtual environments of other groups. This was facilitated by an online survey (Qualtrics) with questions answered on a five-point Likert scale. Questions related to usability, engagement, and critical thinking, as well as identification of “best element” and element “requiring improvement”.
2.4 Data analysis
The HSRT-N overall score and sub-scores were collated for each participant and compiled as means and standard deviations. The HSRT-N overall percentile was also compared with an aggregate sample of the ‘HSRT-N Graduate Physical Therapy’ comparison group from Insight Assessment©. For example, if a test taker had a 60th percentile score, roughly 59 people out of 100 in the comparison group would score lower than this test taker and 40 people out of 100 would score higher (Facione and Facione 2023).
Paired and independent-samples t-tests were used to evaluate HSRT-N mean test scores between pre-test and post-test, overall and by sub-score. An alpha level of 0.05 was used for all statistical tests. The effect size of change in the HSRT-N was calculated for all pairwise comparisons of change scores using Cohen’s d = (M1 − M2)/SDpooled. Effect sizes were defined as small (0.00–0.29), moderate (0.30–0.79), and large (> 0.8) (Cohen 1988). All data were analysed using IBM SPSS Statistics for Mac© version 29.0.1.0 (IBM Corp, Armonk, NY) and Microsoft Excel© software. All other quantitative data were collated as mean values and standard deviations.
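The paired comparison and effect-size calculation described above can be sketched as follows. This is a minimal standard-library illustration of the stated formulas, using hypothetical score lists rather than the study’s data (the analysis itself was run in SPSS):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic: mean of the pre-post differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

def cohens_d(pre, post):
    """Cohen's d = (M2 - M1) / SD_pooled, pooling the pre- and
    post-test sample standard deviations."""
    sd_pooled = math.sqrt((stdev(pre) ** 2 + stdev(post) ** 2) / 2)
    return (mean(post) - mean(pre)) / sd_pooled

# Hypothetical pre/post HSRT-N overall scores for four participants
pre = [80, 82, 84, 86]
post = [83, 86, 88, 89]
```

Under the thresholds cited above, a resulting d of 0.30–0.79 would be read as a moderate effect; the t statistic would still be compared against the t distribution (df = n − 1) to obtain the p value.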
The odd-numbered questions on the SUS are scored more favourably if rated “strongly agree”, and the even-numbered questions more favourably if rated “strongly disagree”. Raw SUS scores can be converted into percentile ranks with indicative grades and levels of acceptability (Sauro 2018). The average SUS score (at the 50th percentile) is 68; scores over 68 are considered above average and anything under 68 below average (Sauro 2018).
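The alternating-item scoring just described follows Brooke’s standard procedure: each odd item contributes (response − 1) points, each even item contributes (5 − response), and the sum is scaled by 2.5 to give a 0–100 score. A minimal sketch:

```python
def sus_score(responses):
    """Score a single SUS questionnaire (ten 1-5 Likert responses).
    Odd-numbered items are positively worded (agreement favourable);
    even-numbered items are negatively worded (disagreement favourable).
    Each item contributes 0-4 points; the total is scaled to 0-100."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5
```

For example, all-neutral responses (3 on every item) yield 50, while the most favourable pattern yields 100; an individual or mean score is then compared against the 68-point average noted above.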
Post-evaluation qualitative data focusing on the use of headsets, co-design and the primary researcher’s key reflections were exported to Excel©. The dataset was read systematically from start to end with semantic (explicitly expressed) and latent (implicit or conceptual) code labels conceptualised and assigned before refining, defining and thematic mapping (Braun and Clarke 2022).
3 Results
3.1 Evaluation of population
From the source population, 74 students completed a consent form, with 46 completing the pre-evaluation survey (37% of the source population) and 25 completing the post-evaluation surveys (20% of the source population; 45% attrition). Participant ages ranged from 22 to 26 years (mean 25, SD 1.24), and most participants identified as female (n = 18; 72%), with six (24%) identifying as male and one (4%) preferring not to answer. The majority of participants were Australian (n = 16; 64%), with the remainder identifying as Asian (n = 9; 36%).
In the pre-evaluation survey, the majority of participants indicated that they used their mobile phone for educational purposes daily (n = 17; 68%) or two-to-three times a week (n = 4; 16%). The majority of participants had rarely used 360-degree virtual environments (n = 14; 56%) or never (n = 4; 16%). Similarly, most participants had rarely used virtual or augmented reality (n = 16; 64%) or never (n = 8; 32%) before the introduction of this learning task.
3.2 Critical thinking (HSRT-N)
Analysis of the HSRT-N test results suggests that the co-design of clinical virtual environments had a positive effect on critical thinking scores, with a mean pre-test score of 83.48 (SD = 6.42) compared with a mean score of 87.00 (SD = 4.79) at post-test.
Comparison of HSRT-N total pre-test and post-test mean sub-scores showed no statistically significant change in the sub-scores for evaluation, explanation and induction. However, there was a statistically significant increase in the mean sub-score for analysis, interpretation, inference, deduction and numeracy (p <.05) (Table 2).
Even at the pre-test level (55th percentile), analysis of mean HSRT-N scores showed that most test takers scored well against their HSRT-N Graduate Physical Therapy comparison group. The mean HSRT-N improved over the six weeks to a post-test level at the 75th percentile. This would equate to roughly 74 Graduate Physical Therapy students out of 100 scoring lower than the cohort of participants in this study (Facione and Facione 2023).
3.3 System Usability Scale (SUS)
The overall SUS mean was 65.20 indicating that the usability was close to the average (N = 25, M = 65.20, SD = 15.59) (Table 3). Of the items in the SUS where a higher mean indicated more positive responses (i.e., the odd-numbered questions), participants indicated that they quickly learnt how to use the browser-based virtual platform (https://www.seekbeak.com) (n = 25, M = 3.92, SD = 1.00), and found it easy to use (n = 25, M = 3.80, SD = 0.87). They were confident in the use of the virtual environment (n = 25, M = 3.64, SD = 0.76), found the items well integrated (n = 25, M = 3.52, SD = 0.82), and would like to use it more frequently (n = 25, M = 3.48, SD = 0.87).
Of the items in the SUS where a lower mean indicated more positive responses (i.e., the even-numbered questions), participants indicated that: there was a lot to learn to use the virtual platform (n = 25, M = 2.04, SD = 0.89), they would need support to use it (n = 25, M = 2.28, SD = 1.10), there were some inconsistencies in the system (n = 25, M = 2.52, SD = 0.77), it was unnecessarily complex (n = 25, M = 2.68, SD = 1.11), and it was cumbersome (n = 25, M = 2.76, SD = 1.13).
3.4 Use of headsets, co-design and overall experience
Participants were asked questions related to the use of the Merge headset in the post-evaluation survey. Participants agreed that the use of the headsets added to the learning experience (n = 12, M = 4.17, SD = 0.83) and aided critical thinking (n = 11, M = 4.00, SD = 0.63). However, the use of the headset provoked cybersickness in some participants (n = 11, M = 3.36, SD = 1.36).
Participants indicated that co-designing the virtual environment with their peers did not have an impact on their development of critical thinking (n = 25, M = 3.12, SD = 1.05), though it trended toward a positive impact on their reasoning (n = 25, M = 3.52, SD = 0.92) and knowledge (n = 25, M = 3.76, SD = 1.01).
Participants agreed that they had a sufficient overview of the purpose of the learning experience (n = 25, M = 3.92, SD = 0.91), equipment and resources (n = 25, M = 4.28, SD = 0.84), support (n = 25, M = 4.36, SD = 0.64), and feedback (n = 25, M = 4.00, SD = 0.82), and that the task trended towards a better learning experience than conventional approaches (n = 25, M = 3.56, SD = 0.77). Participants indicated they would recommend the use of mXR for future learning (n = 25, M = 3.96, SD = 0.89) (Table 4).
Participants reflected on their own learning experiences when using mXR. The most pleasing elements of the learning experience were being able to navigate new spaces virtually, having creative licence in the development of their scenario, the use of mixed reality, and co-designing with peers. The most challenging elements included a lack of clarity on how to upload objects, time management, navigating the environment, the use of the headsets (e.g., large phones not fitting into the headset), cybersickness, and a lack of clarity about the purpose of the learning task (Table 5).
3.5 Evaluation of peer feedback
During the post-evaluation session, students were asked to share the virtual environments with other groups and provide feedback on the presentation of the target population using an online form. Participants either “Strongly agreed” or “Agreed” that their peer virtual environments were easy to navigate, made good use of the virtual environment, had a high level of engagement, and provided more authentic learning compared to other modes of learning experiences. The challenge to critical thinking was rated slightly less than the other items, though was positively trending (Table 6).
Participants were asked to identify the best element and an element for improvement. Students were mostly complimentary of their peers’ videos, especially those that included potential trick movements when completing exercises, additional tips, FITT-VP principles, and the benefits of exercise. Students also responded positively when audio accompanied the biography or exercise instructions in the video.
Areas for improvement included audio for engagement, less text in the video slides, more elements to challenge critical thinking, progressions, outcome measures, and editing considerations (e.g., including frontal and sagittal videos or location of objects in the virtual environment).
4 Discussion
4.1 Critical thinking (HSRT-N)
Developing critical thinking attributes in health professional students is essential to enhance their ability to analyse a scenario to inform effective clinical practice (Carbogim et al. 2018; Chan 2013). This study has demonstrated that the co-design of virtual environments facilitates the development of critical thinking in physiotherapy HPE, specifically improving the ability to analyse, interpret, and make inferences and deductions in unfamiliar environments.
However, there was no statistically significant improvement in evaluation, explanation or induction according to the Health Science Reasoning Test (HSRT-N). The reduced ability to assess the credibility of claims or the strength of arguments (evaluation) may be influenced by the current climate of misinformation, generative artificial intelligence and deepfakes. Participants in this study were aged between 22 and 26 years, placing them within Generation Z, which comprises 18% of Australians (Australian Bureau of Statistics 2022) and the majority of students in higher education today (Basinger et al. 2021). While Gen Z feel more confident in identifying false or misleading information than other generations (Poynter Institute for Media Studies 2022), they frequently base conclusions on surface-level features of [mis]information and are not adequately taught how to judge credibility (Breakstone et al. 2021).
Critical thinking itself is not inherently problematic: it positively enables a health professional’s problem-solving, decision-making, creativity, communication and self-reflection. However, potential negative influences need to be considered, such as limited epistemological engagement, intuitive judgement, and emotional or biased thinking (Dwyer 2023), all of which leave learners vulnerable to misinformation. This vulnerability has heightened alongside increasing political, social and health-related concerns, and may present to the current Gen Z cohort as unchecked “facts” in social media and generative artificial intelligence outputs.
Health professional students should be encouraged to develop their critical thinking by actively engaging in forming well-constructed questions, maturing truth-seeking strategies, formulating comprehensive analyses of information, and evaluating findings and outcomes as they build on prior knowledge.
It was anticipated that participants would have developed reasoning skills in their entry bachelor programmes before commencing the Doctor of Physiotherapy. However, graduate-entry students, despite starting as experienced learners, face the challenge of applying critical thinking within a shortened degree timeframe (Macdiarmid et al. 2024). This may be further compounded by the need to disassociate from the didactic foundational learning experiences of their undergraduate programme and be open to alternative methods of teaching to elevate critical thinking as a health professional. The development of critical thinking is progressive over the course of the degree, even for graduate-entry programmes (Furze et al. 2015). Graduate-entry students’ enjoyment of active learning and alternative approaches to learning can be utilised to enhance their development and learning experiences (Berg et al. 2021). Future iterations may integrate engagement with nuanced statements that are neither entirely true nor false to better prepare for complex real-world HPE experiences (Schvaneveldt et al. 2022).
This study did not show a statistically significant difference in explanation, the ability to provide evidence, reasoning, assumptions or rationale for judgements (Facione and Facione 2023). Future iterations could integrate the practical structure of Toulmin’s Argument Model and evidence-based reasoning to scaffold well-supported explanations (Ju and Choi 2017); Socratic questioning built into the template or a reflective logbook could further facilitate critical thinking and explanation skills (Hu 2023).
Both deductive and inductive reasoning are important to developing critical thinking in HPE (Karlsen et al. 2021). The improvement in deductive reasoning in this study may reflect the participants’ familiarisation with a learning process that is explained by demonstrating its application to clinical situations (Lin et al. 2023). However, improvement in inductive reasoning was not observed. Inductive reasoning draws on prior experience, knowledge and empirical observations to analyse patterns, and then draws conclusions to make a reasoned judgement of what may happen in an unfamiliar situation (Facione and Facione 2023; Lin et al. 2023). Although participants may have developed an ability to form conclusions from observations in previous non-health degrees, the transference of this skill to forming reasoned judgements from specific health observations may be limited in this second-year DPT cohort. Future iterations may further integrate deductive reasoning by providing structured logical problems where students apply general rules or theoretical principles to specific real-world scenarios to reach conclusions.
Participants received the HSRT-N results immediately from the online portal (https://insightassessment.com), appealing to Gen Z participants who expect instant feedback and access to the content (Abril 2024). Feedback should be personalised, explicit, and explain why and how the development of the virtual environment could be carried out differently, rather than focus on what went wrong (Abril 2024; Basinger et al. 2021; Cragun et al. 2024). While this first iteration included an option for participants to contact the researcher as needed, future iterations could benefit from scheduled “check-ins” that would focus on expectations for the week, while also providing an opportunity to ask questions (Abril 2024).
Beyond the HSRT-N, participants indicated that co-designing the virtual environment did not develop their ‘critical thinking’; however, it did have some impact on their ‘clinical reasoning’. Neither term was defined in the participant information, though this cohort may be more familiar with the latter (clinical reasoning) and may see more directly its impact on knowledge and application when responding to the survey.
4.2 Co-design
Participants indicated that co-design did not have an impact on their learning experience or the development of critical thinking, though this could reflect prior exposure to co-design during their bachelor’s degrees. By starting with the co-design of the scenarios, learners begin with curiosity as they discern the knowledge required and reflect on the learning process and its application to [clinical] practice (Blaschke 2012; Blaschke and Hase 2016; Canning and Callan 2010; Hase 2009; Hase and Kenyon 2007). Co-design could be enhanced in future iterations with self-selected group tasks that stimulate both heutagogical and social constructivist practice with “more knowledgeable others” (Oliver 2000; Thomas et al. 2014).
Mapping “clinical clues” to decision points, similar to a “choose-your-own-adventure” (CYOA) approach, would enhance learner agency (heutagogy) while co-designing the virtual environment. While the use of CYOA to facilitate critical thinking is yet to be fully explored in HPE, preliminary studies indicate improvements in engagement and satisfaction in learning, confidence, and clinical decision-making in preparation for unexpected situations (Jogerst et al. 2022; Litten and Stewart 2023; Thomas et al. 2022). The consequences of each CYOA pathway choice would then be presented, either providing positive feedback on the correct option or reflective questions to assist learning before returning to re-evaluate the virtual scene, developing a learner’s capability and ability to act on the information presented (Blaschke and Hase 2016; Cochrane et al. 2018).
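To illustrate, the CYOA pathway logic described above can be sketched as a simple branching data structure. This is a minimal, hypothetical sketch in Python; the scene identifiers, clinical clue and feedback text are illustrative only and are not drawn from the study’s scenarios:

```python
# Hypothetical sketch of a CYOA decision node: each choice either yields
# positive feedback and progression (correct option), or a reflective
# prompt that returns the learner to re-evaluate the current scene.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class Choice:
    label: str
    correct: bool
    feedback: str                      # positive feedback or a reflective prompt
    next_scene: Optional[str] = None   # scene to progress to if correct


@dataclass
class SceneNode:
    scene_id: str
    clinical_clue: str
    choices: List[Choice] = field(default_factory=list)

    def select(self, index: int) -> Tuple[str, Optional[str]]:
        """Return (feedback, next scene id); incorrect choices return here."""
        choice = self.choices[index]
        return choice.feedback, (choice.next_scene if choice.correct else self.scene_id)


# Illustrative scene: the learner must act on a clinical clue.
scene = SceneNode(
    scene_id="ward_1",
    clinical_clue="Patient reports dizziness on standing",
    choices=[
        Choice("Check lying/standing blood pressure", True,
               "Correct - postural hypotension should be excluded first.", "ward_2"),
        Choice("Proceed with exercise prescription", False,
               "Reflect: what risk does unexplained dizziness pose during exercise?"),
    ],
)

feedback, next_scene = scene.select(1)   # incorrect choice
# The learner receives the reflective prompt and returns to "ward_1".
```

Structured this way, incorrect pathways loop the learner back to the scene with a reflective prompt rather than an answer, mirroring the feedback-before-re-evaluation cycle described above.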
Incorporating a review of both (a) the problem and its resulting actions and outcomes, and (b) how the problem-solving process influences learners’ own beliefs and actions promotes self-reflection (heutagogy); this could be included in future iterations through a reflective journal with prompting questions based on the expectations for the week.
4.3 System Usability Scale (SUS) and post-evaluation survey
The combined results of the System Usability Scale (SUS) and the post-evaluation survey produced some conflicting findings. This may result from the design of the SUS (alternating positively and negatively worded statements) and its focus on the use of the virtual headset, which was novel for most (96%) participants. However, the mixed responses to usability are comparable to those of Saab et al. (2023), who reported that while virtual reality clinical scenarios promoted clinical decision-making and critical thinking, participants initially found mXR confusing before becoming familiar with it. A brief tutorial video and bullet-point instructions may positively impact the usability of the virtual headset.
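For reference, standard SUS scoring (Brooke 1986) works as follows: each of the ten items is rated 1–5; the positively worded (odd) items contribute (rating − 1), the negatively worded (even) items contribute (5 − rating), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of this calculation (the ratings shown are illustrative, not participant data from this study):

```python
def sus_score(ratings):
    """Compute a 0-100 SUS score from ten 1-5 Likert ratings (Brooke 1986)."""
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS requires ten ratings between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)   # items 1,3,5,7,9 vs 2,4,6,8,10
        for i, r in enumerate(ratings)
    )
    return total * 2.5


example = [4, 2, 4, 2, 4, 2, 4, 2, 4, 2]   # a fairly positive respondent
print(sus_score(example))  # 75.0
```

The alternating item polarity visible in the scoring rule is the design feature noted above as a possible source of mixed responses; as a rough benchmark, scores above approximately 68 are considered above average (Sauro 2018).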
4.4 Limitations
While the study advocates for mXR to facilitate critical thinking in HPE, some limitations exist. Firstly, the sample (n = 25) represents only 20% of the source population in the Doctor of Physiotherapy programme, limiting generalisability. Although it was anticipated that the population would not be representative of students new to tertiary study, the age range (22–26 years) was narrow for a post-graduate degree; the findings may, however, be generalisable to both undergraduate and post-graduate student cohorts. As this is the first iteration of Educational Design Research, realised design principles cannot be reported at this point of the project, although suggestions for future iterations have been presented above. Next steps may integrate design principles into other health education programmes and variations of mXR. This iteration was limited to six weeks of a semester; future development could consider the value of consolidating or expanding this timeframe for engagement, critical thinking skills and dispositional development. mXR-facilitated critical thinking may also be intentionally scaffolded across a health programme to supplement student clinical experiences (e.g. orientation, traumatic and/or complex scenarios), the value of which accrediting professional bodies could consider toward competency and registration requirements (e.g. clinical hours).
5 Conclusion
This paper presents findings from the first iteration of a larger educational design research project. The study demonstrated that critical thinking improved using a heutagogical, social constructivist approach while co-designing a virtual environment for health professional education. Some elements of critical thinking may be influenced by inherent perceptions of a Generation Z cohort and pre-exposure to the development of these elements in previous degrees. The usability and learning experience of immersive mobile extended reality for health professional education are encouraging, with suggestions for future iterations presented.
Data availability
No datasets were generated or analysed during the current study.
References
Abbonizio J, Palermo C, Brand G et al (2024) Co-designing formal health professions curriculum in partnership with students: a scoping review. Med Teach. https://doi.org/10.1080/0142159X.2024.2339403
Abril D (2024) Gen Z workers can take criticism. You’re just phrasing it wrong. The Washington Post. https://www.washingtonpost.com/technology/2024/04/10/gen-z-feedback-work/. Accessed 10 September 2024
Audétat M-C, Laurin S, Sanche G et al (2013) Clinical reasoning difficulties: a taxonomy for clinical teachers. Med Teach 35:e984–e989. https://doi.org/10.3109/0142159X.2012.733041
Australian Bureau of Statistics (2022) 2021 Census shows Millennials overtaking Boomers. https://www.abs.gov.au/media-centre/media-releases/2021-census-shows-millennials-overtaking-boomers. Accessed 27 June 2024
Bangor A, Kortum PT, Miller JT (2008) An empirical evaluation of the system usability scale. Int J Hum Comput Interact 24:574–594. https://doi.org/10.1080/10447310802205776
Basinger KL, Alvarado D, Ortega AV et al (2021) Creating ACTIVE learning in an online environment. In: 2021 ASEE annual conference, virtual conference. https://peer.asee.org/36870
Baysan A, Çonoğlu G, Özkütük N et al (2023) Come and see through my eyes: a systematic review of 360-degree video technology in nursing education. Nurse Educ Today 128:105886. https://doi.org/10.1016/j.nedt.2023.105886
Berg C, Philipp R, Taff SD (2021) Scoping review of critical thinking literature in healthcare education. Occup Ther Health Care 1–18. https://doi.org/10.1080/07380577.2021.1879411
Blair C, Walsh C, Best P (2021) Immersive 360° videos in health and social care education: a scoping review. BMC Med Educ 21:590. https://doi.org/10.1186/s12909-021-03013-y
Blaschke LM (2012) Heutagogy and lifelong learning: a review of heutagogical practice and self-determined learning. Int Rev Res Open Distance Learn 13:56–71. https://doi.org/10.19173/irrodl.v13i1.1076
Blaschke LM (2021) The dynamic mix of heutagogy and technology: preparing learners for lifelong learning. Br J Edu Technol 52:1629–1645. https://doi.org/10.1111/bjet.13105
Blaschke LM, Hase S (2015) Heutagogy, technology, and lifelong learning for professional and part-time learners. In: Dailey-Hebert A, Dennis K (eds) Transformative perspectives and processes in higher education, 1st edn. Springer, New York, pp 75–94
Blaschke LM, Hase S (2016) Heutagogy: a holistic framework for creating twenty-first- century self-determined learners. In: Gros B, Kinshuk Marcelo M (eds) The future of ubiquitous learning: learning designs for emerging pedagogies. Springer, pp 25–40
Blaschke LM, Hase S (2019) Heutagogy and digital media networks: setting students on the path to lifelong learning. Pac J Technol Enhanc Learn 1:1–14. https://doi.org/10.24135/pjtel.v1i1.1
Braun V, Clarke V (2022) Thematic analysis: a practical guide, 1st edn. SAGE Publications Ltd, London
Breakstone J, Smith M, Wineburg S et al (2021) Students’ civic online reasoning: a national portrait, 1st edn
Brooke J (1986) SUS: a quick and dirty usability scale. In: Jordan PW, Thomas B, Weermeester BA, McClelland AL (eds) Usability evaluation in industry. Taylor and Francis, London, pp 1–7
Canning N (2010) Playing with heutagogy: exploring strategies to empower mature learners in higher education. J Furth High Educ 34:59–71. https://doi.org/10.1080/03098770903477102
Canning N, Callan S (2010) Heutagogy: spirals of reflection to empower learners in higher education. Reflective Pract 11:71–82. https://doi.org/10.1080/14623940903500069
Carbogim FD, Barbosa AC, de Oliviera LB et al (2018) Educational intervention to improve critical thinking for undergraduate nursing students: a randomized clinical trial. Nurse Educ Pract 33:121–126. https://doi.org/10.1016/j.nepr.2018.10.001
Carter AG, Müller A, Graham K et al (2022) Critical thinking development in undergraduate midwifery students: an Australian validation study using Rasch analysis. BMC Pregnancy Childbirth. https://doi.org/10.1186/s12884-022-05303-9
Cazzell M, Anderson M (2016) The impact of critical thinking on clinical judgment during simulation with senior nursing students. Nurse Educ Perspect 37:83–90
Chan ZCY (2013) A systematic review of critical thinking in nursing education. Nurse Educ Today 33:236–240. https://doi.org/10.1016/j.nedt.2013.01.007
Cheah WH, Mat Jusoh N, Aung MMT et al (2023) Mobile technology in medicine: development and validation of an adapted System Usability Scale (SUS) questionnaire and modified technology acceptance model (TAM) to evaluate user experience and acceptability of a mobile application in MRI safety screening. Indian J Radiol Imaging 33:36–45. https://doi.org/10.1055/s-0042-1758198
Cochrane T, Stretton T, Aiello S et al (2018) Authentic interprofessional health education scenarios using mobile VR. Res Learn Technol 26. https://doi.org/10.25304/rlt.v26.2130
Cohen J (1988) Statistical power analysis for the behavioral sciences, 2nd edn. L. Erlbaum Associates, Hillsdale
Cragun DL, Hunt PP, Dean M et al (2024) Creation and beta testing of a choose your own adventure digital simulation to reinforce motivational interviewing skills in genetic counseling. J Genet Couns 33:15–27. https://doi.org/10.1002/jgc4.1833
Dewey J (1910) What is thought? In: Dewey J (ed) How we think, 1st edn. D. C. Heath, Lexington, pp 1–13
Dissen A (2023) A critical issue: assessing the critical thinking skills and dispositions of undergraduate health science students. Discover Educ 2:21. https://doi.org/10.1007/s44217-023-00044-z
Dwyer CP (2023) An evaluative review of barriers to critical thinking in educational and real-world settings. J Intell 11:105. https://doi.org/10.3390/jintelligence11060105
Escalada-Hernandez P, Soto-Ruiz N, Ballesteros-Egüés T et al (2024) Usability and user expectations of a HoloLens-based augmented reality application for learning clinical technical skills. Virtual Real. https://doi.org/10.1007/s10055-024-00984-3
Evens M, Empsen M, Hustinx W (2023) A literature review on 360-degree video as an educational tool: towards design guidelines. J Comput Educ 10:325–375. https://doi.org/10.1007/s40692-022-00233-z
Facione PA (1990) Critical Thinking: a Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, (The Delphi Report). https://insightassessment.com/iaresource/the-delphi-report-a-statement-of-expert-consensus-on-the-definition-of-critical-thinking/. Accessed 5 September 2024
Facione N, Facione P (2023) Health Sciences Reasoning Test (HSRT): user manual and resource guide, 2023 edn. California Academic Press, Millbrae
Flowers M, Yates C, Fletcher J et al (2020) Does dosing of pediatric experiential learning impact the development of clinical reasoning, self-efficacy, and critical thinking in DPT students? J Allied Health 49:190–196
Forneris SG (2015) Enhancing clinical reasoning through simulation debriefing: a multisite study. Nurs Educ Perspect 36:304–310. https://doi.org/10.5480/15-1672
Furze J, Black L, Hoffman J et al (2015) Exploration of students’ clinical reasoning development in Professional Physical Therapy Education. J Phys Ther Educ 29:22–33. https://doi.org/10.1097/00001416-201529030-00005
Hase S (2009) Heutagogy and e-learning in the workplace: some challenges and opportunities. Impact J Appl Res Workplace E-learning. https://doi.org/10.5043/impact.13
Hase S, Kenyon C (2000) From andragogy to heutagogy. UltiBase articles. https://www.researchgate.net/publication/301339522_From_andragogy_to_heutagogy. Accessed 26 Sept 2024
Hase S, Kenyon C (2007) Heutagogy: a child of complexity theory. Complicity: Int J Complex Educ 4:111–118. https://doi.org/10.29173/cmplct8766
Hattie J, Timperley H (2007) The power of feedback. Rev Educ Res 77:81–112. https://doi.org/10.3102/003465430298487
Hu Z (2023) Promoting critical thinking through socratic questions in health sciences work-integrated learning. Int J Learn Teach Educ Res 22:137–151. https://doi.org/10.26803/ijlter.22.6.8
Huhn K, Black L, Jensen GM et al (2011) Construct validity of the health science reasoning test. J Allied Health 40:181–186
Huhn K, Black L, Jensen GM et al (2013) Tracking change in critical-thinking skills. J Phys Ther Educ 27:26–31. https://journals.lww.com/jopte/fulltext/2013/07000/tracking_change_in_critical_thinking_skills.5.aspx
Jans C, Bogossian F, Andersen P et al (2023) Examining the impact of virtual reality on clinical decision making: an integrative review. Nurse Educ Today 125:105767. https://doi.org/10.1016/j.nedt.2023.105767
Jogerst K, Chou E, Tanious A et al (2022) Virtual Simulation of intra-operative decision-making for open abdominal aortic aneurysm repair: a mixed methods analysis. J Surg Educ 79:1043–1054. https://doi.org/10.1016/j.jsurg.2022.03.004
Ju H, Choi I (2017) The role of argumentation in hypothetico-deductive reasoning during problem-based learning in medical education: a conceptual framework. Interdisciplinary J Problem-based Learn 12. https://doi.org/10.7771/1541-5015.1638
Karlsen B, Hillestad TM, Dysvik E (2021) Abductive reasoning in nursing: challenges and possibilities, 1st edn. Blackwell Publishing Ltd, Oxford
Koivisto J, Haavisto E, Niemi H et al (2018) Design principles for simulation games for learning clinical reasoning: a design-based research approach. Nurse Educ Today 60:114–120. https://doi.org/10.1016/j.nedt.2017.10.002
Kuhn D, Dean JD (2004) Metacognition: a bridge between cognitive psychology and educational practice. Theory Pract 43:268–273. https://doi.org/10.1207/s15430421tip4304_4
Larsson K (2017) Understanding and teaching critical thinking—a new approach. Int J Educ Res 84:32–42. https://doi.org/10.1016/j.ijer.2017.05.004
Lave J, Wenger E (1991) Situated learning: legitimate peripheral participation, 1st edn. Cambridge University Press, Cambridge
Lewis JR, Sauro J (2009) The factor structure of the System Usability Scale. In: Kurosu M (ed) Human centered design 2009, 1st edn. Springer, Berlin, pp 94–103
Lin C-C, Han C-Y, Chen L-C et al (2023) Undergraduate nurses’ reflections on visual thinking learning to construct inductive reasoning through situated patient pictures: a mixed-method study. Nurse Educ Today 131:105991. https://doi.org/10.1016/j.nedt.2023.105991
Litten K, Stewart MP (2023) Implementing a choose your own adventure activity to improve insulin decision making. Curr Pharm Teach Learn 15:149–154. https://doi.org/10.1016/j.cptl.2023.02.020
Macdiarmid R, Merrick E, Winnington R (2024) Using unfolding case studies to develop critical thinking for graduate entry nursing students: an educational design research study. BMC Nurs 23:399. https://doi.org/10.1186/s12912-024-02076-8
McKenney SE, Reeves TC (2019) Conducting educational design research, 2nd edn. Routledge, Taylor & Francis Group, London
Nicolini D, Scarbrough H, Gracheva J (2016) Communities of practice and situated learning in health care. Oxford University Press, Oxford
O’Connor S, Zhang M, Trout KK et al (2021) Co-production in nursing and midwifery education: a systematic review of the literature. Nurse Educ Today 102. https://doi.org/10.1016/j.nedt.2021.104900
Oliver KM (2000) Methods for developing constructivist learning on the web. Educ Technol 40:5–18
Paul R, Elder L (2006) Critical thinking: learn the tools the best thinkers use, concise edn. Pearson/Prentice Hall, Upper Saddle River
Poynter Institute for Media Studies (2022) A Global Study on information literacy: understanding generational behaviors and concerns around false and misleading information online. https://www.poynter.org/wp-content/uploads/2022/08/A-Global-Study-on-Information-Literacy-1.pdf. Accessed 14 Aug 2024
Rupp MA, Kozachuk J, Michaelis JR et al (2019) Investigating learning outcomes and subjective experiences in 360-degree videos, 1st edn. Elsevier Ltd, London
Saab MM, McCarthy M, O’Mahony B et al (2023) Virtual reality simulation in nursing and midwifery education: a usability study. Comput Inf Nurs 41:815–824. https://journals.lww.com/cinjournal/fulltext/2023/10000/virtual_reality_simulation_in_nursing_and.12.aspx
Saghafi F, Blakey N, Guinea S et al (2024) Effectiveness of simulation in nursing students’ critical thinking scores: a pre-/post-test study. Clin Simul Nurs 89:101500. https://doi.org/10.1016/j.ecns.2023.101500
Sauro J (2018) 5 ways to Interpret a SUS Score. https://measuringu.com/interpret-sus-score/. Accessed 28 Sept 2024
Schvaneveldt N, Diekema AR, Hopkins ES et al (2022) New nurses apply only basic source evaluation criteria but realize their skills are lacking: more sophisticated approaches to teaching evaluation skills are required. Health Inf Libr J 39:166–177. https://doi.org/10.1111/hir.12395
Snelson C, Hsu Y-C (2020) Educational 360-degree videos in virtual reality: a scoping review of the emerging research. TechTrends 64:404–412. https://doi.org/10.1007/s11528-019-00474-3
Stretton T, Cochrane T (2023) Reality check: Insights on critical thinking in health education through mobile mixed reality. Australasian Society for Computers in Learning in Tertiary Education (ASCILITE) annual conference, 2023- People, partnerships and pedagogies, Christchurch, New Zealand
Stretton T, Cochrane T, Sevigny C et al (2024) Exploring mobile mixed reality for critical thinking skills in healthcare education: a systematic review. Nurse Educ Today 133:106072. https://doi.org/10.1016/j.nedt.2023.106072
Thomas A, Menon A, Boruff J et al (2014) Applications of social constructivist learning theories in knowledge translation for healthcare professionals: a scoping review. Implement Sci 9:54–74. https://doi.org/10.1186/1748-5908-9-54
Thomas SP, Fathy R, Aepli S et al (2022) Comparative evaluation of choose your own adventure and traditional linear case formats in radiology small group teaching. Acad Radiol 29:585–588. https://doi.org/10.1016/j.acra.2021.10.022
Treasure-Jones T, Joynes V (2018) Co-design of technology-enhanced learning resources. Clin Teach 15:281–286. https://doi.org/10.1111/tct.12733
Vygotsky LS (1978) Mind in society: the development of higher psychological processes, 1st edn. Harvard University Press, Cambridge
Wenger E (1999) Communities of practice: learning, meaning, and identity, 1st edn. Cambridge University Press, Cambridge
Wenger E (2004) Knowledge management as a doughnut: shaping your knowledge strategy through communities of practice. Ivey Bus J 68:1–8
Wisniewski B, Zierer K, Hattie J (2020) The power of feedback revisited: a meta-analysis of educational feedback research. Front Psychol 10. https://doi.org/10.3389/fpsyg.2019.03087
Yoo S, Heo S, Song S et al (2024) Adoption of augmented reality in educational programs for nurses in intensive care units of tertiary academic hospitals: mixed methods study. JMIR Serious Games 12:e54188. https://doi.org/10.2196/54188
Zlamal J, Roth Gjevjon E, Fossum M et al (2022) Technology-supported guidance models stimulating the development of critical thinking in clinical practice: mixed methods systematic review. JMIR Nurs 5:e37380. https://doi.org/10.2196/37380
Funding
Open Access funding enabled and organized by CAUL and its Member Institutions.
Author information
Contributions
T.S. and T.C. developed the study conception, methodology and design. T.S. wrote the main manuscript. Material preparation, data collection and analysis were performed by T.S, T.C. and J.S. T.S. prepared Figs. 1, 2 and 3. All authors reviewed the manuscript.
Ethics declarations
Competing interests
The authors declare no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Stretton, T., Cochrane, T., Sevigny, C. et al. Co-designing critical thinking in health professional education: a 360 immersive environment case study. Virtual Reality 29, 40 (2025). https://doi.org/10.1007/s10055-025-01115-2