Abstract
Collaboration is an important element of learning activities, but it is difficult to facilitate in remote education. Mixed reality provides the opportunity to create a seamless shared environment that combines learning materials, co-located students, and remote participants. To investigate how students can learn in such a mixed reality collaboration scenario while minimizing disruption to existing learning workflows, we developed the open-source system Realitybox Collab. It is a reusable H5P module based on the WebXR Device API specification and can be integrated into existing learning management systems such as Moodle. It allows lecturers to upload 3D models and annotate them with other interactive content. Students can then collaboratively view these learning activities in a shared space using virtual reality headsets, augmented reality on a smartphone, or a desktop computer. We evaluated the solution in a user study with 14 students. The results indicate high usability, show a positive attitude of the students towards collaboration in mixed reality, and validate the seamless integration of the WebXR system into the existing learning infrastructure. Realitybox Collab is a step towards widespread mixed reality collaborative learning solutions that are embedded in existing learning management systems and can be accessed on the Web from any device.
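To illustrate the cross-device access described in the abstract, the following minimal sketch shows how a WebXR-based viewer could select a session mode depending on the available hardware. This is an illustrative example based on the public WebXR Device API, not code taken from Realitybox Collab; the function names and feature choices are assumptions.

```typescript
// Minimal sketch (not the paper's implementation): pick a WebXR session mode so
// the same learning activity can be viewed on a VR headset, an AR-capable
// smartphone, or as a plain 3D scene on a desktop browser.

async function pickSessionMode(): Promise<'immersive-vr' | 'immersive-ar' | 'inline'> {
  const xr = (navigator as any).xr; // WebXR Device API entry point; may be undefined
  if (xr) {
    if (await xr.isSessionSupported('immersive-vr')) return 'immersive-vr';
    if (await xr.isSessionSupported('immersive-ar')) return 'immersive-ar';
  }
  return 'inline'; // no XR hardware: render the scene inside the page
}

// Usage: request the detected session type and hand it to the 3D renderer.
async function startViewer(canvas: HTMLCanvasElement): Promise<void> {
  const mode = await pickSessionMode();
  if (mode === 'inline') {
    // Fall back to ordinary WebGL rendering of the annotated 3D model.
    return;
  }
  const session = await (navigator as any).xr.requestSession(mode, {
    optionalFeatures: ['local-floor'], // room-scale reference space, if available
  });
  // ...attach the session to the render loop of the 3D engine in use.
  console.log('XR session started in mode:', mode, session);
}
```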
We thank the German Federal Ministry of Education and Research for their support within the project “Personalisierte Kompetenzentwicklung und hybrides KI-Mentoring” (tech4compKI; id: 16DHB2213).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Hensen, B., Kühlem, K. (2023). Collaborative Learning in Mixed Reality Using WebXR and H5P. In: Milrad, M., et al. Methodologies and Intelligent Systems for Technology Enhanced Learning, 13th International Conference. MIS4TEL 2023. Lecture Notes in Networks and Systems, vol 764. Springer, Cham. https://doi.org/10.1007/978-3-031-41226-4_25
DOI: https://doi.org/10.1007/978-3-031-41226-4_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-41225-7
Online ISBN: 978-3-031-41226-4
eBook Packages: Intelligent Technologies and Robotics (R0)