Extended Reality (XR) Toward Building Immersive Solutions: The Key to Unlocking Industry 4.0

Published: 25 April 2024

Abstract

When developing XR applications for Industry 4.0, it is important to integrate visual displays, hardware components, and multimodal interaction techniques that are compatible with the system as a whole. Multimodal interaction in industrial applications has been recognized as a significant factor in enhancing humans’ ability to perform tasks and make informed decisions. To offer a comprehensive analysis of current advancements in industrial XR, this review presents a structured tutorial that answers the following research questions: (R.Q.1) What are the similarities and differences among the XR technologies, namely augmented reality (AR), mixed reality (MR), augmented virtuality (AV), and virtual reality (VR), in the context of Industry 4.0? (R.Q.2) What types of visual displays and hardware devices are needed to present XR for Industry 4.0? (R.Q.3) How is multimodal interaction in XR perceived, and how does it relate to Industry 4.0? (R.Q.4) How have modern adaptations of XR technologies addressed the theme of Industry 4.0? (R.Q.5) How can XR technologies in Industry 4.0 evolve their services and uses to become more solution-inclusive? This review showcases examples that demonstrate XR’s potential to transform how humans interact with the physical world in Industry 4.0; these advancements can increase productivity, reduce costs, and enhance safety.

Published In

ACM Computing Surveys, Volume 56, Issue 9
September 2024, 980 pages
EISSN: 1557-7341
DOI: 10.1145/3613649

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 25 April 2024
Online AM: 14 March 2024
Accepted: 07 March 2024
Revised: 30 December 2023
Received: 30 June 2022
Published in CSUR Volume 56, Issue 9

Author Tags

  1. Extended reality (XR)
  2. augmented reality (AR)
  3. virtual reality (VR)
  4. mixed reality (MR)
  5. augmented virtuality (AV)
  6. 4IR
  7. Industry 4.0

Qualifiers

  • Survey

Cited By

  • (2025) Intelligent Human–Computer Interaction for Building Information Models Using Gesture Recognition. Inventions 10(1), 5. DOI: 10.3390/inventions10010005. Online publication date: 16-Jan-2025.
  • (2024) Cybersecurity and Privacy Challenges in Extended Reality: Threats, Solutions, and Risk Mitigation Strategies. Virtual Worlds 4(1), 1. DOI: 10.3390/virtualworlds4010001. Online publication date: 30-Dec-2024.
  • (2024) Remote Extended Reality With Markerless Motion Tracking for Sitting Posture Training. IEEE Robotics and Automation Letters 9(11), 9860–9867. DOI: 10.1109/LRA.2024.3460412. Online publication date: Nov-2024.
  • (2024) Action Recognition System Using Full-body XR Devices for Sports Metaverse Games. 2024 15th International Conference on Information and Communication Technology Convergence (ICTC), 1962–1965. DOI: 10.1109/ICTC62082.2024.10827474. Online publication date: 16-Oct-2024.
  • (2024) Exploring emerging technologies: librarians’ awareness, challenges and ethical perspectives in Thai library contexts, with a focus on the metaverse. Digital Library Perspectives 40(3), 377–391. DOI: 10.1108/DLP-12-2023-0111. Online publication date: 28-Jun-2024.
