
The effect of image-cyclic-based guidance on user's skill enhancement in virtual assembly task

Published in: Multimedia Tools and Applications

Abstract

Two-dimensional (2D) and three-dimensional (3D) representations are used by different systems for diverse applications. 2D is the older form of image representation, displaying only the x and y axes, whereas a 3D image displays the x, y, and z axes simultaneously, so a 3D image on a screen resembles an object in the real world. With the rapid development of virtual technology, virtual assembly tasks give users a clear, real-world-like view of the task. Various cognitive aids (such as color changes and arrows) are provided in virtual assembly tasks to assist users in completing them. These aids increase users' performance but reduce learning, because they lower the cognitive load placed on the users. In this research, we propose Image Cyclic Based Guidance (ICBG) for enhancing the skills of polytechnic students in a virtual electric motor assembly task, enabling them to assemble the electric motor in the correct sequence in a user-friendly 3D environment without excessive mental load or the presence of an expert. We describe the potential contribution of the Virtual Electric Motor (VEM) environment to student learning based on ICBG, in which students are guided to assemble the electric motor and its different parts and to place these parts accurately using VEM. We evaluated the proposed system with polytechnic students. The evaluation revealed a significant difference: the performance of the students who used ICBG was considerably better than that of the students who did not.
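For illustration only, the following Python sketch shows one way a step-wise, image-cyclic guidance loop could be structured; it is not the authors' implementation. The step names, image file names, timing, and completion check are all hypothetical: the real ICBG system runs inside the 3D virtual electric-motor workshop and advances when a part is placed correctly, whereas this sketch simply uses a fixed frame budget per step.

```python
# Illustrative sketch only: a minimal step-wise, image-cyclic guidance loop.
# All step names, image file names, and the completion rule are hypothetical
# and are not taken from the paper.
from dataclasses import dataclass
from itertools import cycle
from typing import List


@dataclass
class AssemblyStep:
    name: str                   # e.g. "mount stator" (hypothetical label)
    guidance_images: List[str]  # images cycled while this step is active


def run_guidance(steps: List[AssemblyStep], frames_per_step: int = 6) -> None:
    """Cycle guidance images for each step until the step is 'completed'.

    Completion is simulated here with a fixed frame budget; a real system
    would instead check that the correct part was placed correctly.
    """
    for step in steps:
        image_cycle = cycle(step.guidance_images)
        for frame in range(frames_per_step):
            current_image = next(image_cycle)
            print(f"[{step.name}] frame {frame}: showing {current_image}")
        print(f"[{step.name}] step completed, advancing to the next part\n")


if __name__ == "__main__":
    motor_steps = [  # hypothetical ordering of electric-motor parts
        AssemblyStep("mount stator", ["stator_overview.png", "stator_align.png"]),
        AssemblyStep("insert rotor", ["rotor_overview.png", "rotor_insert.png"]),
        AssemblyStep("fit end cap", ["endcap_overview.png", "endcap_screws.png"]),
    ]
    run_guidance(motor_steps)
```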



Data availability

N/A.


Acknowledgments

Saifur Rahman and Tariq Hussain contributed equally to this work and are the first co-authors.

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 62172366) and the "Pioneer" and "Leading Goose" R&D Program of Zhejiang Province (2023C01150).

Author information


Contributions

S.R. and T.H. conceptualized this study, conducted the experiments, wrote the original draft, and revised the manuscript; B.Y. and A.H. developed the experimental plan, supervised the work, and revised the manuscript; S.R. and N.A. contributed to the evaluation of the developed technique and the analysis of the results, and revised the manuscript. All authors reviewed the manuscript.

Corresponding author

Correspondence to Bailin Yang.

Ethics declarations

Ethics approval

We confirm that all methods were carried out in accordance with the relevant guidelines and regulations.

Consent to publish

The authors declare that the research was conducted without any commercial or financial relationships that could be construed as a potential conflict of interest.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix 1

Q1: Interview questionnaire for the students in G1

1. Can the assembly task be easily performed in the virtual electric workshop using the general environment?

2. Is searching for/finding the electric motor parts in the general virtual electric workshop easy?

3. Can I perform the assembly task in the virtual electric motor workshop using the general environment without any cognitive load?

4. Is the proposed system more suitable for actual electric motor assembly tasks?

5. After performing the assembly task in the general environment, do I feel confident that I can efficiently perform the assembly in an actual electric motor workshop?

Q2: Interview questionnaire for the students in G2

1. Can the assembly task be efficiently performed in the virtual electric workshop using the arrow-based technique?

2. Is searching for/finding the electric motor parts in the arrow-based virtual electric workshop easy?

3. Can I perform the assembly task in the virtual electric motor workshop using the arrow-based technique without any cognitive load?

4. Is the proposed system more suitable for actual electric motor assembly tasks?

5. After performing the assembly task using the arrow-based technique, do I feel confident that I can quickly perform the assembly in the actual electric motor workshop?

Q3: Interview questionnaire for the students in G3

1. Can the assembly task be easily performed in the virtual electric workshop using textual tips?

2. Is searching for/finding the electric motor parts in the textual-tips virtual electric workshop easy?

3. Can I perform the assembly task in the virtual electric motor workshop using the textual-tips technique without any cognitive load?

4. Is the proposed system more suitable for actual electric motor assembly tasks?

5. After performing the assembly task using the textual-tips technique, do I feel confident that I can efficiently perform the assembly in the actual electric motor workshop?

Q4: Interview questionnaire for the students in G4

1. Can the assembly task be easily performed in the virtual electric workshop using the ICBG technique?

2. Is searching for/finding the electric motor parts in the ICBG virtual electric workshop easy?

3. Can I perform the assembly task in the virtual electric motor workshop using the ICBG technique without any cognitive load?

4. Is the proposed system more suitable for actual electric motor assembly tasks?

5. After performing the assembly task using the ICBG technique, do I feel confident that I can efficiently perform the assembly in the actual electric motor workshop?

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Rahman, S., Ali, N., Hussain, T. et al. The effect of image-cyclic-based guidance on user's skill enhancement in virtual assembly task. Multimed Tools Appl 83, 41823–41846 (2024). https://doi.org/10.1007/s11042-023-17175-y


Keywords

Navigation