Micro-information-level AR instruction: a new visual representation supporting manual classification of similar assembly parts

Published in: Multimedia Tools and Applications

Abstract

In AR operation guidance and training, two problems remain in the visual representation of AR instructions for assembly parts with similar geometric shapes: (1) AR instructions cannot accurately represent the micro-geometric differences between similar parts, and (2) no design specification has been formulated for AR instructions that reflect these micro-geometric differences. To address these problems, our team carried out the following work. First, the geometric features of parts are defined at the micro-geometric level and the micro-information level, and the relationships and differences between the features of similar parts are explained from these two levels. Second, a mathematical model of the geometric features of the parts is established, and control parameters in the model are given to characterize the feature differences between similar parts. To verify the accuracy of the control parameters, we designed three AR instructions based on them and evaluated the instructions through five hypotheses. Our team then analyzed the data from a case study, focusing the discussion on test results that did not meet expectations and examining the differences in depth. Finally, three implications of AR instructions for representing feature differences between similar parts are given, and directions for future research on such work are indicated.
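The paper's actual feature model and control parameters are not reproduced in this preview, so as an illustration only, a parametric measure of micro-geometric difference between similar parts might be sketched as below. All feature names, values, and weights here are hypothetical placeholders, not the paper's definitions:

```python
from dataclasses import dataclass
import math

@dataclass
class PartFeature:
    """Hypothetical micro-geometric descriptors of an assembly part.

    The field names (hole_diameter, chamfer_angle, edge_length) are
    illustrative placeholders, not the control parameters of the paper.
    """
    hole_diameter: float  # mm
    chamfer_angle: float  # degrees
    edge_length: float    # mm

def micro_difference(a: PartFeature, b: PartFeature,
                     weights=(1.0, 1.0, 1.0)) -> float:
    """Weighted Euclidean distance between two feature vectors.

    A larger value means the two 'similar' parts are easier to tell
    apart, so an AR instruction could emphasize the dominant differing
    feature when guiding manual classification.
    """
    deltas = (
        a.hole_diameter - b.hole_diameter,
        a.chamfer_angle - b.chamfer_angle,
        a.edge_length - b.edge_length,
    )
    return math.sqrt(sum(w * d * d for w, d in zip(weights, deltas)))

# Two near-identical parts differing only in chamfer angle:
p1 = PartFeature(hole_diameter=6.0, chamfer_angle=30.0, edge_length=12.0)
p2 = PartFeature(hole_diameter=6.0, chamfer_angle=45.0, edge_length=12.0)
print(micro_difference(p1, p2))  # 15.0 with unit weights
```

With unit weights the measure reduces to plain Euclidean distance; per-feature weights let a single dominant micro-difference (here, the chamfer angle) drive which cue the AR instruction highlights.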


Data Availability

Not applicable.

Code availability

Not applicable.


Acknowledgments

We would like to thank the anonymous reviewers for their constructive suggestions for improving this paper. We also thank Zhishuo Xiong of the London School of Economics and Political Science for checking the English of an earlier version of the manuscript and helping the authors correct grammatical errors. We particularly thank the CPILab VR/AR team of Northwestern Polytechnical University for its contribution to this study, and the volunteers from the University of Shanghai for Science and Technology who participated in the experiment.

Funding

This work is partly supported by the Defense Industrial Technology Development Program (No. XXXX2018213A001) and by SASTIND China under Grant JCKY2018205B021.

Author information

Authors and Affiliations

Authors

Contributions

Yang Wang provided several valuable design solutions for the UX experiment. Jie Zhang and Yueqing Zhang established the basic hardware environment for our research. Yuxiang Yan overcame the key technical difficulties of this research, and Xiangyu Zhang did extensive work collecting the experimental data. In particular, we would like to thank Prof. Weiping He and Associate Prof. Xiaoliang Bai for their constructive comments on improving the experiment.

Corresponding author

Correspondence to Yang Wang.

Ethics declarations

Conflicts of interest/competing interests

Our team declares that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature or kind in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, this manuscript.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wang, Z., Wang, Y., Bai, X. et al. Micro-information-level AR instruction: a new visual representation supporting manual classification of similar assembly parts. Multimed Tools Appl 82, 11589–11618 (2023). https://doi.org/10.1007/s11042-022-13574-9

