
Enabling consistent hand-based interaction in mixed reality by occlusions handling

Published in: Multimedia Tools and Applications

Abstract

A mixed reality environment, namely the space resulting from displaying virtual contents co-registered to the real space, represents an effective paradigm for bringing the potential of virtual reality into everyday life instead of confining it within a computer screen. In this context, gesture-based interaction seems the most suitable approach to human-machine interfacing. However, for that interaction to be visually consistent, the three-dimensional composition of virtual objects onto the real background should be performed respecting the distance of each rendered pixel from the user's viewpoint. This paper describes a simple yet effective hand/finger-based interaction system and a virtual-to-real occlusion-handling approach, able to process the stereoscopic video see-through stream in real time to obtain pixel-wise z-order information, which is crucial for deciding whether each rendered pixel should be displayed. The experiments confirm the efficacy of the proposed method in a simulation context.
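The occlusion-handling idea described in the abstract can be sketched as a per-pixel z-test: depth is recovered from stereo disparity via the standard pinhole relation Z = f·B/d, and a rendered (virtual) pixel is displayed only where the virtual surface lies closer to the viewer than the real scene (e.g. the user's hand). The focal length and baseline below are hypothetical placeholders, and this minimal NumPy sketch illustrates the general technique rather than the paper's actual implementation:

```python
import numpy as np

FOCAL_PX = 700.0    # hypothetical focal length of the see-through cameras, in pixels
BASELINE_M = 0.06   # hypothetical stereo baseline, in metres


def depth_from_disparity(disparity, focal_px=FOCAL_PX, baseline_m=BASELINE_M):
    """Pinhole stereo relation Z = f * B / d; zero disparity maps to infinity."""
    depth = np.full(disparity.shape, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth


def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel z-order test: a virtual pixel is shown only where its depth
    is smaller than the reconstructed depth of the real scene, so real
    foreground objects (such as a hand) correctly occlude virtual content."""
    virt_in_front = virt_depth < real_depth
    out = real_rgb.copy()
    out[virt_in_front] = virt_rgb[virt_in_front]
    return out
```

A real system would feed `depth_from_disparity` with the disparity map produced by a stereo-matching algorithm running on the video see-through stream, once per frame and per eye.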



Corresponding author

Correspondence to F. Narducci.


Cite this article

Narducci, F., Ricciardi, S. & Vertucci, R. Enabling consistent hand-based interaction in mixed reality by occlusions handling. Multimed Tools Appl 75, 9549–9562 (2016). https://doi.org/10.1007/s11042-016-3276-7

