
Natural and hybrid bimanual interaction for virtual assembly tasks

  • Original Article
  • Published in: Virtual Reality

Abstract

This paper focuses on the simulation of bimanual assembly/disassembly operations for training or product design applications. Most assembly applications have been limited to simulating unimanual tasks, or bimanual tasks in which only one hand is instrumented. However, recent research has introduced the use of two haptic devices for bimanual assembly. We propose a more natural, lower-cost bimanual interaction than existing approaches, based on markerless motion capture (Mocap) systems. Specifically, this paper presents two interactions based on markerless Mocap technology and one interaction that combines markerless Mocap with haptic technology. A set of experiments following a within-subjects design was conducted to test the usability of the proposed interfaces. The markerless Mocap-based interactions were validated against a two-haptic-device interaction, as the latter has already been successfully integrated into bimanual assembly simulators. The pure markerless Mocap interaction proved to be either the most or the least efficient depending on its configuration (2D or 3D tracking, respectively). Usability results showed no significant differences between the proposed interactions and the two-haptic-device interaction. These results suggest that markerless Mocap or hybrid interactions are valid solutions for simulating bimanual assembly tasks when motion precision is not critical. The choice of technology should depend on the trade-off between the precision required to simulate the task, the cost, and the inherent features of each technology.
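The within-subjects comparison described above — each participant rating every interaction condition, then testing whether usability differs across conditions — is the kind of analysis a Friedman test captures. The sketch below is a minimal illustration under stated assumptions: the condition names, scores, and the `friedman` helper are all hypothetical, not the authors' data or method, and the implementation assumes no tied scores within a participant.

```python
import math

def friedman(*conditions):
    """Friedman chi-square for k related samples.

    Each argument is one condition's scores, aligned by participant.
    Assumes no tied scores within a participant (no tie correction).
    """
    k = len(conditions)
    n = len(conditions[0])
    rank_sums = [0.0] * k
    for subject in zip(*conditions):
        # Rank this participant's k scores (1 = lowest).
        order = sorted(range(k), key=lambda j: subject[j])
        for rank0, j in enumerate(order):
            rank_sums[j] += rank0 + 1
    stat = 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) \
        - 3 * n * (k + 1)
    # Chi-square survival function; the closed form below holds for df = 2,
    # i.e. exactly three conditions.
    p = math.exp(-stat / 2)
    return stat, p

# Hypothetical usability scores, one per participant per condition.
mocap  = [70, 75, 68, 80]
hybrid = [72, 71, 70, 78]
haptic = [74, 73, 69, 79]

stat, p = friedman(mocap, hybrid, haptic)
print(f"chi2={stat:.2f}, p={p:.3f}")  # a high p-value means no significant difference
```

A non-significant result here would mirror the paper's finding that the proposed interactions and the two-haptic-device interaction do not differ meaningfully in usability.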



Author information

Correspondence to Yaiza Vélaz.


Cite this article

Vélaz, Y., Lozano-Rodero, A., Suescun, A. et al. Natural and hybrid bimanual interaction for virtual assembly tasks. Virtual Reality 18, 161–171 (2014). https://doi.org/10.1007/s10055-013-0240-y
