
Computers & Graphics

Volume 34, Issue 6, December 2010, Pages 689-697

Computer Graphics in Spain: a Selection of Papers from CEIG 2009
Haptic rendering of objects with rigid and deformable parts

https://doi.org/10.1016/j.cag.2010.08.006

Abstract

In many haptic applications, the user interacts with the virtual environment through a rigid tool. Tool-based interaction is suitable in many applications, but the constraint of using rigid tools is not applicable to some situations, such as the use of catheters in virtual surgery, or of a rubber part in an assembly simulation. Rigid-tool-based interaction is also unable to provide force feedback regarding interaction through the human hand, due to the soft nature of human flesh. In this paper, we address some of the computational challenges of haptic interaction through deformable tools, which forms the basis for direct-hand haptic interaction. We describe a haptic rendering algorithm that enables interactive contact between deformable objects, including self-collisions and friction. This algorithm relies on a deformable tool model that combines rigid and deformable components, and we present the efficient simulation of such a model under robust implicit integration.

Research highlights

► The handle of a tool is always rigid due to the grasp.
► Coupling the device to the handle allows rendering contact with complex tools.
► Force feedback is computed using a linear model of this coupling.
► This model is aware of the dynamics and contact state of the tool.

Introduction

Haptic rendering is the computational technology that allows us to interact with virtual worlds through the sense of touch. It relies on an algorithm that simulates a virtual world in a physically based manner and computes interaction forces, and on a robotic device that transmits those interaction forces to the user. Haptics science is a multidisciplinary field that brings together psychophysics research for the understanding of tactile cues and human perception; mechanical engineering for the design of robotic devices; control theory for the analysis of the coupling between the real and virtual worlds; and computer science, in particular computer graphics, for the simulation of the virtual world and the design of the haptic rendering algorithm [29].

In this paper, we focus on the haptic rendering of objects that are composed of both rigid and deformable parts. Many of the objects in the real world, including our own flesh, fit this description. In particular, our haptic rendering algorithm serves as the main building block for model-based computation of direct-hand haptic interaction. The hand is the main tactile sensor used by humans to capture information from the world [25]. It allows us to manipulate the world and provides us with two-way interaction with our environment. Despite the importance of the human hand for haptic interaction, current haptic devices and haptic rendering techniques suffer important limitations that have prevented hand-based virtual touch from reaching its full potential. Haptic rendering is typically carried out either through a pen-like robotic device or through vibrotactile devices. In addition to hardware limitations, most haptic rendering algorithms are limited to point-based interaction between the user and the objects in the environment. There are some notable exceptions, both in devices (e.g., exoskeleton structures [23], [7], [33]) and in object-object rendering algorithms, also called 6-degree-of-freedom (6-DoF) haptic rendering [31], [39], [6].

We follow a strategy for haptic rendering of tool-based contact where a virtual replica of the tool is simulated in a virtual world, and haptic interaction takes place through a viscoelastic coupling between the haptic device and the tool [11]. Following this strategy, direct-hand touch can be rendered by simulating a virtual hand as the tool (as shown in Fig. 1), and coupling the device to this simulated hand. In Section 3, we summarize the adaptation of a multirate 6-DoF haptic rendering algorithm to model multipoint contact with objects composed of rigid and deformable parts, such as a hand. This algorithm does not suffer from the limitations of traditional single-point contact models for capturing the interaction between soft fingers and the environment.
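The viscoelastic coupling of [11] acts as a spring-damper between the device configuration and the simulated tool. A minimal sketch of such a coupling follows; the function name and gain values are illustrative choices, not the paper's parameters:

```python
import numpy as np

def coupling_force(x_dev, v_dev, x_tool, v_tool, k=200.0, b=2.0):
    """Spring-damper (viscoelastic) coupling between the haptic device
    configuration and the simulated tool. Gains k and b are illustrative."""
    x_dev, v_dev = np.asarray(x_dev, float), np.asarray(v_dev, float)
    x_tool, v_tool = np.asarray(x_tool, float), np.asarray(v_tool, float)
    return k * (x_dev - x_tool) + b * (v_dev - v_tool)

# The same force, negated, is applied to the simulated tool, while the
# device renders it to the user as feedback.
f = coupling_force([0.01, 0.0, 0.0], [0.0] * 3, [0.0] * 3, [0.0] * 3)
```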

In Section 4, taking the hand as an example model, we describe how to connect rigid and deformable components and how to actuate them through standard haptic devices. We exploit the position and orientation tracked by a haptic device to guide the rigid components of the virtual hand, and we model the coupling between the rigid and deformable components of the hand using stiff connections and implicit integration. As a first step toward full-hand haptic rendering, we consider the hand to be a shape formed by one rigid component and elastic flesh, and we do not take into account the articulations of the fingers.
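The reason stiff connections are viable here is implicit integration: a backward-Euler step remains stable for stiffness values that would make an explicit step diverge. A one-dimensional sketch of such a step for a point mass tied to an anchor (all values illustrative, not the paper's model):

```python
def implicit_spring_step(x, v, x_anchor, k, m, h):
    """One backward-Euler step for a point mass m attached to an anchor by
    a spring of stiffness k. Solving the implicit velocity update
        v' = v + (h/m) * k * (x_anchor - x - h * v')
    in closed form keeps the step stable even for very stiff connections."""
    v_new = (m * v + h * k * (x_anchor - x)) / (m + h * h * k)
    x_new = x + h * v_new
    return x_new, v_new
```

For k = 10^6 and h = 0.01 the node settles onto the anchor within a few steps instead of exploding, which is what makes a very stiff rigid-to-deformable connection practical.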

A computational algorithm, described in Section 5, allows us to simulate a virtual hand using existing implementations of rigid-body and deformable-body simulations as black boxes. The use of implicit integration couples the degrees of freedom of the rigid and deformable components of the virtual hand, but our algorithm allows us to work around these couplings and still use existing implementations as black boxes.
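One generic way to reuse two independent solvers while respecting the coupling is a fixed-point (block Gauss-Seidel style) iteration over the coupling force. The sketch below illustrates that idea only; it is not the paper's exact algorithm, and the solver interfaces are hypothetical:

```python
def coupled_step(step_rigid, step_deformable, f_couple, x_r, x_d, iters=50):
    """Fixed-point iteration over two black-box solvers: each pass
    evaluates the coupling force on the rigid handle, then advances each
    component with its own solver (force on one, reaction on the other)."""
    for _ in range(iters):
        f = f_couple(x_r, x_d)          # coupling force on the rigid handle
        x_r = step_rigid(x_r, f)        # black-box rigid-body solve
        x_d = step_deformable(x_d, -f)  # black-box deformable solve (reaction)
    return x_r, x_d

# Toy 1-D stand-ins for the two solvers: each maps (state, force) to the
# next state; the coupling pulls the two states together.
step = lambda x, f: x + 0.1 * f
x_r, x_d = coupled_step(step, step, lambda xr, xd: xd - xr, 0.0, 1.0)
```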

We show our haptic rendering algorithm applied to the interaction between a hand model and other complex deformable objects with friction (see Fig. 5), with several moving deformable objects (see Fig. 6), and with self-collisions between the fingers (see Fig. 3). In Section 6, we also analyze the impact of various parameters of the virtual scene on the computational cost of our haptic rendering algorithm.

Section snippets

Related work

The rendering of haptic interaction with interesting virtual environments relies to a large extent on the ability to efficiently simulate effects such as deformations and contact among the objects in the virtual environment. We refer the reader to surveys on those topics for more information [18], [35], [28], [47]. The efficiency of the techniques for solving contact and deformation problems is growing steadily (see [5], [45], [20], [22], [43], [40] for some recent examples), but the complexity

Overview of the haptic rendering algorithm

Haptic rendering typically distinguishes interaction through a tool object from direct interaction with the hand. Tool-based rendering algorithms simulate a virtual model of the tool and compute contact between the tool and other objects in the environment using robust contact-modeling algorithms. Direct-hand rendering algorithms, on the other hand, track points on the finger tips and model contact forces based on single-point penalty-based models (see [13] for an example). Here, we present a

Hand model with rigid and deformable parts

To model the elasticity of the hand (our deformable tool), we use a linear co-rotational finite element model [34]. To apply the haptic rendering algorithm described in the previous section, we need to identify a rigid handle for each rigid frame tracked by the haptic device. In our case, using the Phantom Omni as the haptic device, we need one rigid handle. We choose to locate the rigid handle at the palm of the hand model. We need to define the rigid-body properties of the handle, but we do
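The linear co-rotational approach [34] evaluates a precomputed linear stiffness in a rotated rest frame, which keeps large rotations from producing the ghost forces of plain linear FEM. A minimal sketch, with the element rotation extracted by polar decomposition via SVD (shown for a single node's coordinates; in practice the rotation is applied blockwise to the element's stacked nodal positions):

```python
import numpy as np

def polar_rotation(F):
    """Rotation factor of a 3x3 deformation gradient F via SVD."""
    U, _, Vt = np.linalg.svd(F)
    if np.linalg.det(U @ Vt) < 0:   # guard against reflections
        U[:, -1] *= -1
    return U @ Vt

def corotational_force(K, R, x, x0):
    """Co-rotational elastic force: rotate current positions back to the
    rest frame, apply the precomputed linear stiffness K, and rotate the
    resulting force forward with the element rotation R."""
    return -R @ (K @ (R.T @ x - x0))

# At the rest configuration the elastic force vanishes.
x0 = np.array([1.0, 0.0, 0.0])
f = corotational_force(np.eye(3), polar_rotation(np.eye(3)), x0, x0)
```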

Implicit solution for rigid and deformable parts

As part of the haptic rendering algorithm, we need to solve a constrained dynamics problem involving the deformable hand, the rigid handle, and possibly other dynamic objects in the virtual environment. Then, the coupling force between the tool-hand and the handle must be linearized w.r.t. the handle state, accounting for contact constraints and inertial effects. Both the constrained dynamics solver and the linearization require an efficient solution to the coupled dynamics of the tool-hand and
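The role of the linearization can be illustrated with a finite-difference Jacobian of the coupling force with respect to the handle state, yielding a local model f(x) ≈ f0 + J (x - x_h) that a high-rate haptic loop can evaluate cheaply. This is a conceptual illustration only, not the paper's analytic linearization:

```python
import numpy as np

def linearize(f, x_h, eps=1e-6):
    """Finite-difference Jacobian of a force function f w.r.t. the handle
    state x_h, giving the local linear model f(x) ~ f0 + J @ (x - x_h)."""
    x_h = np.asarray(x_h, float)
    f0 = np.asarray(f(x_h), float)
    J = np.zeros((f0.size, x_h.size))
    for i in range(x_h.size):
        dx = np.zeros(x_h.size)
        dx[i] = eps
        J[:, i] = (np.asarray(f(x_h + dx)) - f0) / eps
    return f0, J

# Sanity check on a linear force: the recovered Jacobian is the matrix itself.
A = np.array([[2.0, 0.0], [0.0, 3.0]])
f0, J = linearize(lambda x: A @ x, [1.0, 1.0])
```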

Experiments

We executed our experiments on a quad-core 2.4 GHz PC with 3 GB of memory (although we used only two processors, for the visual and haptic loops) and a GeForce 8800 GTS. We manipulated the models using a Phantom Omni haptic device from SensAble Technologies.

We tested our algorithm with some 3D scenarios consisting of one deformable tool (a model of a human hand, in most experiments) and a set of rigid and deformable objects. A video showing some of our experiments can be downloaded from //www.gmrv.es/cgarre/DivX_GarreOtaduyCG.avi

Discussion and future work

We have presented a haptic rendering algorithm for computing force interaction through deformable tools such as a human hand. Our algorithm differs from previous rendering algorithms in that it employs a deformable tool model that couples rigid and deformable components, and we have presented a computational algorithm that efficiently solves implicit integration under such coupling, thus allowing effective haptic rendering.

Our proposed algorithm constitutes a step forward toward full-hand

Acknowledgements

We would like to thank the anonymous reviewers for their helpful comments, and the GMRV group at URJC. This work was funded in part by the URJC - Comunidad de Madrid Project CCG08-URJC/DPI-3647 and by the Spanish Science and Innovation Dept. Projects TIN2009-07942 and PSE-300000-5.

References (49)

  • Adachi Y, Kumano T, Ogino K. Intermediate representation for stiff virtual objects. In: Virtual reality annual...
  • Astley OR, Hayward V. Multirate haptic simulation achieved by coupling finite element meshes through norton...
  • Baraff D, Witkin AP. Large steps in cloth simulation. In: Proceedings of ACM SIGGRAPH,...
  • Barbagli F, Prattichizzo D, Salisbury KJ. Dynamic local models for stable multi-contact haptic interaction with...
  • Barbič J, James DL. Real-time subspace integration for St. Venant-Kirchhoff deformable models. In: Proceedings of ACM...
  • Barbič J, James DL. Six-DoF haptic rendering of contact between geometrically complex reduced deformable models. IEEE Transactions on Haptics (2008)
  • Bouzit M, et al. The Rutgers Master II—new design force-feedback glove. IEEE Transactions on Mechatronics (2004)
  • Çavuşoğlu MC, Tendick F. Multirate simulation for high fidelity haptic interaction with deformable objects in virtual...
  • Cobos S, Ferre M, Sanchez Uran M, Ortego J, Pena C. Efficient human hand kinematics for manipulation tasks. In:...
  • Colgate JE, Brown JM. Factors affecting the z-width of a haptic display. In: IEEE international conference on robotics...
  • Colgate JE, Stanley MC, Brown JM. Issues in the haptic display of tool use. In: Proceedings of IEEE/RSJ international...
  • Constantinescu D, et al. Local model of interaction for realistic manipulation of rigid virtual worlds. International Journal of Robotics Research (2005)
  • de Pascale M, Sarcuni G, Prattichizzo D. Real-time soft-finger grasping of physically based quasi-rigid objects. In:...
  • Duriez C, Andriot C, Kheddar A. Signorini's contact model for deformable objects in haptic simulations. In: Proceedings...
  • Duriez C, et al. Realistic haptic rendering of interacting deformable objects in virtual environments. IEEE TVCG (2006)
  • Garre C, Otaduy MA. Haptic rendering of complex deformations through handle-space force linearization. In: Proceedings...
  • Garre C, Otaduy MA. Toward haptic rendering of full-hand touch. In: Proceedings of Spanish computer graphics conference...
  • Gibson SF, Mirtich BV. A survey of deformable modeling in computer graphics. Technical Report, Mitsubishi Electric...
  • Golub GH, et al. Matrix computations (1996)
  • Harmon D, Vouga E, Tamstorf R, Grinspun E. Robust treatment of simultaneous collisions. In: Proceedings of ACM...
  • Johnson DE, et al. 6-DOF haptic rendering using spatialized normal cone search. IEEE TVCG (2005)
  • Kaufman DM, Sueda S, James DL, Pai DK. Staggered projections for frictional contact in multibody systems. In:...
  • Kawasaki H, Mouri T, Osama M, Sugihashi Y, Ohtuka Y, Ikenohata S, et al. Development of five-fingered haptic interface:...
  • Kim YJ, et al. Six-degree-of-freedom haptic rendering using incremental and localized computations. Presence (2003)