Abstract
Microsurgery is technically challenging, demanding both rigorous precision under the operating microscope and great care in tissue handling. Applying excessive force can cause irreversible tissue injury, yet sufficient force must be exerted to carry out manoeuvres efficiently. Technological advances in hand-held instruments have allowed force-sensing capabilities to be integrated into surgical tools, making force feedback during an operation possible. This paper presents a novel method of graduated online visual force feedback for hand-held microsurgical instruments. Unlike existing visual force-feedback techniques, the force information is integrated into the surgical scene by highlighting the area around the point of contact while preserving salient anatomical features. We demonstrate that the proposed technique can be integrated seamlessly with image-guidance techniques: critical anatomy beyond the exposed tissue surface is revealed through an augmented reality overlay when the user exerts large forces in its vicinity. The force information is further used to improve the quality of the augmented reality by displacing the overlay according to the forces exerted. Detailed user studies were performed to assess the efficacy of the proposed method.
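As a rough illustration of the graduated highlighting idea described in the abstract, the following Python sketch maps a measured tool-tip force to the opacity of a radial highlight around the instrument's contact point in the camera image. This is a minimal sketch under assumed parameters, not the authors' implementation: the functions force_to_alpha and render_force_highlight, and the thresholds f_safe, f_max, the radius and the colour, are all hypothetical illustration values.

# Illustrative sketch (not the authors' method): graduated visual force
# feedback rendered as a radial highlight around the instrument's contact
# point. Force magnitude is mapped to highlight opacity between an assumed
# lower "safe" threshold and an assumed upper threshold; a Gaussian falloff
# keeps the cue local to the contact so the rest of the scene stays visible.
import numpy as np

def force_to_alpha(force_n, f_safe=0.1, f_max=0.3):
    """Map a force in newtons to a highlight opacity in [0, 1] (assumed thresholds)."""
    return float(np.clip((force_n - f_safe) / (f_max - f_safe), 0.0, 1.0))

def render_force_highlight(frame, contact_xy, force_n,
                           radius_px=40.0, colour=(255, 0, 0)):
    """Blend a graduated highlight into an RGB frame (H x W x 3, uint8)."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = contact_xy
    # Gaussian falloff confines the cue to the neighbourhood of the contact point.
    falloff = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * radius_px ** 2))
    alpha = force_to_alpha(force_n) * falloff[..., None]   # per-pixel opacity
    overlay = np.asarray(colour, dtype=np.float32)
    blended = (1.0 - alpha) * frame.astype(np.float32) + alpha * overlay
    return blended.astype(np.uint8)

# Example: the highlight intensifies as the measured tool-tip force grows.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
out = render_force_highlight(frame, contact_xy=(320, 240), force_n=0.25)

Because the highlight is alpha-blended rather than painted opaquely, the underlying scene remains partially visible; the paper's approach of preserving salient anatomical features would additionally require edge-aware blending, which is omitted here for brevity.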
Copyright information
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Gras, G., Marcus, H.J., Payne, C.J., Pratt, P., Yang, G.Z. (2015). Visual Force Feedback for Hand-Held Microsurgical Instruments. In: Navab, N., Hornegger, J., Wells, W., Frangi, A. (eds.) Medical Image Computing and Computer-Assisted Intervention -- MICCAI 2015. Lecture Notes in Computer Science, vol. 9349. Springer, Cham. https://doi.org/10.1007/978-3-319-24553-9_59
DOI: https://doi.org/10.1007/978-3-319-24553-9_59
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-24552-2
Online ISBN: 978-3-319-24553-9