Abstract
Surgical training using 3D virtual reality simulators has become an important routine in medical education. Recent research points to Leap Motion as a promising interface for the control of virtual surgical instruments due to its simplicity and low cost. However, previous studies using Leap Motion evaluated only movements of the whole hand, without considering individual finger movements during the manipulation of surgical instruments. This work investigates the use of Leap Motion as an interface for capturing basic hand and finger movements during a simulated hysteroscopy using a 3D-printed hysteroscope model. We created a virtual simulated uterine environment containing a hysteroscope controlled by the hand and finger movements of a user actuating a 3D-printed model hysteroscope. The model hysteroscope was positioned on a pivot base that allowed the capture of the following basic movements: leftward/rightward, upward/downward, forward/backward, and extrusion/retraction of the virtual resection loop (which sits at the end of the virtual resectoscope). The findings indicate that the arc-shaped paths of the hysteroscope’s alpha-plane (leftward/rightward) and beta-plane (upward/downward) movements are satisfactorily simulated by the virtual reality system. Using the intraclass correlation coefficient, we found that the agreement between the calculated (ideal standard) and measured arcs was highly significant on both planes (r = 0.9599 for the alpha plane and r = 0.9208 for the beta plane). In addition, the forward/backward trajectory is a straight line, and the accuracy of the pinch gesture decreases as the hand moves farther from the Leap Motion Interaction Box. The results compare favorably with previous works that used Leap Motion to capture hands-free gestures.
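The agreement between the calculated (ideal) and measured arcs can be quantified with an intraclass correlation coefficient. The abstract does not specify which ICC form was used; the sketch below assumes a one-way random-effects ICC(1,1) over paired arc samples, with fabricated toy data purely for illustration.

```python
# Hypothetical sketch: one-way random-effects ICC(1,1) comparing an "ideal"
# (calculated) arc against a measured arc, analogous to the alpha/beta-plane
# comparison described in the abstract. All numbers below are made up.
import numpy as np

def icc_one_way(ratings: np.ndarray) -> float:
    """ICC(1,1) for an (n_targets, k_raters) array of ratings."""
    n, k = ratings.shape
    grand_mean = ratings.mean()
    target_means = ratings.mean(axis=1)
    # Between-target and within-target mean squares (one-way ANOVA)
    msb = k * ((target_means - grand_mean) ** 2).sum() / (n - 1)
    msw = ((ratings - target_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy example: ideal arc heights vs. slightly noisy measured heights
ideal = np.array([0.0, 1.2, 2.1, 2.8, 3.1, 2.9, 2.2, 1.1])
measured = ideal + np.array([0.05, -0.1, 0.08, -0.04, 0.1, -0.06, 0.02, -0.09])
print(icc_one_way(np.column_stack([ideal, measured])))
```

With small measurement noise relative to the spread of the arc, the coefficient approaches 1, mirroring the high agreement (r ≈ 0.92–0.96) reported in the study.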
Funding
The authors gratefully acknowledge the financial support of the Federal University of Para (UFPA).
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
This article does not contain any studies with human participants or animals performed by any of the authors.
Conflict of interest
All authors declare that they have no conflict of interest.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This article is part of the Topical Collection on Education & Training
About this article
Cite this article
Ferreira, S.C., Chaves, R.O., Seruffo, M.C.d.R. et al. Empirical Evaluation of a 3D Virtual Simulator of Hysteroscopy Using Leap Motion for Gestural Interfacing. J Med Syst 44, 198 (2020). https://doi.org/10.1007/s10916-020-01662-y
Received:
Accepted:
Published: