
The development of non-contact user interface of a surgical navigation system based on multi-LSTM and a phantom experiment for zygomatic implant placement

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Image-guided surgical navigation systems (SNS) have proved to be increasingly important assistance tools for minimally invasive surgery. However, standard human–computer interaction (HCI) devices such as the keyboard and mouse are a latent vector of infectious media, posing risks to both patients and surgeons. To address this problem, we proposed an optimized LSTM structure based on a depth camera to recognize gestures and applied it to an in-house oral and maxillofacial surgical navigation system (Qin et al. in Int J Comput Assist Radiol Surg 14(2):281–289, 2019).

Methods

The proposed optimized LSTM structure, named multi-LSTM, allows multiple input layers and takes the relationships between inputs into account. To integrate gesture recognition with the SNS, four left-hand waving signs along four directions were mapped to four mouse operations, and the motion of the right hand was used to control the movement of the cursor. Finally, a phantom study of zygomatic implant placement was conducted to evaluate the feasibility of multi-LSTM as an HCI.
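The core idea of a multi-input LSTM — separate input projections for the wrist and elbow trajectories feeding one recurrent cell — can be sketched as follows. This is a minimal, purely illustrative NumPy sketch; the class name, fusion-by-summation scheme, and all dimensions are assumptions for illustration, not the architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MultiInputLSTMCell:
    """Hypothetical LSTM cell with one input projection per stream."""

    def __init__(self, in_dims, hidden):
        self.hidden = hidden
        # One projection matrix per input stream (e.g. wrist, elbow)
        self.proj = [rng.standard_normal((hidden, d)) * 0.1 for d in in_dims]
        # Gate weights over [fused input features, previous hidden state]
        self.W = rng.standard_normal((4 * hidden, 2 * hidden)) * 0.1
        self.b = np.zeros(4 * hidden)

    def step(self, xs, h, c):
        # Fuse the streams by summing their projected features
        z = sum(P @ x for P, x in zip(self.proj, xs))
        g = self.W @ np.concatenate([z, h]) + self.b
        i, f, o, u = np.split(g, 4)  # input, forget, output, update gates
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(u)
        h = sigmoid(o) * np.tanh(c)
        return h, c

# Run a synthetic 20-step gesture: 3D wrist + 3D elbow points per step
cell = MultiInputLSTMCell(in_dims=[3, 3], hidden=8)
h, c = np.zeros(8), np.zeros(8)
for _ in range(20):
    wrist, elbow = rng.standard_normal(3), rng.standard_normal(3)
    h, c = cell.step([wrist, elbow], h, c)
# The final hidden state h summarises the gesture; a classifier head
# over h would then predict one of the four sign classes.
```

In a real system, a softmax layer over the final hidden state would map the trajectory to one of the four hand signs.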


Results

3D hand trajectories of both the wrist and the elbow were collected from 10 participants to train the recognition network. Tenfold cross-validation of sign recognition yielded a mean accuracy of 96% ± 3%. In the phantom study, four implants were successfully placed; the average deviations between planned and placed implants were 1.22 mm at the entry points and 1.70 mm at the end points, while the angular deviation ranged from 0.4° to 2.9°.
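The three deviation metrics reported above — Euclidean distance at the entry point, at the end point, and the angle between implant axes — can be computed as below. The coordinates are made up for illustration; only the metric definitions are taken from common practice in implant-placement evaluation, not from the paper itself.

```python
import numpy as np

# Hypothetical planned vs. placed implant positions (mm)
planned_entry = np.array([10.0, 5.0, 0.0])
planned_end   = np.array([10.0, 5.0, 30.0])
placed_entry  = np.array([11.0, 5.5, 0.2])
placed_end    = np.array([11.4, 6.0, 30.1])

# Point deviations: Euclidean distance at entry and end points
entry_dev = np.linalg.norm(placed_entry - planned_entry)
end_dev   = np.linalg.norm(placed_end - planned_end)

# Angular deviation: angle between the planned and placed implant axes
a = planned_end - planned_entry
b = placed_end - placed_entry
cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
ang_deg = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

print(f"entry {entry_dev:.2f} mm, end {end_dev:.2f} mm, angle {ang_deg:.1f} deg")
```

The `np.clip` guards against floating-point round-off pushing the cosine slightly outside [−1, 1], which would make `arccos` return NaN for nearly parallel axes.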

Conclusion

The results showed that this non-contact user interface based on multi-LSTM is a promising tool for eliminating the disinfection problem in the operating room and alleviating the manipulation complexity of surgical navigation systems.


References

  1. Qin C, Cao Z, Fan S, Wu Y, Sun Y, Politis C, Wang C, Chen X (2019) An oral and maxillofacial navigation system for implant placement with automatic identification of fiducial points. Int J Comput Assist Radiol Surg 14(2):281–289

  2. Sukegawa S, Kanno T, Furuki Y (2018) Application of computer-assisted navigation systems in oral and maxillofacial surgery. Jpn Dent Sci Rev 4(3):139–149

  3. Chen X, Xu L, Wang H, Wang F, Wang Q, Kikinis R (2017) Development of a surgical navigation system based on 3D Slicer for intraoperative implant placement surgery. Med Eng Phys 41:81–89

  4. Ebert LC, Hatch G, Thali MJ (2013) Invisible touch—control of a DICOM viewer with finger gestures using the Kinect depth camera. J Forensic Radiol Imaging 1(1):10–14

  5. Cheng H, Yang L, Liu Z (2016) Survey on 3D hand gesture recognition. IEEE Trans Circuits Syst Video Technol 26(9):1659–1673

  6. Gkalelis N, Kim H, Hilton A, Nikolaidis N, Pitas I (2009) The i3DPost multi-view and 3D human action/interaction database. In: Proc. conf. vis. media prod., pp 159–168

  7. Ren Z, Yuan J, Zhang Z (2011) Robust hand gesture recognition based on finger-earth mover's distance with a commodity depth camera. In: Proc. ACM MM, pp 1093–1096

  8. Gallo L (2014) Hand shape classification using depth data for unconstrained 3D interaction. J Ambient Intell Smart Environ 6(1):93–105

  9. Bhuyan MK, Ajay Kumar D, Macdorman KF, Iwahori Y (2014) A novel set of features for continuous hand gesture recognition. J Multimodal User Interfaces 8(4):333–343

  10. Cheng H, Luo J, Chen X (2014) A windowed dynamic time warping approach for 3D continuous hand gesture recognition. In: IEEE international conference on multimedia and expo (ICME)

  11. Liou WG, Hsieh CY, Lin WY (2011) Trajectory-based sign language recognition using discriminant analysis in higher-dimensional feature space. In: Proc. IEEE ICME, pp 1–4

  12. Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780

  13. Graves A, Jaitly N, Mohamed AR (2014) Hybrid speech recognition with deep bidirectional LSTM. In: IEEE automatic speech recognition & understanding, pp 273–278

  14. Faysal U, Coskun Y, Sener BC, Atilla S (2013) Rehabilitation of posterior maxilla with zygomatic and dental implant after tumor resection: a case report. Case Rep Dent 2013:1–5

  15. Aparicio C, Manresa C, Francisco K, Claros P, Alández J, González-Martín O, Albrektsson T (2000) Zygomatic implants: indications, techniques and outcomes, and the zygomatic success code. Periodontology 66:41–58

  16. Wang F, Monje A, Lin GH, Wu Y, Monje F, Wang HL, Davó R (2015) Reliability of four zygomatic implant-supported prostheses for the rehabilitation of the atrophic maxilla: a systematic review. Int J Oral Maxillofac Implants 30(2):293–298

  17. West JB, Fitzpatrick JM, Toms SA, Maurer CR Jr, Maciunas RJ (2001) Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery 48(4):810–816

  18. Chen X, Xu L, Yang Y, Egger J (2016) A semi-automatic computer-aided method for surgical template design. Sci Rep 6:20280

  19. Bautista MA, Hernandezvela A, Escalera S, Igual L, Pujol O, Moya J, Violant V, Anguera MT (2016) A gesture recognition system for detecting behavioral patterns of ADHD. IEEE Trans Cybern 46(1):136–147

  20. Li YT, Wachs JP (2014) HEGM: a hierarchical elastic graph matching for hand gesture recognition. Pattern Recognit 47(1):80–88


Acknowledgements

This work was supported by grants from the National Key R&D Program of China (2017YFB1302903; 2017YFB1104100), the National Natural Science Foundation of China (81828003), the PHC CAI YUANPEI Program (41366SA), the Foundation of Science and Technology Commission of Shanghai Municipality (16441908400; 18511108200), and the Shanghai Jiao Tong University Foundation on Medical and Technological Joint Science Research (YG2016ZD01).

Author information


Corresponding author

Correspondence to Xiaojun Chen.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Qin, C., Ran, X., Wu, Y. et al. The development of non-contact user interface of a surgical navigation system based on multi-LSTM and a phantom experiment for zygomatic implant placement. Int J CARS 14, 2147–2154 (2019). https://doi.org/10.1007/s11548-019-02031-y


Keywords

Navigation