Early prediction for physical human robot collaboration in the operating room

Published in Autonomous Robots

Abstract

To enable a natural and fluent human–robot collaboration flow, it is critical for a robot to comprehend its human peers' ongoing actions, predict their behaviors in the near future, and plan its own actions accordingly. In particular, the capability to make early predictions is important, so that the robot can foresee the precise timing of a turn-taking event and start motion planning and execution early enough to smooth the turn-taking transition. Such proactive behavior reduces the human's waiting time, increases efficiency, and enhances naturalness in the collaborative task. To that end, this paper presents the design and implementation of an early turn-taking prediction algorithm tailored to physical human–robot collaboration scenarios. Specifically, a robotic scrub nurse system that can comprehend the surgeon's multimodal communication cues and perform turn-taking prediction is presented. The developed algorithm was tested on a collected data set of simulated surgical procedures performed by surgeon–nurse pairs. The proposed turn-taking prediction algorithm is found to be significantly superior to its algorithmic counterparts, and is more accurate than the human baseline when only a small portion of the input is available (less than 30% of the full action). After observing more of the action, the algorithm achieves performance comparable to that of humans, with an F1 score of 0.90.
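The abstract describes predicting turn-taking from partial observations of the surgeon's multimodal cues. As a rough, self-contained illustration of that evaluation setting only (not the authors' algorithm, features, or data), the following Python sketch trains a simple classifier on synthetic multimodal sequences and reports F1 at several observation ratios; every feature name, model choice, and number here is an assumption.

```python
# Hypothetical sketch: scoring early turn-taking prediction on increasingly
# long prefixes of an observed action. Synthetic data stands in for real
# multimodal cues (e.g. gaze, speech, gesture); not the paper's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

T, D, N = 50, 8, 200                      # frames per action, feature dim, samples
X = rng.normal(size=(N, T, D))            # synthetic multimodal cue sequences
y = rng.integers(0, 2, size=N)            # 1 = turn-taking event follows, 0 = otherwise

def summarize(partial):
    """Pool a partially observed sequence into a fixed-length feature vector."""
    return np.concatenate([partial.mean(axis=0), partial.std(axis=0)], axis=-1)

split = int(0.7 * N)
for ratio in (0.1, 0.3, 0.6, 1.0):        # fraction of the action observed so far
    t = max(1, int(ratio * T))
    feats = np.stack([summarize(x[:t]) for x in X])
    clf = LogisticRegression(max_iter=1000).fit(feats[:split], y[:split])
    pred = clf.predict(feats[split:])
    print(f"observed {ratio:.0%} of action -> F1 = {f1_score(y[split:], pred):.2f}")
```

On real data, comparing the F1 obtained at low observation ratios (e.g. below 30% of the action) against a human baseline is one way to quantify the "early prediction" advantage the abstract reports.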





Acknowledgements

The authors would like to thank Dr. Rashid Mazhar and Dr. Carlos Velasquez from Hamad Medical Corporation (Qatar) for their collaboration and discussion of the project. The authors would also like to thank all the members of the ISAT lab for their inspiring discussions.

Author information


Corresponding author

Correspondence to Juan Pablo Wachs.

Additional information

This is one of several papers published in Autonomous Robots comprising the Special Issue on Learning for Human-Robot Collaboration.

Research supported by the NPRP award (NPRP 6-449-2-181) from the Qatar National Research Fund (a member of The Qatar Foundation). The statements made herein are solely the responsibility of the authors.


About this article


Cite this article

Zhou, T., Wachs, J.P. Early prediction for physical human robot collaboration in the operating room. Auton Robot 42, 977–995 (2018). https://doi.org/10.1007/s10514-017-9670-9



Keywords