
Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 1153)


Abstract

A remote collaboration system facilitates collaborative tasks among multiple users at a distance. It allows local users to see the real world with virtual guidance/annotations (provided by remote experts) rendered seamlessly over target objects, so they can easily follow the remote guidance to complete complicated tasks. An effective collaboration system typically combines (1) computer vision algorithms (e.g. 3D reconstruction, visual tracking, and localization) to analyze objects and scenes in the real world with (2) augmented reality (AR) to accurately superimpose the virtual guidance/annotations over the target objects in the real environment. However, complex working environments that contain deformable linear objects (DLOs), e.g. wire harnesses, prevent remote experts from easily annotating objects in the video or virtual scene, or from communicating clearly through non-verbal cues, because DLOs are difficult to track. In this project, we propose a remote guidance system that identifies and tracks multiple DLOs starting from minimal user input, allowing remote experts to provide local users with simple and clear instructions/guidance in a complex environment.
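The abstract does not detail the tracking pipeline, so the following is only a minimal sketch of the "minimal user input, continuous DLO tracking, AR overlay" loop it describes. It assumes OpenCV, a hypothetical track_dlo helper, and a few seed points clicked along the wire in the first frame, and it uses generic sparse optical flow rather than the paper's actual DLO tracker.

```python
# Illustrative sketch only (not the paper's method): track a user-annotated
# deformable linear object (DLO) across video frames with sparse Lucas-Kanade
# optical flow, then overlay the tracked polyline as an AR-style annotation.
import cv2
import numpy as np

def track_dlo(video_path, seed_points):
    """seed_points: list of (x, y) clicks placed along the DLO in frame 0."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError("cannot read video")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    pts = np.array(seed_points, dtype=np.float32).reshape(-1, 1, 2)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Propagate the annotated points from the previous frame to this one.
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
        good = status.ravel() == 1
        pts = next_pts[good].reshape(-1, 1, 2)
        if len(pts) < 2:  # tracking lost; a real system would re-detect here
            break
        # Draw the tracked DLO as a polyline: the "virtual annotation" a local
        # user would see rendered over the physical wire.
        cv2.polylines(frame, [pts.astype(np.int32)], isClosed=False,
                      color=(0, 255, 0), thickness=2)
        cv2.imshow("DLO tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
        prev_gray = gray
    cap.release()
    cv2.destroyAllWindows()
```

A production system would replace the per-point optical flow with a dedicated DLO model (e.g. curve fitting or segmentation-based tracking) so the annotation survives occlusion and large deformation; the sketch only conveys the interaction flow.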



Author information

Correspondence to Hao Tang.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Zou, H., Tang, H. (2020). Remote Collaboration in a Complex Environment. In: Hassanien, AE., Azar, A., Gaber, T., Oliva, D., Tolba, F. (eds) Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020). AICV 2020. Advances in Intelligent Systems and Computing, vol 1153. Springer, Cham. https://doi.org/10.1007/978-3-030-44289-7_76
