Abstract
A remote collaboration system simplifies a collaboration task carried out by multiple users at a distance. It allows local users to see the real world with virtual guidance/annotations (provided by remote experts) seamlessly rendered over target objects, so they can easily follow the remote guidance to complete complicated tasks. An effective collaboration system typically combines (1) computer vision algorithms (e.g., 3D reconstruction, visual tracking, and localization) to analyze objects and scenes in the real world with (2) augmented reality (AR) to accurately superimpose virtual guidance/annotations over the target object in the real environment. However, complex working environments containing deformable linear objects (DLOs), e.g., wire harnesses, prevent remote experts from easily annotating objects in the video or virtual scene, or from communicating clearly without words, because DLOs are difficult to track. In this project, we propose a remote guidance system that can identify and track multiple DLOs starting from minimal user input, allowing remote experts to provide local users with simple and clear instructions/guidance in a complex environment.
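To make the abstract's core idea concrete, the snippet below is a minimal, hypothetical sketch (not the paper's actual algorithm) of one common way to track a DLO: represent the wire as an ordered polyline of control points seeded by minimal user input, then update each control point toward its nearest observed wire point in every new frame. The function names `nearest` and `track_step` and the point data are illustrative assumptions.

```python
# Hypothetical sketch: a DLO as an ordered polyline of 2D control points,
# updated per frame by a crude nearest-neighbour correspondence.
import math


def nearest(p, pts):
    """Return the observed point closest to control point p."""
    return min(pts, key=lambda q: math.dist(p, q))


def track_step(controls, observed, alpha=0.5):
    """Move each control point a fraction alpha toward its nearest observation."""
    updated = []
    for (x, y) in controls:
        nx, ny = nearest((x, y), observed)
        updated.append((x + alpha * (nx - x), y + alpha * (ny - y)))
    return updated


# Minimal user input: an initial polyline roughly along a straight wire.
controls = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
# Next frame: the wire's free end has bent upward.
observed = [(0.0, 0.0), (1.0, 0.2), (2.0, 0.6)]
controls = track_step(controls, observed, alpha=1.0)
print(controls)  # the polyline now follows the bent wire
```

A real system would add a smoothness/length-preservation term so the polyline cannot collapse onto a single observation, and would detect the wire pixels (e.g., by segmentation) rather than receive them directly.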
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Zou, H., Tang, H. (2020). Remote Collaboration in a Complex Environment. In: Hassanien, AE., Azar, A., Gaber, T., Oliva, D., Tolba, F. (eds) Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020). AICV 2020. Advances in Intelligent Systems and Computing, vol 1153. Springer, Cham. https://doi.org/10.1007/978-3-030-44289-7_76
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-44288-0
Online ISBN: 978-3-030-44289-7
eBook Packages: Intelligent Technologies and Robotics (R0)