Abstract
Autonomous underwater vehicles (AUVs) are increasingly used in many applications, but mission duration is limited by the vehicles' battery capacity. Since an underwater battery-recharging station would be installed on the deep-sea bottom, deep-sea docking experiments cannot avoid turbid and dark environments. In this study, we propose a newly designed active 3D marker to improve the visibility of the 3D marker. A docking experiment apparatus was built, and two kinds of 3D marker were used in the experiments: an active (lighted) and a passive (unlighted) 3D marker. The experimental results show that the active 3D marker is more recognizable in a turbid, dark environment than the passive 3D marker.
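The intuition behind the active marker's advantage can be sketched with a simple model: light in turbid water attenuates roughly exponentially with distance (a Beer-Lambert-style model), so a passive marker, which only reflects dim ambient light, loses contrast against the background faster than an emissive marker. The snippet below is an illustrative sketch only, not the paper's evaluation method; the attenuation coefficient, distance, and intensity values are all assumed for demonstration.

```python
import numpy as np

def attenuate(intensity, turbidity, distance):
    """Beer-Lambert-style exponential attenuation of light in water."""
    return intensity * np.exp(-turbidity * distance)

def rms_contrast(marker, background):
    """RMS contrast (std/mean) of a marker patch combined with its background."""
    patch = np.concatenate([marker.ravel(), background.ravel()])
    return patch.std() / (patch.mean() + 1e-9)

# Hypothetical scene: dim ambient light (dark environment), turbid water.
ambient = 5.0     # ambient illumination, arbitrary units (assumed)
turbidity = 0.5   # attenuation coefficient in 1/m (assumed)
distance = 3.0    # camera-to-marker distance in m (assumed)

# A passive marker only reflects ambient light; an active marker emits its own.
passive = attenuate(np.full((8, 8), ambient * 0.8), turbidity, distance)
active = attenuate(np.full((8, 8), 100.0), turbidity, distance)
background = attenuate(np.full((8, 8), ambient * 0.2), turbidity, distance)

c_passive = rms_contrast(passive, background)
c_active = rms_contrast(active, background)
print(f"passive contrast: {c_passive:.3f}, active contrast: {c_active:.3f}")
```

Because both the marker and the background dim by the same attenuation factor, the passive marker's contrast is capped by the weak ambient light, while the emissive marker's much higher source intensity keeps its contrast near the maximum, which is consistent with the recognizability result reported in the abstract.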
This work was presented in part at the 24th International Symposium on Artificial Life and Robotics (Beppu, Oita, January 23–25, 2019).
Hsu, HY., Toda, Y., Watanabe, K. et al. Visibility improvement in relation to turbidity and distance, and application to docking. Artif Life Robotics 25, 453–465 (2020). https://doi.org/10.1007/s10015-020-00606-6