Visibility improvement in relation to turbidity and distance, and application to docking

  • Original Article, published in Artificial Life and Robotics

Abstract

Autonomous underwater vehicles (AUVs) are being used in a growing number of applications, but their mission duration is limited by onboard battery capacity. Since an underwater battery-recharging station is expected to be installed on the deep-sea bottom, deep-sea docking must cope with turbid and dark environments. In this study, we propose a newly designed active 3D marker to improve the marker's visibility. A docking experiment apparatus was built, and two kinds of 3D marker were compared: an active (lighted) marker and a passive (unlighted) marker. The experimental results show that the active 3D marker is more recognizable than the passive 3D marker in turbid and dark environments.
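
The study compares how recognizable the active (lighted) and passive (unlighted) 3D markers remain as turbidity and distance increase. As a rough illustration only, the sketch below shows one way such a comparison could be quantified, using a simple Weber-contrast measure of the marker region in a grayscale camera image; the function names, inputs, and the metric itself are assumptions made for this example and are not the evaluation procedure used in the paper.

```python
# Minimal sketch (not the authors' method): score marker visibility as the
# Weber contrast of the marker region against the surrounding background.
# The images, marker mask, and turbidity labels are hypothetical inputs.
import numpy as np

def weber_contrast(image: np.ndarray, marker_mask: np.ndarray) -> float:
    """Weber contrast of the marker region versus the background.

    image:       2-D grayscale image with values in [0, 255].
    marker_mask: boolean array of the same shape, True on marker pixels.
    """
    marker = float(image[marker_mask].mean())
    background = float(image[~marker_mask].mean())
    return (marker - background) / (background + 1e-6)

def compare_markers(active_imgs, passive_imgs, marker_mask, turbidity_levels):
    """Print the contrast of the active vs. passive marker at each turbidity level."""
    for level, act, pas in zip(turbidity_levels, active_imgs, passive_imgs):
        print(f"turbidity {level}: "
              f"active={weber_contrast(act, marker_mask):+.3f}, "
              f"passive={weber_contrast(pas, marker_mask):+.3f}")
```

Under this kind of measure, a lighted marker would be expected to keep a higher contrast score than an unlighted one as turbidity rises, which matches the qualitative trend the abstract reports.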

Author information

Correspondence to Horng-Yi Hsu.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work was presented in part at the 24th International Symposium on Artificial Life and Robotics (Beppu, Oita, January 23–25, 2019).

About this article

Cite this article

Hsu, HY., Toda, Y., Watanabe, K. et al. Visibility improvement in relation to turbidity and distance, and application to docking. Artif Life Robotics 25, 453–465 (2020). https://doi.org/10.1007/s10015-020-00606-6

Keywords

Navigation