Online underwater optical mapping for trajectories with gaps

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

This paper proposes a vision-only online mosaicing method for underwater surveys. Our method tackles a common problem of low-cost imaging platforms, whose complementary navigation sensors produce imprecise or even missing measurements. Under these circumstances, the success of the optical mapping depends on the continuity of the acquired video stream. However, this continuity cannot always be guaranteed due to motion blur or lack of texture, both common in underwater scenarios. Such temporal gaps hinder the extraction of reliable motion estimates from visual odometry and compromise the ability to infer the presence of loops needed to produce an adequate optical map. Unlike traditional underwater mosaicing methods, our proposal can handle camera trajectories with gaps between time-consecutive images. This is achieved by constructing a minimum spanning tree that verifies whether the current topology is connected. To do so, we embed a trajectory estimate correction step based on graph theory algorithms. The proposed method was tested on several different underwater image sequences, and results are presented to illustrate its performance.
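To make the graph-theoretic idea above concrete, the following minimal Python sketch (an illustration under our own assumptions, not the authors' implementation) models images as graph nodes and successful pairwise registrations as weighted edges, builds a minimum spanning tree with Kruskal's algorithm via union-find, and reports whether the current topology is connected; the edge weights and the small example at the end are hypothetical.

# Illustrative sketch only: images are graph nodes, pairwise registrations are
# weighted edges (weights here are hypothetical, e.g. 1 - inlier ratio).
def find(parent, i):
    # Union-find "find" with path halving.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(parent, rank, a, b):
    # Merge the components of a and b; return False if already connected.
    ra, rb = find(parent, a), find(parent, b)
    if ra == rb:
        return False
    if rank[ra] < rank[rb]:
        ra, rb = rb, ra
    parent[rb] = ra
    if rank[ra] == rank[rb]:
        rank[ra] += 1
    return True

def mst_and_connectivity(num_images, edges):
    # edges: list of (weight, i, j) tuples, one per successful registration.
    parent, rank = list(range(num_images)), [0] * num_images
    tree = []
    for w, i, j in sorted(edges):          # Kruskal: cheapest edges first
        if union(parent, rank, i, j):
            tree.append((i, j, w))
    # The topology is connected iff the tree spans all images.
    return tree, len(tree) == num_images - 1

# Hypothetical example: images 0-2 overlap, but image 3 is isolated by a gap
# in the video stream, so the mosaic topology is not yet connected.
tree, connected = mst_and_connectivity(4, [(0.2, 0, 1), (0.5, 1, 2)])
print(connected)  # False -> further (non-time-consecutive) matches are needed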


Notes

  1. The identity mapping corresponds to no rotation (0 degrees with respect to the chosen global frame), no translation, and a scale equal to 1 (see the expression after these notes).

  2. The first image frame is usually taken as the global frame in the absence of any other relevant information.

  3. http://www.ifremer.fr/biocean/acces_gb/rapports/Appel_2cruisefr.htql?numcruise=203. Accessed on August 25th, 2015.
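As an illustration of Note 1, and assuming the 4-degree-of-freedom planar similarity model commonly used in mosaicing (an assumption on our part), a pairwise mapping with scale s, rotation theta and translation (t_x, t_y) can be written in homogeneous coordinates as

H(s, \theta, t_x, t_y) =
\begin{pmatrix}
  s\cos\theta & -s\sin\theta & t_x \\
  s\sin\theta &  s\cos\theta & t_y \\
  0           &  0           & 1
\end{pmatrix},
\qquad
H(1, 0, 0, 0) = I_{3\times 3},

so the identity mapping is simply the identity matrix.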


Acknowledgments

The authors would like to thank the anonymous reviewers for their detailed and constructive comments. This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the IT Consilience Creative Program (IITP-2015-R0346-15-1008) supervised by the NIPA (National IT Industry Promotion Agency), by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (2013062644), and was part of the project titled ‘Development of an autonomous ship-hull inspection system’ funded by the Ministry of Oceans and Fisheries, Korea. It was also partially funded through MINECO (grant number CTM2013-46718-R), the European Commission’s Seventh Framework Programme as part of the project Eurofleets2 (grant number FP7-INF-2012-312762), and the Generalitat de Catalunya through the ACCIO/TecnioSpring program (TECSPR14-1-0050).

Author information


Corresponding author

Correspondence to Armagan Elibol.

Additional information

Armagan Elibol’s present address is Department of Mathematical Engineering, Yildiz Technical University, Istanbul, Turkey, aelibol@yildiz.edu.tr.


About this article


Cite this article

Elibol, A., Shim, H., Hong, S. et al. Online underwater optical mapping for trajectories with gaps. Intel Serv Robotics 9, 217–229 (2016). https://doi.org/10.1007/s11370-016-0195-4


Keywords

Navigation