
A survey on real-time motion estimation techniques for underwater robots

  • Special Issue Paper
  • Journal of Real-Time Image Processing

Abstract

Over the last few years, we have witnessed an impressive evolution in the state of the art of feature extraction, description and matching. Feature matching-based methods are among the most popular approaches to the problem of motion estimation, so the need to study the evolution of the feature matching field arises naturally. The application chosen here is the motion estimation of a Remotely Operated Vehicle (ROV). A challenging setting such as the underwater environment is an excellent test bed for evaluating the performance of several recently developed feature extractors and descriptors. The algorithms were tested within the same open-source framework to give a fair assessment of their performance, especially in terms of computational time. The various possible combinations of algorithms were compared with an approach developed by the authors that showed good performance in the past. A data set collected by the ROV Romeo during typical operations is used to test the methods. Quantitative results in terms of robustness to noise and computational time are presented and demonstrate that the recent trend towards binary features is very promising.
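As a point of reference, the sketch below illustrates, in broad strokes, the kind of binary-feature pipeline the survey evaluates: detect and describe keypoints in two consecutive frames, match the binary descriptors with the Hamming distance, and robustly fit an inter-frame transform whose translation relates to the vehicle's image motion. It is a minimal sketch written against the OpenCV API (see Note 1), not the authors' implementation; the choice of ORB, the ratio-test threshold and the RANSAC parameters are illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not the authors' pipeline) of a
# binary-feature motion-estimation step between two consecutive ROV frames,
# using the OpenCV (http://opencv.org/) API.
import cv2
import numpy as np


def estimate_interframe_motion(prev_gray, curr_gray, n_features=500):
    """Estimate a 2D similarity transform (rotation, scale, translation)
    between two consecutive greyscale frames using ORB binary features."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None  # not enough texture, e.g. a featureless seabed patch

    # Binary descriptors are compared with the Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des1, des2, k=2)

    # Lowe-style ratio test to discard ambiguous matches.
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    if len(good) < 4:
        return None

    pts_prev = np.float32([kp1[m.queryIdx].pt for m in good])
    pts_curr = np.float32([kp2[m.trainIdx].pt for m in good])

    # Robust fitting with RANSAC rejects outliers caused by marine snow,
    # moving fauna or plain mismatches.
    transform, _inliers = cv2.estimateAffinePartial2D(
        pts_prev, pts_curr, method=cv2.RANSAC, ransacReprojThreshold=3.0)
    return transform  # 2x3 matrix; its translation part gives image motion
```

Any of the detector/descriptor combinations compared in the survey could be substituted into the same skeleton, which is what evaluating them within a common open-source framework makes possible when computational time is measured on an equal footing.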


Notes

  1. http://opencv.org/.


Acknowledgments

The authors wish to thank Riccardo Bono, Giorgio Bruzzone and Edoardo Spirandelli for their highly professional and kind support in the development and at-sea operation of the Romeo ROV.

Author information


Corresponding author

Correspondence to Fausto Ferreira.

Additional information

Research supported in part by the Fundação para a Ciência e Tecnologia (FCT), Portugal, through the PhD Grant SFRH/BD/72024/2010, and by the FP7 IP MORPH Project, number 288704.


About this article

Cite this article

Ferreira, F., Veruggio, G., Caccia, M. et al. A survey on real-time motion estimation techniques for underwater robots. J Real-Time Image Proc 11, 693–711 (2016). https://doi.org/10.1007/s11554-014-0416-z


Keywords

Navigation