Visual Sensors Benchmark for Development of an Autonomous Navigation Setup for a Hybrid Unmanned Aerial Underwater Vehicle

  • Conference paper
  • First Online:
Synergetic Cooperation Between Robots and Humans (CLAWAR 2023)

Abstract

Special Session: Hybrid and Convertible Unmanned Aerial Vehicles

Environments with structures in multiple domains pose a great challenge for inspection and monitoring tasks, often requiring several robots to operate in these environments. Hybrid vehicles can be an alternative to the use of multiple devices. In particular, industrial marine ecosystems present a drastic change of medium that challenges these hybrid vehicles in both sensing and locomotion. The use of infrared and acoustic sensors is discouraged for this vehicle type because each operates in only one medium: infrared sensors work only in air and acoustic sensors only underwater, so carrying both imposes weight limitations on the vehicle's structural development. Furthermore, integrating both sensors adds complexity to the implementation of autonomy. This work presents a benchmark to evaluate the performance of visual sensors in both air and water, showing their advantages and limitations.
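A benchmark of this kind typically compares a sensor's measured ranges against ground truth separately in each medium. The sketch below shows one minimal way such per-medium error statistics (MAE and RMSE) could be computed; the readings are hypothetical placeholder values, not data from the paper, and the structure is an assumption about the evaluation, not the authors' actual protocol.

```python
import math

# Hypothetical (ground-truth distance [m], measured distance [m]) pairs per medium.
# These values are illustrative only, not results from the paper.
readings = {
    "air":   [(1.0, 1.02), (2.0, 2.05), (3.0, 3.11)],
    "water": [(1.0, 1.10), (2.0, 2.25), (3.0, 3.45)],
}

def error_stats(samples):
    """Return (mean absolute error, RMSE) for (truth, measured) pairs."""
    abs_errs = [abs(measured - truth) for truth, measured in samples]
    mae = sum(abs_errs) / len(abs_errs)
    rmse = math.sqrt(sum(e * e for e in abs_errs) / len(abs_errs))
    return mae, rmse

for medium, samples in readings.items():
    mae, rmse = error_stats(samples)
    print(f"{medium}: MAE = {mae:.3f} m, RMSE = {rmse:.3f} m")
```

Reporting error as a function of medium (and, in practice, of distance and lighting) is what exposes the degradation a visual sensor suffers underwater, e.g. from refraction and turbidity.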



Acknowledgements

This study was financed by the Human Resource Program of the Brazilian National Agency for Petroleum, Natural Gas, and Biofuels (PRH-ANP), supported by resources from oil companies under contract clause no. 50/2015 of R, D&I of the ANP, CAPES, and CNPq. Moreover, we would like to thank the Technological University of Uruguay (UTEC).

Author information

Corresponding author

Correspondence to Matheus G. Mateus.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Mateus, M.G. et al. (2024). Visual Sensors Benchmark for Development of an Autonomous Navigation Setup for a Hybrid Unmanned Aerial Underwater Vehicle. In: Youssef, E.S.E., Tokhi, M.O., Silva, M.F., Rincon, L.M. (eds) Synergetic Cooperation Between Robots and Humans. CLAWAR 2023. Lecture Notes in Networks and Systems, vol 810. Springer, Cham. https://doi.org/10.1007/978-3-031-47269-5_20
