
Data Fusion of RADAR and LIDAR for Robot Localization Under Low-Visibility Conditions in Structured Environments

  • Conference paper
  • ROBOT2022: Fifth Iberian Robotics Conference (ROBOT 2022)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 590)


Abstract

Optical range sensors such as LiDAR and range cameras have become the most common devices for robot localization and navigation tasks. However, their performance can be degraded by environmental hazards such as fog, smoke, or rain. This paper proposes a new method to combine information from LiDAR sensors and low-cost RADAR sensors in structured 2D environments, in order to ensure the availability of useful range information under the low-visibility conditions caused by smoke. Our method uses a novel DBScan-Line segmentation to cluster the measurements from the LiDAR sensor and then establishes correspondences between these clusters and the measurements from the RADAR sensors. The method has been extensively tested in field experiments with artificial smoke, and the results have been benchmarked against the raw sensors and a state-of-the-art fusion method. Moreover, the fused measurements have been integrated into a localization method, which was able to robustly localize a ground platform in dense fog.
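The abstract describes the fusion pipeline only at a high level. The sketch below is one plausible reading of it, not the authors' implementation: 2D LiDAR returns are clustered with DBSCAN, a dominant line direction is fitted to each cluster (the "line" part of the DBScan-Line segmentation), and each RADAR detection is associated with the nearest cluster within a gating distance. The function names, the parameter values (eps, min_samples, gate), and the use of scikit-learn are illustrative assumptions.

```python
# Hedged sketch of a DBSCAN + line-fitting segmentation of a 2D LiDAR scan and a
# nearest-cluster association of RADAR detections. Not the paper's code; parameter
# values and helper names are assumptions chosen for illustration only.
import numpy as np
from sklearn.cluster import DBSCAN


def cluster_lidar(points_xy, eps=0.15, min_samples=5):
    """Group 2D LiDAR points (N x 2, metres) into clusters; DBSCAN label -1 is noise."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xy)
    return {lab: points_xy[labels == lab] for lab in set(labels) if lab != -1}


def fit_line(cluster_xy):
    """Total-least-squares line fit via SVD: returns (centroid, unit direction)."""
    centroid = cluster_xy.mean(axis=0)
    _, _, vt = np.linalg.svd(cluster_xy - centroid)
    return centroid, vt[0]          # vt[0] is the principal direction of the cluster


def associate_radar(radar_xy, clusters, gate=0.5):
    """Match each RADAR detection to the closest cluster centroid within `gate` metres."""
    matches = []
    for det in radar_xy:
        best, best_d = None, gate
        for lab, pts in clusters.items():
            d = np.linalg.norm(det - pts.mean(axis=0))
            if d < best_d:
                best, best_d = lab, d
        matches.append(best)        # None: detection kept as RADAR-only evidence
    return matches
```

In a scheme like this, RADAR detections that gate onto a LiDAR cluster would corroborate the optical measurement, while unmatched detections could be retained as RADAR-only evidence when smoke suppresses the LiDAR returns; the combined point set could then feed a standard 2D map-based localization back end.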

Notes

  1. https://robotics.upo.es/datasets/smoke.
  2. https://github.com/robotics-upo/RADAR_experiments.
  3. http://github.com/robotics-upo/fiducials.

Acknowledgment

This work is partially supported by the Spanish Ministry of Science, Innovation and Universities (COMCISE RTI2018-100847-B-C22, MCIU/AEI/FEDER, UE) and by Programa Operativo FEDER Andalucia 2014–2020 through the project DeepBot (PY20_00817). The authors would like to thank the company IDMind (Lisbon, Portugal) for their help and support during the development of the experiments.

Author information

Corresponding author: Luis Merino.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Alejo, D., Rey, R., Cobano, J.A., Caballero, F., Merino, L. (2023). Data Fusion of RADAR and LIDAR for Robot Localization Under Low-Visibility Conditions in Structured Environments. In: Tardioli, D., Matellán, V., Heredia, G., Silva, M.F., Marques, L. (eds) ROBOT2022: Fifth Iberian Robotics Conference. ROBOT 2022. Lecture Notes in Networks and Systems, vol 590. Springer, Cham. https://doi.org/10.1007/978-3-031-21062-4_25

  • DOI: https://doi.org/10.1007/978-3-031-21062-4_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-21061-7

  • Online ISBN: 978-3-031-21062-4

  • eBook Packages: Engineering, Engineering (R0)
