
Traffic Object Detection for Autonomous Driving Fusing LiDAR and Pseudo 4D-Radar Under Bird’s-Eye-View


Abstract:

To ensure safe and efficient intelligent transportation systems (ITS), autonomous driving systems must possess excellent object detection and environmental perception capabilities. Fusing data from multiple sensors can overcome the inherent limitations of single-sensor perception in 3D object detection for autonomous driving. LiDAR localizes objects precisely but cannot capture their velocity; Radar, conversely, measures velocity accurately but provides no height information. Fusing Radar and LiDAR can therefore extend the detection range and improve detection performance for dynamic objects. Nevertheless, directly integrating the two sensors is hindered by their differing data characteristics and noise distributions. To address this, we propose a novel fusion framework, LiDAR and Pseudo 4D-Radar fusion under Bird's-Eye-View, dubbed L4R-BEVFusion, to overcome the challenge of multi-modal fusion. Specifically, since Radar lacks object height information, we first design a pseudo 4D-Radar generation process, consisting of a sparse-to-dense (S2P) module and a height completion (RHC) module, which transforms the original 3D-Radar feature map into a denser pseudo 4D-Radar feature map enriched with height information. Second, the fusion framework encodes the LiDAR features and the pseudo 4D-Radar features into the same Bird's-Eye-View (BEV) through a cross-guided BEV encoder (CGBE) module. Extensive experiments show that L4R-BEVFusion achieves state-of-the-art performance (71.3% NDS and 66.7% mAP) for detecting dynamic objects (using only LiDAR and Radar) on the nuScenes dataset.
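The abstract does not give implementation details, but the three ideas it names can be illustrated at a high level. The following NumPy toy sketch is NOT the authors' method: the max-dilation stands in for the learned S2P densification, the borrowed LiDAR height map stands in for the learned RHC module, and plain channel concatenation stands in for the learned CGBE fusion. All function names, grid sizes, and feature choices are assumptions made for illustration only.

```python
import numpy as np

def points_to_bev(points, grid=64, extent=32.0, channels=2):
    """Rasterize sparse (x, y, feature...) points into a BEV grid.
    points: (N, 2 + channels) array of x, y plus per-point features."""
    bev = np.zeros((channels, grid, grid), dtype=np.float32)
    ij = ((points[:, :2] + extent) / (2 * extent) * grid).astype(int)
    ij = np.clip(ij, 0, grid - 1)
    for (i, j), feat in zip(ij, points[:, 2:]):
        bev[:, i, j] = feat  # last point wins; a real encoder would pool
    return bev

def densify(bev, k=3):
    """Toy stand-in for the S2P module: max-dilate each channel so sparse
    radar returns spread to neighbouring BEV cells."""
    c, h, w = bev.shape
    pad = k // 2
    padded = np.pad(bev, ((0, 0), (pad, pad), (pad, pad)))
    out = np.zeros_like(bev)
    for i in range(h):
        for j in range(w):
            out[:, i, j] = padded[:, i:i + k, j:j + k].max(axis=(1, 2))
    return out

def add_height_channel(radar_bev, lidar_heights):
    """Toy stand-in for the RHC module: append a height channel (here borrowed
    from a LiDAR-derived height map) to produce a 'pseudo 4D' radar map."""
    return np.concatenate([radar_bev, lidar_heights[None]], axis=0)

rng = np.random.default_rng(0)
# Sparse radar points: x, y, plus two nonnegative features (speed magnitude, RCS).
radar_pts = np.concatenate(
    [rng.uniform(-32, 32, (50, 2)), rng.uniform(0.1, 5, (50, 2))], axis=1)
radar_bev = points_to_bev(radar_pts)          # (2, 64, 64), mostly empty
dense_bev = densify(radar_bev)                # denser after dilation
lidar_heights = rng.uniform(0, 3, (64, 64))   # per-cell max height from LiDAR
pseudo_4d = add_height_channel(dense_bev, lidar_heights)  # (3, 64, 64)

lidar_bev = rng.normal(size=(16, 64, 64)).astype(np.float32)
fused = np.concatenate([lidar_bev, pseudo_4d], axis=0)  # shared-BEV fusion
print(fused.shape)  # (19, 64, 64)
```

In the paper, each of these steps is a learned network component rather than a fixed operation; the sketch only shows where each module sits in the data flow from sparse radar returns to a fused BEV feature map.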
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 25, Issue: 11, November 2024)
Page(s): 18185 - 18195
Date of Publication: 25 June 2024

