
Pseudo-Anchors: Robust Semantic Features for Lidar Mapping in Highly Dynamic Scenarios



Abstract:

Dynamic environments are challenging for anchor-free lidar mapping in intelligent driving. This study imitates anchor-based approaches, such as magnetic nails, by applying novel Static Confidence Criteria (SCC) to point-cloud semantic candidates to ensure their robustness. We name these verified features Pseudo-Anchors (P-A) because they share key properties with anchor nodes: P-A nodes are unlikely to be formed by dynamic objects, and their blockage state is noticed immediately once they are occluded. Another major challenge for mapping is improving large-scale global performance without sacrificing local consistency. Unrecognized GNSS pose drift may degrade local trajectory accuracy through post-processing such as graph optimization. In this study, we use the road network to provide intersection information as a prior, so that GNSS can be treated as a reliable anchor factor. Three experiments are designed for this study: the first uses ablations to verify the P-A concept; the second shows that P-A-based lidar odometry outperforms mainstream LOAM-based methods in highly dynamic scenarios; the third shows that our use of GNSS strengthens the global consistency of large-scale maps while degrading their local consistency less. As a knowledge-based method, the P-A concept shows high deployment efficiency, indicating its potential for migration to other features or even other sensors.
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 24, Issue: 2, February 2023)
Page(s): 1619 - 1630
Date of Publication: 24 November 2022
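
Illustrative sketch: the abstract's central idea is that a semantic candidate is promoted to a Pseudo-Anchor only after it passes Static Confidence Criteria and an occlusion check. The Python sketch below is not the authors' implementation; the Candidate structure, the static_confidence score, the static class list, and the 0.8 threshold are hypothetical stand-ins chosen only to make the selection step concrete.

# A minimal sketch (assumptions, not the paper's method) of pseudo-anchor selection:
# semantic candidates from a lidar scan are kept only if they belong to a static
# class, are not occluded, and have barely moved across recent observations.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Candidate:
    label: str                                            # semantic class, e.g. "pole", "sign"
    history: List[Tuple[float, float, float]] = field(default_factory=list)  # (x, y, z) over recent scans
    occluded: bool = False                                 # flagged when a closer return blocks the node

def static_confidence(c: Candidate) -> float:
    """Toy confidence score: inversely related to the positional spread of recent observations."""
    if len(c.history) < 2:
        return 0.0
    xs, ys, zs = zip(*c.history)
    spread = max(max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))
    return 1.0 / (1.0 + spread)                            # close to 1.0 when the node barely moves

def select_pseudo_anchors(candidates, min_conf=0.8, static_classes=("pole", "sign", "trunk")):
    """Keep only candidates that are of a static class, unoccluded, and stable over time."""
    return [
        c for c in candidates
        if c.label in static_classes
        and not c.occluded
        and static_confidence(c) >= min_conf
    ]

if __name__ == "__main__":
    cands = [
        Candidate("pole", [(1.00, 2.00, 0.0), (1.01, 2.00, 0.0), (0.99, 2.01, 0.0)]),
        Candidate("car",  [(5.0, 1.0, 0.0), (6.2, 1.1, 0.0)]),           # moving object, rejected by class
        Candidate("sign", [(3.0, 4.0, 1.5), (3.0, 4.0, 1.5)], occluded=True),  # rejected by occlusion
    ]
    print([a.label for a in select_pseudo_anchors(cands)])  # expected: ['pole']

In the same spirit, the paper's GNSS handling could be read as adding GNSS anchor factors to the pose graph only where the road-network prior (e.g., near intersections) suggests the fix is trustworthy; the details of that criterion are in the paper itself.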
