RGBD-SLAM Based on Object Detection With Two-Stream YOLOv4-MobileNetv3 in Autonomous Driving



Abstract:

Autonomous driving has become a research hotspot in recent years. Visual Simultaneous Localization and Mapping (SLAM) can help unmanned vehicles explore their environment accurately at low cost, and the readability of the resulting map can be improved by integrating object detection algorithms. However, existing approaches do not recover the location and 3D shape of objects in the map. This paper proposes an RGBD-SLAM method based on object detection with a two-stream YOLOv4-MobileNetv3 convolutional neural network. The RGBD SLAM algorithm and the object detection algorithm are combined into a model that quickly generates a global sparse map together with dense maps of detected targets. The two-stream network provides 2D information about each target, which is further combined with the camera pose obtained after keyframe detection in the SLAM front end to recover the dense 3D point cloud of the target and the position of its center point. The system thus outputs both the sparse point cloud of the SLAM system and the dense point cloud of each target. Experimental results show that the number of points decreases by about 50% and the mapping time is about 60% of that required for global dense mapping. The proposed method efficiently reduces the computational space required and improves the speed of semantic mapping, verifying its feasibility and advantages. It can be used for large-area mapping and for updating maps during autonomous driving.
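The core step described in the abstract is back-projecting the pixels inside a 2D detection box into a dense, per-object 3D point cloud using the depth image and the keyframe camera pose, then taking the cloud's centroid as the object's center point. Below is a minimal sketch of that step, assuming a standard pinhole camera model; the function name, arguments (`depth`, `box`, `K`, `T_wc`, `stride`), and the subsampling strategy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def backproject_detection(depth, box, K, T_wc, stride=4):
    """Back-project depth pixels inside a 2D detection box into a world-frame
    point cloud (hypothetical sketch of the per-object dense mapping step).

    depth  : HxW depth image in meters
    box    : (u_min, v_min, u_max, v_max) detection box in pixel coordinates
    K      : 3x3 camera intrinsic matrix
    T_wc   : 4x4 camera-to-world pose from the SLAM front end (keyframe pose)
    stride : pixel subsampling step; restricting points to detected objects is
             what keeps this cloud much smaller than a global dense map
    """
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u_min, v_min, u_max, v_max = box
    points = []
    for v in range(v_min, v_max, stride):
        for u in range(u_min, u_max, stride):
            z = depth[v, u]
            if z <= 0:  # skip invalid or missing depth readings
                continue
            # Pinhole back-projection into the camera frame (homogeneous point)
            p_cam = np.array([(u - cx) * z / fx, (v - cy) * z / fy, z, 1.0])
            points.append((T_wc @ p_cam)[:3])  # transform to the world frame
    cloud = np.asarray(points)
    # The object's center point is taken here as the centroid of its cloud
    center = cloud.mean(axis=0) if len(cloud) else None
    return cloud, center
```

Running this once per detection in each keyframe yields the per-object dense clouds that sit alongside the SLAM system's global sparse map, which is consistent with the roughly 50% reduction in point count reported in the abstract relative to building a fully dense global map.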
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 25, Issue: 3, March 2024)
Page(s): 2847 - 2857
Date of Publication: 26 June 2023
