Abstract:
For most existing simultaneous localization and mapping (SLAM) systems, the surroundings of the autonomous vehicle are assumed to be static, which is impractical in urban environments and compromises the effectiveness of existing SLAM algorithms. In this work, a dynamic-object-aware light detection and ranging (LiDAR) visual inertial odometry (LVIO), termed Dynam-LVIO, is introduced to enhance the accuracy of SLAM in dynamic environments by constructing both an environment map and an object map. Initially, reprojection error and iterative closest point (ICP) error are calculated with the object map, serving as observations for the error state iterated Kalman filter (ESIKF) to accurately estimate the object state on the manifold. Subsequently, the 2-D bounding boxes detected by YOLO-V5 are tracked with the proposed multiobject tracking (MOT) algorithm, LVI-SORT, to achieve stable MOT in complex scenes. Specifically, to improve the accuracy of MOT in fast-moving scenes, the 2-D object flow, calculated from the object state, vehicle state, and object map, is used to predict the object state in the prediction process of LVI-SORT. Furthermore, to mitigate MOT failures caused by temporary object occlusion, a hybrid object association is proposed within the association process of LVI-SORT, which incorporates object map points and the intersection-over-union (IoU) of bounding boxes to form the cost matrix in the Hungarian algorithm. Finally, the tracked 2-D bounding boxes are leveraged to segment the recent global map into an environment map and an object map, thereby reducing the impact of dynamic objects on LiDAR visual inertial SLAM. The results indicate that, compared with benchmark algorithms, the proposed algorithm improves localization accuracy by 5%–10% in dynamic scenarios. It also improves object tracking accuracy by nearly 3%.
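The IoU-based half of the association step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it omits the object-map-point term of the hybrid cost matrix, uses a brute-force optimal assignment as a stand-in for the Hungarian algorithm (equivalent for a handful of objects), and for brevity assumes there are no more tracks than detections. The function names and the 0.3 gating threshold are illustrative choices, not taken from the paper.

```python
from itertools import permutations

def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def associate(tracks, dets, iou_thresh=0.3):
    """Match predicted track boxes to detected boxes by minimizing the
    total (1 - IoU) cost; pairs with IoU below iou_thresh are rejected.
    Assumes len(tracks) <= len(dets) to keep the sketch short."""
    if not tracks or not dets:
        return []
    # Enumerate all injective track->detection mappings and keep the one
    # with the lowest total cost (what the Hungarian algorithm computes
    # in polynomial time for larger problems).
    best = min(
        permutations(range(len(dets)), len(tracks)),
        key=lambda p: sum(1.0 - iou(tracks[i], dets[j])
                          for i, j in enumerate(p)),
    )
    return [(i, j) for i, j in enumerate(best)
            if iou(tracks[i], dets[j]) >= iou_thresh]
```

In a full tracker the prediction step (here, the 2-D object flow) supplies the track boxes, and unmatched detections spawn new tracks; the hybrid cost in the paper additionally weights agreement between detections and projected object map points, which helps re-associate objects after brief occlusion.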
Published in: IEEE Transactions on Instrumentation and Measurement (Volume: 73)