Abstract:
With rapid advances in artificial intelligence and sensor technology, autonomous vehicles are moving ever closer to reality. To ensure the safety and reliability of autonomous navigation, robust perception and control systems are crucial. This paper presents a novel integrated module for collision-free autonomous navigation that leverages deep learning and edge computing. By combining Nvidia's DriveNet for object detection and OpenRoadNet for drivable free-space detection through the DriveWorks SDK, the module delivers precise perception and dynamically controls the vehicle's speed based on the calculated probability of collision, resulting in collision-free navigation. A customized Pure-Pursuit algorithm additionally provides smooth and accurate steering angle prediction. The system is implemented on Nvidia Drive Pegasus for perception and control and on a dSPACE MicroAutoBox-III for actuation, achieving real-time processing at 30 frames per second (FPS). This work advances Advanced Driver Assistance Systems (ADAS), offering robust navigation suitable for deployment in controlled environments.
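The abstract does not detail the customized Pure-Pursuit variant or the exact mapping from collision probability to speed. The sketch below is therefore only illustrative: it implements the standard pure-pursuit steering law for a bicycle model and an assumed linear speed-scaling rule; all function names, the 0.8 stop threshold, and the numeric values are assumptions for illustration, not values from the paper.

```python
import math


def pure_pursuit_steering(target_x: float, target_y: float,
                          wheelbase: float, lookahead: float) -> float:
    """Classic pure-pursuit steering law.

    (target_x, target_y) is the lookahead point in the vehicle frame
    (x forward, y left); wheelbase and lookahead are in metres.
    Returns the front-wheel steering angle in radians.
    """
    # Angle between the vehicle heading and the lookahead point.
    alpha = math.atan2(target_y, target_x)
    # Bicycle-model pure-pursuit curvature converted to a steering angle.
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)


def collision_aware_speed(v_max: float, p_collision: float) -> float:
    """Reduce the commanded speed as the estimated collision probability
    rises; stop entirely above a hard threshold. Both the linear scaling
    and the 0.8 threshold are illustrative assumptions."""
    if p_collision >= 0.8:
        return 0.0
    return v_max * (1.0 - p_collision)


if __name__ == "__main__":
    # Example: lookahead point 8 m ahead, 1 m to the left; 2.8 m wheelbase.
    delta = pure_pursuit_steering(8.0, 1.0, wheelbase=2.8, lookahead=8.0)
    speed = collision_aware_speed(v_max=10.0, p_collision=0.25)
    print(f"steering angle: {math.degrees(delta):.1f} deg, speed: {speed:.1f} m/s")
```

In a deployment such as the one described, the collision probability would come from the fused DriveNet/OpenRoadNet perception output, and the steering and speed commands would be forwarded to the actuation controller (here, the dSPACE MicroAutoBox-III).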
Date of Conference: 12-14 January 2024
Date Added to IEEE Xplore: 04 September 2024