
Driver Digital Twin for Online Recognition of Distracted Driving Behaviors



Abstract:

Deep learning has been widely utilized in intelligent vehicle systems, particularly in the field of driver distraction detection. However, existing methods in this application tend to focus solely on appearance or cognitive state as indicators of distraction, while neglecting the significance of temporal modeling in accurately identifying driver actions. This oversight can result in limitations such as difficulty in comprehending context, an inability to recognize gradual changes, and failure to capture complex behaviors. To address these limitations, this paper introduces a new framework based on the concept of Driver Digital Twin (DDT). The DDT framework serves as a digital replica of the driver, capturing the driver's naturalistic driving data and behavioral models. It consists of a transformer-based driver action recognition module and a novel temporal localization module to detect distracted behaviors. Additionally, we propose a pseudo-labeled multi-task learning algorithm that incorporates driver emotion recognition as supplementary information for recognizing distractions. We have validated the effectiveness of our approach using three publicly available driver distraction detection benchmarks: SFDDD, AUCDD, and SynDD2. The results demonstrate that our framework achieves state-of-the-art performance in both driver action recognition and temporal localization tasks. It outperforms the leading methods by 6.5 and 0.9 percentage points on SFDDD and AUCDD, respectively. Furthermore, it ranks in the top 5% on the SynDD2 leaderboard.
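
For intuition, the following is a minimal, hypothetical PyTorch sketch of how pseudo-labeled multi-task learning of this kind could be wired up: a shared transformer encoder performs temporal modeling over per-frame features, a primary head predicts the distracted action, and an auxiliary head is trained on pseudo emotion labels produced by an off-the-shelf emotion recognizer. All module names, dimensions, pooling choices, and the loss weighting are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical sketch of pseudo-labeled multi-task learning for distraction
# recognition with an auxiliary emotion head. Names and sizes are assumptions.
import torch
import torch.nn as nn

class MultiTaskDriverModel(nn.Module):
    def __init__(self, feat_dim=512, num_actions=10, num_emotions=7,
                 num_layers=4, num_heads=8):
        super().__init__()
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.action_head = nn.Linear(feat_dim, num_actions)    # primary task
        self.emotion_head = nn.Linear(feat_dim, num_emotions)  # auxiliary task

    def forward(self, frame_feats):
        # frame_feats: (batch, time, feat_dim) per-frame visual features
        tokens = self.encoder(frame_feats)     # temporal modeling across the clip
        clip_feat = tokens.mean(dim=1)         # simple temporal pooling
        return self.action_head(clip_feat), self.emotion_head(clip_feat)

def multitask_loss(action_logits, emotion_logits,
                   action_labels, emotion_pseudo_labels, aux_weight=0.3):
    """Supervised action loss plus a down-weighted auxiliary loss on
    pseudo emotion labels (the weight 0.3 is an arbitrary example)."""
    ce = nn.CrossEntropyLoss()
    return (ce(action_logits, action_labels)
            + aux_weight * ce(emotion_logits, emotion_pseudo_labels))

# Example usage with random tensors standing in for 8 clips of 16 frames.
model = MultiTaskDriverModel()
feats = torch.randn(8, 16, 512)
action_logits, emotion_logits = model(feats)
loss = multitask_loss(action_logits, emotion_logits,
                      torch.randint(0, 10, (8,)), torch.randint(0, 7, (8,)))
loss.backward()
```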
Published in: IEEE Transactions on Intelligent Vehicles (Volume: 9, Issue: 2, February 2024)
Page(s): 3168-3180
Date of Publication: 12 January 2024


