
Hybrid Convolutional-Transformer Neural Network for Driver Distraction Detection



Abstract:

Distracted driving causes a significant number of traffic accidents each year, making driver-state monitoring a crucial area of research for accident prevention. Although convolutional networks are widely used to detect driver distraction, they suffer from several issues, such as poor generalization, large model size, and high response latency. To tackle these challenges, this paper introduces a lightweight, low-latency algorithm named MViTCNet, which uses the MobileViT module to identify driver distraction under constrained computational resources. MViTCNet extracts local features to build an initial feature map, which is fused with a global representation processed by a Transformer, capturing finer details while enlarging the receptive field. Additionally, the classic convolutional module MBConv is employed to reduce spatial redundancy, and downsampling yields low-resolution feature maps. Stacking multiple such modules increases the network's depth and width, allowing it to learn more complex feature representations and better fit complex data distributions. Experiments on an embedded hardware platform show that MViTCNet achieves an acceptable accuracy-latency trade-off, reaching 91.04% accuracy on the StateFarm dataset and 97.80% on the LDDB dataset at a latency of 28.9±5.3 milliseconds, outperforming existing methods.
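The paper itself ships no code, but as a rough illustration of the two building blocks the abstract names, here is a minimal PyTorch sketch: a MobileViT-style block that fuses local convolutional features with a Transformer-processed global representation, and an MBConv inverted-residual block with optional stride-2 downsampling. All class names, dimensions, and hyperparameters below are illustrative assumptions, not the authors' MViTCNet implementation.

```python
# Minimal sketch of the abstract's two components (assumed hyperparameters).
import torch
import torch.nn as nn

class MBConv(nn.Module):
    """Inverted residual block: 1x1 expand -> 3x3 depthwise -> 1x1 project.
    With stride=2 the depthwise conv downsamples the feature map."""
    def __init__(self, in_ch, out_ch, stride=1, expand=4):
        super().__init__()
        mid = in_ch * expand
        self.use_residual = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, mid, 1, bias=False),
            nn.BatchNorm2d(mid), nn.SiLU(),
            nn.Conv2d(mid, mid, 3, stride=stride, padding=1,
                      groups=mid, bias=False),  # depthwise
            nn.BatchNorm2d(mid), nn.SiLU(),
            nn.Conv2d(mid, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        y = self.block(x)
        return x + y if self.use_residual else y

class MobileViTBlock(nn.Module):
    """Extracts local features, runs global self-attention over patch
    positions, then fuses the global map with the original feature map."""
    def __init__(self, channels, dim, patch_size=2, depth=2, heads=4):
        super().__init__()
        self.patch_size = patch_size
        self.local_conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.proj_in = nn.Conv2d(channels, dim, 1)
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=dim * 2,
            batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, depth)
        self.proj_out = nn.Conv2d(dim, channels, 1)
        self.fusion = nn.Conv2d(2 * channels, channels, 3, padding=1)

    def forward(self, x):
        residual, p = x, self.patch_size
        y = self.proj_in(self.local_conv(x))            # (B, dim, H, W)
        b, d, h, w = y.shape
        assert h % p == 0 and w % p == 0, "H and W must be divisible by patch_size"
        # Unfold into patches: each intra-patch position attends across all patches.
        y = y.reshape(b, d, h // p, p, w // p, p)
        y = y.permute(0, 3, 5, 2, 4, 1).reshape(b * p * p, (h // p) * (w // p), d)
        y = self.transformer(y)                          # global self-attention
        y = y.reshape(b, p, p, h // p, w // p, d)
        y = y.permute(0, 5, 3, 1, 4, 2).reshape(b, d, h, w)
        y = self.proj_out(y)
        # Fuse the Transformer's global map with the original local map.
        return self.fusion(torch.cat([residual, y], dim=1))

x = torch.randn(1, 64, 32, 32)
block = MobileViTBlock(channels=64, dim=96)
print(block(x).shape)                 # torch.Size([1, 64, 32, 32])
down = MBConv(64, 128, stride=2)      # downsampling stage between blocks
print(down(block(x)).shape)           # torch.Size([1, 128, 16, 16])
```

Stacking alternating MBConv and MobileViT-style stages in this fashion is one plausible reading of the "increasing depth and width" design the abstract describes; the actual stage counts and channel widths of MViTCNet are not given here.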
Date of Conference: 22-24 September 2023
Date Added to IEEE Xplore: 03 November 2023
Conference Location: Yibin, China
