
A novel early warning strategy for right-turning blind zone based on vulnerable road users detection

  • Original Article, published in Neural Computing and Applications

Abstract

Blind-zone detection, as an essential function of Advanced Driver Assistance Systems (ADAS), can effectively reduce traffic accidents and has attracted unprecedented attention. This paper develops an active collision-avoidance method for the right-turning blind zone based on vulnerable road user (VRU) detection. The proposed strategy consists of three main steps. First, an improved deep-learning detector based on YOLOv4-tiny, combining two optimization strategies, is proposed to detect VRUs in the right-turning blind zone more accurately and robustly. Second, a monocular-camera distance measurement method is used to estimate the distance between the host vehicle and the detected VRUs. Finally, a simple and effective active speed control algorithm, based on the measured distance and the vehicle speed, provides early warning to the driver. The method was tested on a large driving dataset and in various real driving situations. Experimental results show that, compared with lightweight state-of-the-art methods, the improved YOLOv4-tiny achieves the best detection accuracy for VRUs while sustaining a detection speed of 50 FPS on 1920×1080 video, and that the measured distance error remains within 4%. A simulation test further shows that the proposed active speed control algorithm can effectively deliver early warnings to drivers and avoid traffic accidents.
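To make the pipeline concrete, below is a minimal Python sketch (not the authors' implementation) of the second and third steps: pinhole-model monocular ranging and a distance/speed-based warning decision. The focal length, assumed VRU height, reaction time, deceleration, and thresholds are illustrative assumptions, not parameters taken from the paper.

```python
"""Minimal sketch of a blind-zone warning pipeline, assuming a
pinhole-camera ranging model and a stopping-distance warning rule.
All numeric constants below are illustrative, not from the paper."""

from dataclasses import dataclass


@dataclass
class Detection:
    """A VRU bounding box produced by the detector (e.g., YOLOv4-tiny)."""
    x: float          # box centre, pixels
    y: float          # box centre, pixels
    h_pixels: float   # box height, pixels


# --- Step 2: monocular distance estimation --------------------------------
# Pinhole model: an object of real height H (metres) that appears h pixels
# tall lies roughly D = f * H / h metres away, with f the focal length in
# pixels. Both constants here are assumptions for illustration.
FOCAL_LENGTH_PX = 1400.0     # assumed camera focal length, pixels
ASSUMED_VRU_HEIGHT_M = 1.7   # assumed average pedestrian/cyclist height


def estimate_distance(det: Detection) -> float:
    """Estimate the range to a detected VRU from its pixel height."""
    return FOCAL_LENGTH_PX * ASSUMED_VRU_HEIGHT_M / det.h_pixels


# --- Step 3: distance/speed-based early warning ---------------------------
# Compare the distance needed to stop (reaction travel + braking travel)
# with the measured gap, and escalate the warning as the gap shrinks.
REACTION_TIME_S = 1.0   # assumed driver reaction time
MAX_DECEL_MPS2 = 6.0    # assumed full-braking deceleration
SAFETY_MARGIN_M = 2.0   # assumed extra clearance


def warning_level(distance_m: float, speed_mps: float) -> str:
    """Map the current gap and host-vehicle speed to a warning level."""
    stopping = (speed_mps * REACTION_TIME_S
                + speed_mps ** 2 / (2.0 * MAX_DECEL_MPS2))
    if distance_m <= stopping + SAFETY_MARGIN_M:
        return "BRAKE"   # collision likely without immediate braking
    if distance_m <= 2.0 * (stopping + SAFETY_MARGIN_M):
        return "WARN"    # VRU in the right-turn blind zone; slow down
    return "SAFE"


if __name__ == "__main__":
    det = Detection(x=960.0, y=540.0, h_pixels=120.0)  # example detection
    d = estimate_distance(det)
    print(f"estimated distance: {d:.1f} m ->", warning_level(d, speed_mps=8.0))
```

The stopping-distance form above is one common way to realize a warning rule driven by distance and vehicle-speed information; the paper's own thresholds and control logic may differ.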





Acknowledgements

This work was supported by the National Natural Science Foundation of China under Grant Nos. 51975490 and 5177424, by the Science and Technology Projects of Sichuan under Grant Nos. 2020YFSY0070 and 2021JDRC0096, and by the Sichuan Science and Technology Program under Grant No. 2020JDTD0027.

Author information

Corresponding author

Correspondence to Zutao Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (MP4, 18,599 KB)


About this article


Cite this article

Han, L., Zheng, P., Li, H. et al. A novel early warning strategy for right-turning blind zone based on vulnerable road users detection. Neural Comput & Applic 34, 6187–6206 (2022). https://doi.org/10.1007/s00521-021-06800-2

