
An Automatic Loose Defect Detection Method for Catenary Bracing Wire Components Using Deep Convolutional Neural Networks and Image Processing

Publisher: IEEE

Abstract:

The bracing wire is one of the most important components of catenary support devices (CSDs) in high-speed railways; it connects the messenger wire base and the bracing wire hook to stabilize the current-carrying contact line. However, it is prone to loosening under the frequent impact of the pantograph. Therefore, this article proposes an automatic defect detection method for catenary bracing wires based on computer vision technologies. First, the catenary images are enhanced by a deep convolutional neural network (CNN) called image-adaptive 3-D LUTs (3-D lookup tables), which makes them clearer and easier to analyze. Next, the bracing wire is localized by an angle-aware CNN detector called dynamic anchor learning (DAL), which introduces an angle parameter into the object detection network so that the bracing wire is localized more accurately and the subsequent failure analysis is simplified. Last, the bracing wire is detected by the Hough transform (HT), and a looseness criterion is proposed based on the curvature change reflected in the distribution of HT peaks. Experimental results show that the proposed method effectively enhances the catenary images, localizes the bracing wires precisely, and correctly detects all bracing wire defects under different conditions.
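To make the last step of the pipeline concrete, the sketch below illustrates how a Hough transform can be applied to a cropped bracing-wire region and how the spread of the Hough peak angles could serve as a simple looseness indicator: a taut wire yields peaks clustered around a single angle, whereas a bent (loose) wire spreads its peaks over a wider angular range. This is a minimal illustrative example, not the authors' implementation; the file name, thresholds, and the `angle_spread_deg` criterion are assumptions for demonstration only.

```python
# Illustrative sketch of Hough-based line detection and a peak-spread
# looseness check (assumed parameters, not the paper's exact criterion).
import cv2
import numpy as np


def detect_loose_bracing_wire(roi_bgr, angle_spread_deg=5.0):
    """Return (is_loose, lines) for a cropped bracing-wire region."""
    gray = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    # Standard Hough transform: each detected line is parameterized as (rho, theta).
    lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=80)
    if lines is None:
        return False, []

    thetas = np.degrees(lines[:, 0, 1])
    # A straight, taut wire concentrates Hough peaks near one angle; a loose,
    # curved wire produces peaks over a wider angular range. (Angle wrap-around
    # near 0/180 degrees is ignored in this simplified sketch.)
    spread = thetas.max() - thetas.min()
    return spread > angle_spread_deg, lines


if __name__ == "__main__":
    roi = cv2.imread("bracing_wire_roi.jpg")  # hypothetical cropped detection result
    loose, detected = detect_loose_bracing_wire(roi)
    print("loose defect suspected:", loose)
```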
Article Sequence Number: 5016814
Date of Publication: 15 September 2021
