Leveraging single-shot detection and random sample consensus for wind turbine blade inspection

  • Original Research Paper
  • Published in: Intelligent Service Robotics

Abstract

Wind turbines require periodic inspection to ensure efficient power generation and a prolonged lifetime. Traditional inspection requires a technician to abseil from the top of the nacelle, with the attendant risk of a fall. To avoid this risk, operator-piloted drones have been used to inspect the blades; however, the task demands expert pilots, who fatigue quickly. Autonomous drones, in contrast, are not subject to human tiredness and can follow trajectories in a repeatable manner. Motivated by this, we introduce a vision-based blade detector that recognizes the orientation and relative position of the blades in order to generate a flight plan along which the drone can safely collect image data. The proposed detector extracts line features from the camera image and filters them to reduce the search space, using bounding boxes obtained with a single-shot detector based on a convolutional neural network. Finally, a random sample consensus (RANSAC) procedure finds the lines that best fit a geometric model of the wind turbine. We compare our deep learning approach against a color segmentation method, showing that it is up to 6 times faster. We also compare against a guided search during random sampling, which exploits the separate boxes detected by the network to reduce the number of outliers. We conclude with an illustrative example of how the proposed detector could be used for autonomous wind turbine inspection.
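
To make the two central steps of the pipeline concrete, the sketch below shows bounding-box filtering of line segments followed by a basic RANSAC line fit. This is a minimal illustration under stated assumptions, not the authors' implementation: it presumes line segments (e.g., from an LSD-style detector) and the network's boxes are already available as (x1, y1, x2, y2) tuples, it fits only a single dominant line rather than the full multi-blade geometric model, and the function names and tolerances (5 degrees, 10 pixels) are hypothetical.

    import numpy as np

    def filter_segments_by_boxes(segments, boxes):
        # Keep only line segments whose midpoint falls inside one of the
        # CNN-detected bounding boxes, shrinking the RANSAC search space.
        kept = []
        for x1, y1, x2, y2 in segments:
            mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
            if any(bx1 <= mx <= bx2 and by1 <= my <= by2
                   for (bx1, by1, bx2, by2) in boxes):
                kept.append((x1, y1, x2, y2))
        return kept

    def ransac_dominant_line(segments, iters=200, angle_tol=np.deg2rad(5.0),
                             dist_tol=10.0, seed=0):
        # Hypothesize a line from one randomly sampled segment, then count
        # segments agreeing with it in orientation and perpendicular
        # distance; keep the hypothesis with the most inliers.
        rng = np.random.default_rng(seed)
        segs = np.asarray(segments, dtype=float)
        if len(segs) == 0:
            return None, []
        best_line, best_inliers = None, []
        for _ in range(iters):
            x1, y1, x2, y2 = segs[rng.integers(len(segs))]
            theta = np.arctan2(y2 - y1, x2 - x1)           # line orientation
            normal = np.array([-np.sin(theta), np.cos(theta)])
            offset = normal @ np.array([x1, y1])           # line: n . p = offset
            inliers = []
            for s in segs:
                phi = np.arctan2(s[3] - s[1], s[2] - s[0])
                # orientation difference, wrapped so parallel lines match
                d_ang = abs((phi - theta + np.pi / 2) % np.pi - np.pi / 2)
                mid = np.array([(s[0] + s[2]) / 2, (s[1] + s[3]) / 2])
                if d_ang < angle_tol and abs(normal @ mid - offset) < dist_tol:
                    inliers.append(tuple(s))
            if len(inliers) > len(best_inliers):
                best_line, best_inliers = (theta, offset), inliers
        return best_line, best_inliers

A guided-search variant, of the kind compared in the paper, would draw each random sample only from segments inside a single detected box rather than from the whole filtered set, which is what reduces the proportion of outliers per hypothesis.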

Acknowledgements

This research project has been funded by Conacyt grant 291137 and Conacyt-INEGI project 268528.

Author information

Corresponding author

Correspondence to Jose Martinez-Carranza.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Parlange, R., Martinez-Carranza, J. Leveraging single-shot detection and random sample consensus for wind turbine blade inspection. Intell Serv Robotics 14, 611–628 (2021). https://doi.org/10.1007/s11370-021-00383-6

Keywords

Navigation