
DeepHaul: a deep learning and reinforcement learning-based smart automation framework for dump trucks

  • Regular Paper
  • Published in: Progress in Artificial Intelligence

Abstract

In surface mining operations, the haul truck accounts for the largest share of overall operating cost and is a significant source of operator injuries and fatalities; measures are therefore needed to improve the safety and efficiency of truck haulage. Given the lack of major recent advances in automation technology for the mining sector, this study addresses that gap by developing DeepHaul, a novel framework for adding intelligence to any dump truck using artificial intelligence and machine learning. The DeepHaul framework consists of two major components. The first gives the dump truck an object recognition capability using deep learning; experiments with different deep learning architectures, training batch sizes, and image resolutions were conducted to develop an optimal model with state-of-the-art performance for the haul truck. The second component provides steering-action decision making for the dump truck based on the current state of the haulage route; a reinforcement learning-based algorithm was designed, implemented, and tested for this purpose. The algorithm achieved 100% accuracy with respect to safety and an average accuracy of more than 97% with respect to haulage efficiency. By implementing the DeepHaul framework, existing autonomous haulage truck control technology can be substantially enhanced, improving the safety, efficiency, and effectiveness of mining operations.
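The perception component is described here only at the abstract level. As a minimal, hypothetical sketch of the kind of convolutional classifier such a module could be built from, the PyTorch example below treats image resolution and batch size as the tunable quantities the study reportedly varied; the network layout, class count, 64 × 64 resolution, and the name HaulRoadCNN are assumptions for illustration, not the authors' model.

import torch
import torch.nn as nn

class HaulRoadCNN(nn.Module):
    """Small convolutional classifier for haul-road scene objects
    (e.g. truck, berm, person, shovel) from low-resolution camera frames."""

    def __init__(self, num_classes: int = 4, image_size: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        flat = 32 * (image_size // 4) ** 2          # feature map size after two 2x poolings
        self.classifier = nn.Linear(flat, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One training step with the Adam optimizer on a dummy batch; batch size and
# image resolution are the hyper-parameters such an experiment would sweep.
model = HaulRoadCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(32, 3, 64, 64)    # dummy batch: 32 RGB frames at 64 x 64
labels = torch.randint(0, 4, (32,))    # dummy object-class labels
loss = loss_fn(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

Comparing such architecture variants while sweeping the batch size and input resolution (the appendix figure captions list 28 × 28, 50 × 50, and 64 × 64 resolutions) is one plausible way the reported experiments could be organized.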
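The decision-making component is likewise only summarized. A minimal tabular Q-learning sketch for selecting a steering action from a discretized haulage-route state is given below; the state encoding (a lateral-offset bin), the three-action space, the reward values, and the toy road model are invented purely for illustration and do not represent the authors' algorithm or its reported safety performance.

import random

ACTIONS = ["left", "straight", "right"]
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1   # learning rate, discount factor, exploration rate

def choose_action(q, state):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPSILON:
        return random.randrange(len(ACTIONS))
    values = q.setdefault(state, [0.0] * len(ACTIONS))
    return max(range(len(ACTIONS)), key=lambda a: values[a])

def q_update(q, state, action, reward, next_state):
    """One-step Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    q.setdefault(state, [0.0] * len(ACTIONS))
    q.setdefault(next_state, [0.0] * len(ACTIONS))
    q[state][action] += ALPHA * (reward + GAMMA * max(q[next_state]) - q[state][action])

def step(state, action):
    """Toy haul-road model: state is the truck's lateral-offset bin (-2 .. +2).
    Staying centred is rewarded; leaving the road ends the episode with a penalty."""
    offset = state + (action - 1)            # action 0 shifts left, 1 keeps straight, 2 shifts right
    if abs(offset) > 2:
        return state, -10.0, True            # off the road: large penalty, episode over
    return offset, (1.0 if offset == 0 else -0.1), False

q_table = {}
for _ in range(2000):                        # training episodes
    s, done = random.randint(-2, 2), False
    for _ in range(50):                      # steps per episode
        a = choose_action(q_table, s)
        s_next, r, done = step(s, a)
        q_update(q_table, s, a, r, s_next)
        s = s_next
        if done:
            break

After training, the greedy action (the argmax over q_table[s]) steers the truck back toward the road centre from any offset bin; in the actual framework the state would instead be derived from the deep learning perception component described above.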



Funding

The study was not funded by any organization.

Author information

Authors

Danish Ali & Samuel Frimpong

Corresponding author

Correspondence to Danish Ali.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

See Figs. 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42 and 43.

Fig. 31 F1-score for deep learning model variants 1 and 2

Fig. 32 F1-score for deep learning model variants 3 and 4

Fig. 33 F1-score for deep learning model variants 4 and 5

Fig. 34 F1-score for deep learning model variants 5 and 6

Fig. 35 F1-score for deep learning model variants 6 and 7

Fig. 36 F1-score for deep learning model variants 7 and 8

Fig. 37 F1-score for deep learning model variants 8 and 9

Fig. 38 Training time plots for deep learning model variants 1, 2, 3, and 4 with 28 × 28 image resolution

Fig. 39 Training time plots for deep learning model variants 4, 5, 6, 7, 8, and 9 with 28 × 28 image resolution

Fig. 40 Training time plots for deep learning model variants 1, 2, 3, and 4 with 50 × 50 image resolution

Fig. 41 Training time plots for deep learning model variants 4, 5, 6, 7, 8, and 9 with 50 × 50 image resolution

Fig. 42 Training time plots for deep learning model variants 1, 2, 3, and 4 with 64 × 64 image resolution

Fig. 43 Training time plots for deep learning model variants 4, 5, 6, 7, 8, and 9 with 64 × 64 image resolution


About this article


Cite this article

Ali, D., Frimpong, S. DeepHaul: a deep learning and reinforcement learning-based smart automation framework for dump trucks. Prog Artif Intell 10, 157–180 (2021). https://doi.org/10.1007/s13748-021-00233-7

