DOI: 10.1145/3640912.3640985
Research article

Neural network reliability analysis based on fault injection

Published: 22 February 2024

ABSTRACT

Neural networks are widely applied in fields such as drones and autonomous vehicles. While performance determines their effectiveness, reliability is equally important. Building on previous research into the factors that influence neural network reliability, this study uses a fault injection framework to investigate how layer type affects network reliability. By injecting faults and observing the maximum bit error rate each layer can tolerate, we evaluate the reliability of every layer in a network model. We also introduce additional layer types into traditional neural network models and run further experiments to examine the reliability relationships among these layers within the same model. The fault injection results show that the convolutional layer is the most susceptible to disruption, and its reliability decreases as the number of convolutional layers increases. In contrast, the reliability of the fully connected layer improves as the number of layers increases.
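To make the methodology concrete, the sketch below illustrates the kind of experiment the abstract describes: flipping bits in a layer's weights at a chosen bit error rate (BER) and measuring how the network's outputs change. This is a minimal sketch, assuming a PyTorch model and single-bit flips in float32 weights; the helper names (flip_random_bit, inject_faults), the toy CNN, and the BER values are illustrative assumptions, not details taken from the paper or its fault injection framework.

```python
# Minimal, illustrative sketch of weight bit-flip fault injection in PyTorch.
# The toy CNN, helper names, and BER sweep below are assumptions made only to
# illustrate the per-layer reliability experiment described in the abstract.
import copy
import random
import struct

import torch
import torch.nn as nn


def flip_random_bit(value: float) -> float:
    """Flip one random bit in the IEEE-754 float32 encoding of `value`."""
    bits = struct.unpack("<I", struct.pack("<f", value))[0]
    bits ^= 1 << random.randrange(32)
    return struct.unpack("<f", struct.pack("<I", bits))[0]


def inject_faults(layer: nn.Module, bit_error_rate: float) -> None:
    """Corrupt a fraction of the layer's weights in place, one bit flip each."""
    with torch.no_grad():
        flat = layer.weight.view(-1)
        n_faults = max(1, int(bit_error_rate * flat.numel()))
        for idx in random.sample(range(flat.numel()), n_faults):
            flat[idx] = flip_random_bit(flat[idx].item())


# Toy network with one convolutional and one fully connected layer (illustrative).
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 28 * 28, 10),
)

x = torch.randn(16, 1, 28, 28)
with torch.no_grad():
    clean_pred = model(x).argmax(dim=1)

# Sweep the bit error rate on the convolutional layer and count how many
# predictions change relative to the fault-free run (a simple reliability proxy).
for ber in (1e-3, 1e-2, 1e-1):
    faulty = copy.deepcopy(model)
    inject_faults(faulty[0], ber)  # faulty[0] is the Conv2d layer
    with torch.no_grad():
        faulty_pred = faulty(x).argmax(dim=1)
    changed = (faulty_pred != clean_pred).float().mean().item()
    print(f"BER={ber:.0e}: {changed:.1%} of predictions changed")
```

A full experiment along the lines of the abstract would repeat such sweeps for each layer type and layer count, recording the largest bit error rate a layer tolerates before the model's outputs degrade.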

Published in

CNML '23: Proceedings of the 2023 International Conference on Communication Network and Machine Learning
October 2023, 446 pages
ISBN: 9798400716683
DOI: 10.1145/3640912

Copyright © 2023 ACM

Publisher

Association for Computing Machinery, New York, NY, United States
