Fine-Grained Channel Pruning for Deep Residual Neural Networks

  • Conference paper
  • In: Artificial Neural Networks and Machine Learning – ICANN 2020 (ICANN 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12397)

Abstract

Pruning residual neural networks is challenging because of the constraints induced by cross-layer connections. Many existing approaches assign the channels connected by skip connections to the same group and prune them simultaneously, which limits the achievable pruning ratio on those filters. Instead, we propose a Fine-grained Channel Pruning (FCP) method that allows any channel to be pruned independently. To avoid misalignment between a convolution and its skip connection, we always keep the residual addition operations alive; we can therefore obtain a novel, efficient residual architecture by removing any unimportant channel without the alignment constraint. Besides classification, we further apply FCP to residual models for image super-resolution, a low-level vision task. Extensive experimental results show that FCP achieves better performance than other state-of-the-art methods in terms of parameter count and computation cost. Notably, on CIFAR-10, FCP reduces more than 78% of the FLOPs of ResNet-56 with no accuracy drop, and it achieves more than 48% FLOPs reduction on MSRResNet with negligible performance degradation.

Supported by the National Key R&D Program of China (2018YFB0904900, 2018YFB0904902).
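
To make the mechanism concrete, the following is a minimal PyTorch sketch of the idea described in the abstract, written under our own assumptions: the class PrunedResidualBlock and its kept_in/kept_out index bookkeeping are illustrative, not the authors' implementation. The pruned branch convolves only the surviving input channels and scatters its slim output back into a full-width zero tensor, so the residual addition stays alive and needs no channel alignment.

    # Hypothetical sketch of fine-grained channel pruning in a residual
    # block; names and bookkeeping are ours, not the paper's released code.
    import torch
    import torch.nn as nn

    class PrunedResidualBlock(nn.Module):
        """Residual block whose convolution keeps only a subset of channels.

        The residual addition always stays "alive": the slim branch output
        is scattered back into the full-width feature map at the indices of
        the surviving channels, so the convolution and the skip connection
        never need to agree on channel counts.
        """

        def __init__(self, width, kept_in, kept_out):
            super().__init__()
            # Indices of the input/output channels that survive pruning.
            self.register_buffer("kept_in", torch.as_tensor(list(kept_in)))
            self.register_buffer("kept_out", torch.as_tensor(list(kept_out)))
            self.conv = nn.Conv2d(len(self.kept_in), len(self.kept_out),
                                  kernel_size=3, padding=1)
            self.bn = nn.BatchNorm2d(len(self.kept_out))
            self.relu = nn.ReLU(inplace=True)
            self.width = width

        def forward(self, x):
            assert x.size(1) == self.width
            # The slim branch only sees and produces surviving channels.
            branch = self.relu(self.bn(self.conv(x[:, self.kept_in])))
            # Scatter the slim output back to full width (zeros elsewhere),
            # then add the identity: the addition itself is never removed.
            out = torch.zeros_like(x)
            out[:, self.kept_out] = branch
            return self.relu(out + x)

    # A 16-channel block whose conv keeps 10 input and 6 output channels.
    block = PrunedResidualBlock(16, kept_in=range(10),
                                kept_out=[0, 2, 3, 7, 9, 15])
    y = block(torch.randn(1, 16, 32, 32))
    print(y.shape)  # torch.Size([1, 16, 32, 32])

Because the addition always operates at full width, kept_in and kept_out can be chosen independently for every convolution, which is exactly the freedom that approaches grouping skip-connected channels give up.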

Author information

Corresponding author

Correspondence to Kai Huang.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Chen, S., Huang, K., Xiong, D., Li, B., Claesen, L. (2020). Fine-Grained Channel Pruning for Deep Residual Neural Networks. In: Farkaš, I., Masulli, P., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2020. ICANN 2020. Lecture Notes in Computer Science, vol 12397. Springer, Cham. https://doi.org/10.1007/978-3-030-61616-8_1

  • DOI: https://doi.org/10.1007/978-3-030-61616-8_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-61615-1

  • Online ISBN: 978-3-030-61616-8

  • eBook Packages: Computer Science, Computer Science (R0)
