Abstract
In recent years, deep convolutional neural networks (CNNs) have significantly boosted the performance of various applications, but the high computational complexity of these models hinders their deployment on devices with limited computational resources. Dynamic channel pruning has therefore been proposed and widely adopted for compressing CNN-based models. In this paper, we propose a novel plug-and-play dynamic network pruning module. With only a very slight extra computational burden, it achieves performance comparable to the original model. Specifically, the proposed module measures the importance of each convolutional channel so that the CNN can be pruned with only a small decrease in accuracy. The module keeps its own computational cost low by using global pooling and a channel-wise 1-dimensional convolution that exploits the locality of channels. Comprehensive experimental results demonstrate the effectiveness of our module, which achieves a better trade-off between performance and required computational resources than competing methods. Concretely, our dynamic pruning module reduces the FLOPs of VGG16 by 51.1% with only 0.18% top-1 accuracy degradation on CIFAR-10.
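To make the idea concrete, the following is a minimal PyTorch-style sketch of a gate built from global average pooling and a channel-wise 1-D convolution that scores channels from their local neighbours and dynamically masks the least relevant ones. It is an illustration only, assuming a simple top-k gating rule; the class and parameter names (DynamicChannelGate, keep_ratio, etc.) are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn

class DynamicChannelGate(nn.Module):
    """Hypothetical plug-and-play gate: global average pooling followed by a
    channel-wise 1-D convolution scores each channel from its local
    neighbours; low-scoring channels are zeroed (dynamically pruned)."""

    def __init__(self, channels: int, kernel_size: int = 3, keep_ratio: float = 0.5):
        super().__init__()
        # 1-D conv over the channel dimension captures local channel relevance
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Global average pooling -> per-channel descriptor of shape (B, C)
        desc = x.mean(dim=(2, 3))
        # Channel-wise 1-D convolution over neighbouring channels -> saliency scores
        scores = torch.sigmoid(self.conv(desc.unsqueeze(1)).squeeze(1))  # (B, C)
        # Keep the top-k channels per sample and zero out (prune) the rest
        k = max(1, int(c * self.keep_ratio))
        topk = scores.topk(k, dim=1).indices
        mask = torch.zeros_like(scores).scatter_(1, topk, 1.0)
        return x * (scores * mask).view(b, c, 1, 1)

# Example: gate the output of a 64-channel convolution inside an existing block
gate = DynamicChannelGate(channels=64, keep_ratio=0.5)
y = gate(torch.randn(8, 64, 32, 32))  # roughly half the channels are zeroed per sample
```

Because the gate only touches pooled per-channel statistics, its overhead is negligible compared with the convolutions it prunes, which is consistent with the "very slight extra computational burden" claimed above.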
Cite this paper
Lin, L., Liu, W., Yu, Y. (2022). Dynamic Network Pruning Based on Local Channel-Wise Relevance. In: Sun, F., Hu, D., Wermter, S., Yang, L., Liu, H., Fang, B. (eds) Cognitive Systems and Information Processing. ICCSIP 2021. Communications in Computer and Information Science, vol 1515. Springer, Singapore. https://doi.org/10.1007/978-981-16-9247-5_4