Filter Pruning via Similarity Clustering for Deep Convolutional Neural Networks

  • Conference paper, Neural Information Processing (ICONIP 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13623)


Abstract

Network pruning is a technique for obtaining a smaller, lightweight model by removing redundant structure from a pre-trained model. However, existing methods mainly rank filters by their importance to the whole network. Unlike previous methods, this paper proposes a filter pruning strategy called Filter Pruning via Similarity Clustering (FPSC). FPSC measures the similarity between filters by their Euclidean distance and then, among similar filters, removes the one with the smaller sum of distances to its k nearest neighbors, on the grounds that such a filter is the most likely to be replaced by its neighbors. FPSC is applied to a variety of networks and compared with existing filter pruning approaches, and the experimental results show that it achieves better pruning performance. Notably, on CIFAR-10, FPSC reduces the FLOPs and parameters of GoogLeNet by more than 70%, with accuracy even 0.09% higher than the baseline model. Moreover, on ImageNet, FPSC reduces the FLOPs of ResNet-50 by more than 43.1% and its parameters by 42.2%, while accuracy drops only 0.66%.
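The selection criterion described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes one convolutional layer's weights as an array of shape (n_filters, C, kH, kW), and the function name, `k`, and `prune_ratio` are hypothetical choices for the sketch.

```python
import numpy as np

def fpsc_prune_indices(weights, k=3, prune_ratio=0.3):
    """Sketch of the FPSC criterion: among a layer's filters, select for
    removal those with the smallest sum of Euclidean distances to their
    k nearest neighbors, i.e. the filters most easily replaced by
    similar neighbors.

    weights: array of shape (n_filters, C, kH, kW) from one conv layer.
    Returns the indices of the filters to prune.
    """
    n = weights.shape[0]
    flat = weights.reshape(n, -1)
    # Pairwise Euclidean distances between flattened filters.
    diff = flat[:, None, :] - flat[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)  # exclude each filter's self-distance
    # Sum of each filter's k smallest neighbor distances.
    knn_sum = np.sort(dist, axis=1)[:, :k].sum(axis=1)
    n_prune = int(n * prune_ratio)
    # Smallest sums = most redundant filters.
    return np.argsort(knn_sum)[:n_prune]
```

For example, a layer in which two filters are near-duplicates would yield near-zero k-nearest-neighbor sums for that pair, so those filters would be pruned first.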



Acknowledgments

This work was partially supported by Collaborative Innovation Center of Novel Software Technology and Industrialization, and Project Funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions.

Author information

Corresponding author: Wangshu Yao.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Song, K., Yao, W., Zhu, X. (2023). Filter Pruning via Similarity Clustering for Deep Convolutional Neural Networks. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Lecture Notes in Computer Science, vol 13623. Springer, Cham. https://doi.org/10.1007/978-3-031-30105-6_8


  • Print ISBN: 978-3-031-30104-9

  • Online ISBN: 978-3-031-30105-6
