Abstract
Compression of convolutional neural network models has recently been dominated by pruning approaches. One class of previous works focuses solely on pruning unimportant filters to achieve network compression. Another important direction is the design of sparsity-inducing constraints, which has likewise been explored in isolation. This paper presents a novel training scheme, based on composite constraints, that prunes redundant filters and minimizes their effect on overall network learning via sparsity promotion. In addition, in contrast to prior works that employ pseudo-norm-based sparsity-inducing constraints, we propose a sparsity scheme based on gradient counting within our framework. Experiments on several pixel-wise segmentation benchmarks show that the number of neurons and the memory footprint of networks at test time are significantly reduced without affecting performance. Two well-known architectures, MobileNetV3 and UNet, are used to evaluate the proposed scheme. Our compression method not only reduces the number of parameters but also improves performance compared to MobileNetV3, which is an already optimized architecture.
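To make the gradient-counting idea concrete, the sketch below shows one plausible PyTorch-style interpretation: for each convolutional filter, count the training steps in which its gradient magnitude stays below a small threshold, and treat filters that rarely receive a significant gradient as redundant candidates for pruning. This is a minimal illustration under stated assumptions; the class name GradientCounter, the threshold eps, and the ratio cutoff are hypothetical and not the paper's exact formulation.

import torch
import torch.nn as nn

class GradientCounter:
    """Illustrative sketch of a gradient-counting pruning criterion.

    Counts, per output filter of a Conv2d layer, how often its gradient
    magnitude stays below a small threshold during training. Filters that
    rarely receive significant gradient are flagged as redundant.
    (Hypothetical helper, not the authors' implementation.)
    """

    def __init__(self, conv: nn.Conv2d, eps: float = 1e-4):
        self.conv = conv
        self.eps = eps
        self.low_grad_counts = torch.zeros(conv.out_channels)
        self.steps = 0

    def update(self):
        # Call after loss.backward(): summarise each filter's gradient.
        grad = self.conv.weight.grad              # (out_ch, in_ch, kH, kW)
        per_filter = grad.abs().flatten(1).mean(dim=1)   # one value per filter
        self.low_grad_counts += (per_filter < self.eps).float()
        self.steps += 1

    def redundant_filters(self, ratio: float = 0.9):
        # Filters whose gradients were negligible in >= `ratio` of all steps.
        return torch.nonzero(self.low_grad_counts >= ratio * self.steps).squeeze(1)

# Usage sketch inside a standard training loop (assumed model/optimizer):
#   counter = GradientCounter(model.encoder_conv1)
#   loss.backward(); counter.update(); optimizer.step()
#   prune_idx = counter.redundant_filters()   # zero out or remove these filters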
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Khan, T.M., Naqvi, S.S., Robles-Kelly, A., Meijering, E. (2023). Neural Network Compression by Joint Sparsity Promotion and Redundancy Reduction. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Lecture Notes in Computer Science, vol 13623. Springer, Cham. https://doi.org/10.1007/978-3-031-30105-6_51
DOI: https://doi.org/10.1007/978-3-031-30105-6_51
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-30104-9
Online ISBN: 978-3-031-30105-6