PPT-KP: Pruning Point Training-based Kernel Pruning for Deep Convolutional Neural Networks


Abstract:

Pruning, a representative method for compressing large convolutional neural network (CNN) models, has been studied mainly in two directions: weight pruning and filter pruning, each with clear limitations arising from its intrinsic characteristics. To address this, research on kernel pruning, which combines the advantages of both methods, has recently advanced. In this study, pruning point training-based kernel pruning (PPT-KP) is proposed to address the problems of existing kernel pruning methods. PPT-KP drives the L1 norm of each kernel toward zero with an adaptive regularizer that applies L1 regularization of varying intensity according to the kernel's L1 norm, securing network sparsity and creating multiple margin spaces for pruning. Because several pruning points can thus be created, effective kernel pruning becomes possible. PPT-KP outperformed several existing filter pruning and kernel pruning methods on various networks and datasets in terms of the trade-off between FLOPs reduction and accuracy drop. In particular, PPT-KP reduced the parameters and FLOPs of ResNet-56 on the CIFAR-10 dataset by 77.2% and 68.9%, respectively, with only a 0.05% accuracy degradation.
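To illustrate the idea of norm-dependent regularization described in the abstract, the sketch below computes per-kernel L1 norms for every Conv2d layer and penalizes them with one of two strengths. This is a minimal PyTorch sketch under assumptions, not the paper's actual method: the threshold `tau`, the two intensity values, and the choice to regularize small-norm kernels more strongly are all illustrative placeholders.

```python
import torch
import torch.nn as nn

def adaptive_kernel_l1_penalty(model: nn.Module,
                               tau: float = 1e-2,
                               lambda_strong: float = 1e-3,
                               lambda_weak: float = 1e-5) -> torch.Tensor:
    """Sum an L1 penalty over every 2-D kernel (one spatial slice per
    input/output channel pair). Kernels whose L1 norm is already small
    receive a stronger penalty, pushing them toward zero so they can be
    pruned later. All hyperparameters here are illustrative assumptions."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            # weight shape: (out_channels, in_channels, kH, kW)
            kernel_norms = module.weight.abs().sum(dim=(2, 3))  # L1 norm per kernel
            strength = torch.where(kernel_norms < tau,
                                   torch.full_like(kernel_norms, lambda_strong),
                                   torch.full_like(kernel_norms, lambda_weak))
            penalty = penalty + (strength * kernel_norms).sum()
    return penalty

# Usage in a training loop (sketch):
#   loss = criterion(model(x), y) + adaptive_kernel_l1_penalty(model)
#   loss.backward(); optimizer.step()
```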
Date of Conference: 11-13 June 2023
Date Added to IEEE Xplore: 07 July 2023
Conference Location: Hangzhou, China

