Neural network pruning is widely applied in mobile applications. Previous pruning methods mainly rely on ad-hoc criteria to evaluate channel importance. In this paper, we propose an effective metaheuristic sub-structure selection (MetaSelection) method for neural network pruning. MetaSelection exploits an evolutionary algorithm (EA) to search for a sub-structure that satisfies the given resource constraints. In contrast to previous AutoML-based methods, which determine the pruning rate and select channels in a cascaded fashion via hand-crafted criteria, MetaSelection accomplishes both automatically and simultaneously. Treating the vast search space of channel selection as a combinatorial optimization problem, we further employ a coarse-to-fine strategy and a novel probability distribution crossover (PDC) operator to accelerate the search. Moreover, MetaSelection prunes the network globally rather than layer by layer. We evaluate MetaSelection on several popular deep neural networks, achieving superior results with adaptive depth and width. Concretely, on ImageNet, MetaSelection achieves 71.5% top-1 accuracy on MobileNetV2 under a 70% FLOPs constraint, and a 30% FLOPs reduction with 76.4% top-1 accuracy on ResNet50.
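The abstract describes channel selection as a constrained combinatorial search driven by an EA. The following is a minimal illustrative sketch of that idea, not the paper's actual method: channels are encoded as keep/prune bits, infeasible masks (over the FLOPs budget) are penalized, and children are sampled from the parents' per-channel keep frequencies as a stand-in for the distribution-style crossover. The fitness function, FLOPs model, and all hyperparameters here are hypothetical toys; in the paper, fitness would come from validating the pruned network.

```python
import random

random.seed(0)

# Toy setup: one keep/prune bit per channel, uniform FLOPs cost per channel.
N_CHANNELS = 32
FLOPS_BUDGET = 0.7 * N_CHANNELS  # mimics a 70% FLOPs constraint

# Stand-in channel "importance" scores (hypothetical; a real run would
# measure validation accuracy of the pruned network instead).
weights = [1.0 / (i + 1) for i in range(N_CHANNELS)]

def flops(mask):
    return float(sum(mask))

def fitness(mask):
    if flops(mask) > FLOPS_BUDGET:
        return -1.0  # infeasible: violates the resource constraint
    return sum(w * m for w, m in zip(weights, mask))

def random_mask():
    return [random.randint(0, 1) for _ in range(N_CHANNELS)]

def distribution_crossover(parents):
    # Illustrative distribution-style crossover (an assumption, not the
    # paper's PDC): sample each bit from the parents' empirical
    # keep-probability at that channel position.
    probs = [sum(p[i] for p in parents) / len(parents) for i in range(N_CHANNELS)]
    return [1 if random.random() < p else 0 for p in probs]

def mutate(mask, rate=0.05):
    return [1 - b if random.random() < rate else b for b in mask]

def evolve(pop_size=40, generations=30, elite=10):
    pop = [random_mask() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:elite]
        pop = parents + [mutate(distribution_crossover(parents))
                         for _ in range(pop_size - elite)]
    return max(pop, key=fitness)

best = evolve()
print("kept channels:", sum(best), "fitness:", round(fitness(best), 3))
```

Because the mask covers every channel at once, the search is global rather than layer-by-layer, matching the abstract's framing; a coarse-to-fine variant would first search over channel groups before refining to individual channels.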