Abstract:
Modern edge devices have become powerful enough to run deep learning tasks, but challenges remain, such as limited computing power, memory space, and energy. To address these challenges, methods such as channel pruning, network quantization, and early exiting have been introduced to reduce the computational load of these tasks. In this paper, we propose LazyNet, an alternative network that applies skip modules, instead of early exiting, to a pre-trained neural network. We use a small module that preserves spatial information and provides a metric to decide the computational flow. If a data sample is easy, the network skips most of the computation; if not, the network processes the sample in full for accurate classification. We test our model with various backbone networks on the CIFAR-10 dataset and show reduced inference time, lower memory consumption, and increased accuracy.
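The abstract does not spell out the skip-module design, so the following is a minimal PyTorch sketch of the idea under stated assumptions: a lightweight gate (here called SkipModule, a hypothetical name) pools spatial information into a confidence score, and a wrapper (LazyBlock, also illustrative) either bypasses a pre-trained backbone block or runs it in full. The threshold parameter and all module names are assumptions for illustration, not details taken from the paper.

    import torch
    import torch.nn as nn

    class SkipModule(nn.Module):
        # Hypothetical gate: pools the spatial feature map into a compact
        # summary and scores whether the wrapped block can be skipped.
        def __init__(self, channels: int, threshold: float = 0.5):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)   # spatial summary per channel
            self.score = nn.Linear(channels, 1)   # confidence that skipping is safe
            self.threshold = threshold

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return torch.sigmoid(self.score(self.pool(x).flatten(1)))

    class LazyBlock(nn.Module):
        # Wraps one pre-trained backbone block; "easy" inputs bypass it.
        # Assumes the block preserves shape (e.g., a residual block), so
        # the skipped path and the computed path are interchangeable.
        def __init__(self, block: nn.Module, channels: int):
            super().__init__()
            self.block = block
            self.gate = SkipModule(channels)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Per-batch decision for simplicity; true per-sample routing
            # would mask individual samples instead of averaging.
            if self.gate(x).mean() > self.gate.threshold:
                return x                 # easy sample: skip the heavy block
            return self.block(x)         # hard sample: full computation

In use, each stage of a pre-trained backbone would be wrapped, e.g. LazyBlock(resnet_layer, channels=64), so that the gates decide at inference time how much of the network a given input actually traverses.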
Published in: 2022 13th International Conference on Information and Communication Technology Convergence (ICTC)
Date of Conference: 19-21 October 2022
Date Added to IEEE Xplore: 25 November 2022