ABSTRACT
Recent years have witnessed unprecedented growth in sensor-based indoor activity recognition. Moreover, incorporating Deep Neural Network (DNN) models has significantly improved the recognition performance for indoor activities. In this paper, we propose a knowledge distillation based, economical, and efficient indoor activity recognition approach for low-cost, resource-constrained devices. We distill knowledge from a teacher and a trainee (cumbersome DNN models) to train a student (compressed DNN model). Initially, both the student and the trainee are beginners, and the trainee helps the student learn from the teacher. After a certain number of steps, the student is mature enough to learn directly from the teacher. We also introduce an early halting mechanism that simultaneously reduces the floating-point operations and the training time of the student model.
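To make the teacher-trainee-student idea concrete, the following is a minimal sketch in PyTorch. It assumes a standard soft-target distillation loss and a simple loss-plateau criterion for early halting; the actual architectures, loss weighting, phase-switching rule, and halting condition used in the paper are not specified in this abstract, so the names (`distillation_loss`, `train_student`, `warmup_epochs`, `patience`) and defaults below are illustrative assumptions only.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, guide_logits, labels, T=4.0, alpha=0.7):
    """Soft-target KD loss: KL divergence against the guide's softened
    outputs plus cross-entropy on the hard labels (assumed formulation)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(guide_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_student(student, trainee, teacher, loader, epochs=50,
                  warmup_epochs=10, patience=3, lr=1e-3):
    """Phase 1: the trainee guides the beginner student.
    Phase 2: the matured student learns directly from the teacher.
    Early halting: stop once the loss stops improving, cutting
    floating-point operations and training time (assumed criterion)."""
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    best, stall = float("inf"), 0
    for epoch in range(epochs):
        guide = trainee if epoch < warmup_epochs else teacher
        epoch_loss = 0.0
        for x, y in loader:
            with torch.no_grad():
                guide_logits = guide(x)          # cumbersome model's soft targets
            loss = distillation_loss(student(x), guide_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
            epoch_loss += loss.item()
        if epoch_loss < best - 1e-4:
            best, stall = epoch_loss, 0
        else:
            stall += 1
            if stall >= patience:                # early halting
                break
    return student
```

In this sketch the switch from trainee to teacher is a fixed epoch threshold; in practice the switch could instead be triggered by the student's validation accuracy reaching a maturity criterion.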