DOI: 10.1145/3384419.3430450

Teacher, trainee, and student based knowledge distillation technique for monitoring indoor activities: poster abstract

Published: 16 November 2020

ABSTRACT

Recent years have witnessed unprecedented growth in sensor-based indoor activity recognition, and incorporating Deep Neural Network (DNN) models has significantly improved recognition performance. In this paper, we propose a knowledge-distillation-based, economical, and efficient indoor activity recognition approach for low-cost, resource-constrained devices. We distill knowledge from a teacher and a trainee (cumbersome DNN models) to train a student (compressed DNN model). Initially, both the student and the trainee are beginners, and the trainee helps the student learn from the teacher. After a certain number of steps, the student is mature enough to learn directly from the teacher. We also introduce an early-halting mechanism that simultaneously reduces the floating-point operations and the training time of the student model.
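
To make the two-phase schedule concrete, here is a minimal PyTorch sketch, not the authors' implementation: it assumes a standard Hinton-style soft-target loss, a fixed step count as a hypothetical maturity criterion, and a simple loss-plateau rule standing in for the early-halting mechanism. None of these specifics (losses, maturity test, halting rule) are given in the abstract.

    import torch
    import torch.nn.functional as F

    def kd_loss(student_logits, guide_logits, T=4.0):
        # Standard soft-target distillation loss: KL divergence between
        # temperature-softened guide and student output distributions.
        soft = F.softmax(guide_logits / T, dim=1)
        logp = F.log_softmax(student_logits / T, dim=1)
        return F.kl_div(logp, soft, reduction="batchmean") * (T * T)

    def train_student(teacher, trainee, student, loader, opt,
                      maturity_steps=1000, halt_patience=50):
        # Hypothetical schedule: the trainee guides the student for the
        # first `maturity_steps`; afterwards the student learns directly
        # from the teacher. (The trainee's own learning from the teacher
        # is omitted here for brevity.)
        best, stale = float("inf"), 0
        for step, (x, y) in enumerate(loader):
            with torch.no_grad():
                guide = trainee(x) if step < maturity_steps else teacher(x)
            s_logits = student(x)
            loss = kd_loss(s_logits, guide) + F.cross_entropy(s_logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
            # Illustrative early halting: stop once the loss plateaus,
            # cutting both training time and floating-point operations.
            if loss.item() < best - 1e-3:
                best, stale = loss.item(), 0
            else:
                stale += 1
                if stale >= halt_patience:
                    break

A loss-plateau check is only one plausible halting criterion; the point of the sketch is that stopping the student early saves gradient steps and, with them, floating-point operations on the resource-constrained device.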



    • Published in

      SenSys '20: Proceedings of the 18th Conference on Embedded Networked Sensor Systems
      November 2020
      852 pages
ISBN: 9781450375900
DOI: 10.1145/3384419

      Copyright © 2020 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Qualifiers

      • short-paper

      Acceptance Rates

Overall Acceptance Rate: 174 of 867 submissions, 20%
