
A Novel Two-Stage Knowledge Distillation Framework for Skeleton-Based Action Prediction


Abstract:

This letter addresses the challenging problem of action prediction from partially observed sequences of skeletons. Towards this goal, we propose a novel two-stage knowledge distillation framework, which transfers prior knowledge to assist the early prediction of ongoing actions. In the first stage, the action prediction model (also referred to as the student) learns from a couple of teachers to adaptively distill action knowledge at different progress levels of partial sequences. The learned student then acts as a teacher in the next stage, with the objective of optimizing a better action prediction model in a self-training manner. We design an adaptive self-training strategy that weakens the supervision from the annotated labels, since this hard supervision is too strict for partial sequences that lack sufficient discriminative information. Finally, the action prediction models trained in the two stages jointly constitute a two-stream architecture for action prediction. Extensive experiments on the large-scale NTU RGB+D dataset validate the effectiveness of the proposed method.
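
To make the two ideas in the abstract concrete, the sketch below illustrates, in PyTorch, (1) a stage-one loss where the student on a partial sequence is distilled from teacher soft targets, and (2) a stage-two self-training loss where the stage-one student serves as the teacher and the hard-label supervision is deliberately weakened (here via label smoothing driven by the observation ratio). This is a minimal illustration only: the function names, temperature, loss weights, and smoothing schedule are assumptions, not the authors' implementation of their adaptive strategy.

```python
# Hypothetical sketch of the two training stages described in the abstract.
# Not the authors' code; the weighting and smoothing schedules are assumptions.
import torch
import torch.nn.functional as F


def stage_one_distillation_loss(student_logits, teacher_logits, labels,
                                temperature=4.0, alpha=0.7):
    """Stage 1: blend soft-target KL distillation with cross-entropy."""
    # Teacher soft targets, softened by the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean")
    kd = kd * temperature ** 2  # usual temperature scaling of the KD term
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce


def stage_two_self_training_loss(student_logits, stage_one_logits, labels,
                                 observation_ratio, temperature=4.0):
    """Stage 2: self-training with weakened hard-label supervision.

    The stage-one student now acts as the teacher; the weight on the hard
    labels grows with the observation ratio, and the one-hot labels are
    smoothed more aggressively when less of the sequence is observed
    (a simple stand-in for the paper's adaptive strategy).
    """
    soft_teacher = F.softmax(stage_one_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean")
    kd = kd * temperature ** 2
    smoothing = 0.3 * (1.0 - observation_ratio)  # weaken one-hot supervision
    ce = F.cross_entropy(student_logits, labels, label_smoothing=smoothing)
    return observation_ratio * ce + (1.0 - observation_ratio) * kd


if __name__ == "__main__":
    batch, num_classes = 8, 60  # NTU RGB+D has 60 action classes
    student = torch.randn(batch, num_classes, requires_grad=True)
    teacher = torch.randn(batch, num_classes)
    labels = torch.randint(0, num_classes, (batch,))
    print(stage_one_distillation_loss(student, teacher, labels))
    print(stage_two_self_training_loss(student, teacher, labels,
                                       observation_ratio=0.4))
```

At inference time, the abstract indicates that the stage-one and stage-two models are combined as a two-stream architecture; a straightforward realization would fuse their class scores for the partial input sequence.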
Published in: IEEE Signal Processing Letters (Volume 29)
Page(s): 1918-1922
Date of Publication: 05 September 2022

