
Task-Specific Feature Purifying in Radar-Based Human Pose Estimation


Abstract:

Recently, radar-based human pose estimation has attracted increasing attention; it aims to reconstruct the human skeleton from radar signals with advanced deep neural networks. However, one fundamental challenge is the interference from task-irrelevant components introduced during feature learning by the propagation characteristics of radar signals. In this article, we present a two-stage framework for task-specific feature purifying that distills feature representations with high discriminability for the pose estimation task but little interpersonal discrepancy, which is essential for handling the difficulties that task-irrelevant interference causes in reconstructing the human skeleton. The logic behind the proposed two-stage architecture is that the discrepancy among individual persons' data is first eliminated with an adversarial auto-encoder module to distill interpersonal-independent features, and then the pose-irrelevant components are removed through a feature disentanglement module to form task-specific features for human pose estimation. The two-stage architecture proves considerably effective at purifying features against the interference from task-irrelevant components involved in the feature learning process. An experimental study is presented to illustrate the effectiveness of the proposed approach.
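The forward data flow of the two-stage architecture described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: all layer sizes, class names, and the 17-joint skeleton head are illustrative assumptions, and the adversarial and disentanglement training objectives are omitted (only the inference-time pipeline is shown).

```python
import numpy as np

rng = np.random.default_rng(0)


class AdversarialAutoEncoder:
    """Stage 1 (sketch): encode radar features into a latent in which
    person identity is suppressed. In the paper this is trained
    adversarially against an identity discriminator; here only the
    forward pass is shown, with random (untrained) weights."""

    def __init__(self, d_in, d_latent):
        self.w_enc = rng.normal(0.0, 0.1, (d_in, d_latent))
        self.b_enc = np.zeros(d_latent)
        self.w_dec = rng.normal(0.0, 0.1, (d_latent, d_in))
        self.b_dec = np.zeros(d_in)

    def encode(self, x):
        return np.tanh(x @ self.w_enc + self.b_enc)

    def decode(self, z):
        # Reconstruction branch used by the auto-encoder loss.
        return z @ self.w_dec + self.b_dec


class FeatureDisentangler:
    """Stage 2 (sketch): split the interpersonal-independent latent into
    a pose-specific part (kept for the task) and a pose-irrelevant part
    (discarded at inference)."""

    def __init__(self, d_latent, d_pose, d_other):
        self.w_pose = rng.normal(0.0, 0.1, (d_latent, d_pose))
        self.w_other = rng.normal(0.0, 0.1, (d_latent, d_other))

    def split(self, z):
        f_pose = np.tanh(z @ self.w_pose)
        f_other = np.tanh(z @ self.w_other)
        return f_pose, f_other


# Hypothetical sizes: 256-d radar feature, 64-d latent, 17 joints x 3 coords.
aae = AdversarialAutoEncoder(d_in=256, d_latent=64)
dis = FeatureDisentangler(d_latent=64, d_pose=48, d_other=16)
w_head = rng.normal(0.0, 0.1, (48, 17 * 3))

x = rng.normal(size=(8, 256))                  # a batch of radar features
z = aae.encode(x)                              # stage 1: identity-suppressed latent
f_pose, f_other = dis.split(z)                 # stage 2: keep pose-specific part
joints = (f_pose @ w_head).reshape(8, 17, 3)   # skeleton regression head
```

At inference only the pose-specific branch feeds the skeleton regressor; the reconstruction and pose-irrelevant branches exist to shape the latent space during training.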
Published in: IEEE Transactions on Aerospace and Electronic Systems ( Volume: 59, Issue: 6, December 2023)
Page(s): 9285 - 9298
Date of Publication: 22 September 2023


