
Human Video Instance Segmentation and Tracking via Data Association and Single-Stage Detector


Abstract:

Human video instance segmentation plays an important role in computer understanding of human activities and is widely used in artificial intelligence applications for consumer electronics. In this paper, we develop a new method for human video instance segmentation based on a one-stage detector. To track instances across the video, we adopt a data association strategy that matches the same instance across the video sequence by jointly learning target instance appearances and their affinities over pairs of video frames in an end-to-end fashion. We also adopt a centroid sampling module to enhance instance embedding extraction, alleviating a problem common to one-stage detectors in which a single instance is represented as two different instances. Finally, we assemble the PVIS dataset from several video instance segmentation datasets to fill the current lack of datasets dedicated to human video segmentation. Extensive simulations on this dataset show that the proposed method outperforms other SOTA methods in human video instance segmentation by approximately 3% in the DetA, AssA, and IDS measurements. Inference runs at 18.9 FPS, which is much more efficient than most other SOTA methods. This illustrates the efficiency and effectiveness of the proposed work.
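The data association step described above can be illustrated with a minimal sketch: instance appearance embeddings from consecutive frames are compared by cosine similarity and matched greedily one-to-one. All names (`associate`, the similarity threshold, the greedy matching rule) are hypothetical assumptions for illustration; the paper learns appearances and affinities end-to-end rather than using a fixed similarity measure.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def associate(prev_embs, curr_embs, threshold=0.5):
    """Greedy one-to-one matching across a pair of frames (hypothetical
    stand-in for the learned affinity): each current-frame instance takes
    the most similar unmatched previous-frame instance above the threshold;
    unmatched current instances would start new tracks."""
    pairs = sorted(
        ((cosine(p, c), i, j)
         for i, p in enumerate(prev_embs)
         for j, c in enumerate(curr_embs)),
        reverse=True,
    )
    used_prev, used_curr, matches = set(), set(), {}
    for score, i, j in pairs:
        if score < threshold:
            break
        if i in used_prev or j in used_curr:
            continue
        matches[j] = i  # current index -> previous index (track continued)
        used_prev.add(i)
        used_curr.add(j)
    return matches
```

For example, with two previous-frame embeddings `[[1, 0], [0, 1]]` and two current-frame embeddings `[[0, 1], [1, 0]]`, the matcher returns `{0: 1, 1: 0}`, continuing each track under its original identity.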
Published in: IEEE Transactions on Consumer Electronics ( Volume: 70, Issue: 1, February 2024)
Page(s): 2979 - 2988
Date of Publication: 26 September 2023

