
Block-Sparse RPCA for Salient Motion Detection

Abstract:

Recent evaluations [2], [13] of representative background subtraction techniques demonstrated that these methods still face considerable challenges. Challenges in realistic environments include illumination changes causing complex intensity variations, background motions (trees, waves, etc.) whose magnitude can be greater than that of the foreground, poor image quality under low light, camouflage, etc. Existing methods often handle only some of these challenges; we address all of them in a unified framework that makes few specific assumptions about the background. We regard the observed image sequence as the sum of a low-rank background matrix and a sparse outlier matrix, and solve the decomposition using the Robust Principal Component Analysis (RPCA) method. Our contribution lies in dynamically estimating the support of the foreground regions via a motion saliency estimation step, so as to impose spatial coherence on these regions. Unlike smoothness constraints such as MRFs, our method obtains crisply defined foreground regions and, in general, handles large dynamic background motion much better. Furthermore, we introduce an image alignment step to handle camera jitter. Extensive experiments on benchmark and additional challenging data sets demonstrate that our method works effectively on a wide range of complex scenarios, significantly outperforming many state-of-the-art approaches.
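
To illustrate the low-rank plus sparse decomposition the abstract refers to, here is a minimal sketch of generic Robust PCA via principal component pursuit, solved with an inexact augmented Lagrangian (alternating thresholding) scheme. This is not the paper's block-sparse variant with motion-saliency support estimation or image alignment; the function names, default weights, and stopping criterion are illustrative assumptions.

```python
import numpy as np

def soft_threshold(X, tau):
    """Element-wise shrinkage (proximal operator of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_threshold(X, tau):
    """Singular value thresholding (proximal operator of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt

def rpca_pcp(D, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Decompose D into low-rank L (background) + sparse S (foreground)
    by approximately minimizing ||L||_* + lam * ||S||_1  s.t.  L + S = D."""
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))            # standard PCP weight
    if mu is None:
        mu = 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                           # Lagrange multipliers
    norm_D = np.linalg.norm(D, 'fro') + 1e-12
    for _ in range(max_iter):
        L = svd_threshold(D - S + Y / mu, 1.0 / mu)    # low-rank update
        S = soft_threshold(D - L + Y / mu, lam / mu)   # sparse update
        residual = D - L - S
        Y = Y + mu * residual                          # dual ascent step
        if np.linalg.norm(residual, 'fro') / norm_D < tol:
            break
    return L, S

# Usage: stack each vectorized grayscale frame as a column of D
# (pixels x frames); L recovers the quasi-static background and the
# support of S indicates candidate foreground (moving) pixels.
```

In the paper's setting, the uniform l1 penalty on S is replaced by a block-sparse regularizer whose support is estimated from motion saliency, which is what yields spatially coherent, crisply defined foreground regions.
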
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: 36, Issue: 10, 01 October 2014)
Page(s): 1975 - 1987
Date of Publication: 01 April 2014

PubMed ID: 26352629
