
A comparative study of two popular families of sparsity-aware adaptive filters



Abstract:

In this paper, we review two families of sparsity-aware adaptive filters. Proportionate-type NLMS filters try to accelerate convergence by assigning each filter weight a different gain that depends on its current value. Sparsity-norm regularized filters penalize the cost function minimized by the filter with sparsity-promoting norms (such as ℓ0 or ℓ1) and derive new stochastic gradient descent rules from the regularized cost function. We compare both families in terms of computational complexity and of how well they handle the tradeoff between convergence speed and steady-state error. We conclude that sparsity-norm regularized filters are computationally less expensive and can achieve a better tradeoff, making them in principle more attractive. However, selecting the strength of the regularization term appears critical to the good performance of these filters.
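
The two update rules the abstract contrasts can be sketched in a few lines of NumPy. The following is a minimal illustration, not code from the paper: a PNLMS-style proportionate step, and a zero-attracting LMS step as one common ℓ1-regularized variant. All parameter names and values (mu, rho, delta_p, lam) are illustrative assumptions.

import numpy as np

def pnlms_update(w, x, d, mu=0.5, rho=0.01, delta_p=0.01, eps=1e-6):
    # Instantaneous error of the current estimate.
    e = d - w @ x
    # Proportionate gains: taps with large magnitude adapt faster;
    # rho and delta_p keep inactive taps from freezing entirely.
    gamma = np.maximum(rho * max(delta_p, np.abs(w).max()), np.abs(w))
    g = gamma / gamma.sum()
    # Gain-weighted, normalized LMS step.
    w = w + mu * e * g * x / (x @ (g * x) + eps)
    return w, e

def za_lms_update(w, x, d, mu=0.01, lam=1e-4):
    # Stochastic gradient step on the l1-regularized cost
    #   J = 0.5 * e**2 + lam * ||w||_1,
    # whose subgradient adds the "zero attractor" -mu*lam*sign(w).
    e = d - w @ x
    w = w + mu * e * x - mu * lam * np.sign(w)
    return w, e

A toy sparse system-identification run (again, an assumed setup) shows how both rules would be driven by the same data:

rng = np.random.default_rng(0)
L = 64
h = np.zeros(L)
h[[3, 20, 45]] = [1.0, -0.5, 0.3]                  # sparse "unknown" system
w_p, w_z, x_buf = np.zeros(L), np.zeros(L), np.zeros(L)
for n in range(5000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = h @ x_buf + 0.01 * rng.standard_normal()   # noisy system output
    w_p, _ = pnlms_update(w_p, x_buf, d)
    w_z, _ = za_lms_update(w_z, x_buf, d)
print("PNLMS misalignment: ", np.sum((w_p - h) ** 2))
print("ZA-LMS misalignment:", np.sum((w_z - h) ** 2))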
Date of Conference: 26-28 May 2014
Date Added to IEEE Xplore: 26 June 2014
Electronic ISBN: 978-1-4799-3696-0

Conference Location: Copenhagen, Denmark
