IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Online ISSN : 1745-1337
Print ISSN : 0916-8508
Regular Section
A Fast Stochastic Gradient Algorithm: Maximal Use of Sparsification Benefits under Computational Constraints
Masahiro YUKAWA, Wolfgang UTSCHICK

2010 Volume E93.A Issue 2 Pages 467-475

Abstract

In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and maximize the benefits of the sparsification under computational constraints. To this end, we formulate the task of algorithm design as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints exploit the fact that the energy of the sparsified error vector concentrates in its first few components. Numerical examples demonstrate that the proposed algorithm converges as fast as the computationally expensive method based on the optimization without the computational constraints.
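
To illustrate the general idea described in the abstract, the following is a minimal, hypothetical sketch of a stochastic-gradient adaptive filter in which an error vector is formed over a block of recent regressors and only its first r components (where the sparsified error's energy is assumed to concentrate) drive the update. All names and parameters (sparsified_sg_filter, N, K, r, mu, eps) are illustrative assumptions; the paper's actual sparsifying transform and constrained closed-form solution are not reproduced here.

import numpy as np

def sparsified_sg_filter(x, d, N=16, K=8, r=2, mu=0.5, eps=1e-6):
    """Illustrative sketch (not the paper's algorithm).

    x : input signal, d : desired signal, N : filter length,
    K : error-vector length, r : retained error components,
    mu : step size, eps : regularization constant.
    """
    h = np.zeros(N)                       # adaptive filter coefficients
    y = np.zeros(len(d))                  # filter output
    for n in range(N + K - 1, len(d)):
        # Regressor matrix: K most recent length-N input windows (K x N)
        U = np.array([x[n - k - N + 1:n - k + 1][::-1] for k in range(K)])
        e = d[n - np.arange(K)] - U @ h   # error vector over the block
        # Keep only the first r components of the error vector
        e_sparse = np.zeros(K)
        e_sparse[:r] = e[:r]
        # Normalized stochastic-gradient step along the retained directions
        g = U.T @ e_sparse
        h += mu * g / (np.linalg.norm(U[:r]) ** 2 + eps)
        y[n] = U[0] @ h
    return h, y

Restricting the update to the first r components is what keeps the per-iteration cost low in this sketch; in the paper this trade-off is instead resolved by the closed-form solution of the constrained optimization problem.
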

© 2010 The Institute of Electronics, Information and Communication Engineers