LMS-2: Towards an algorithm that is as cheap as LMS and almost as efficient as RLS


Abstract:

We consider linear prediction problems in a stochastic environment. The least mean square (LMS) algorithm is a well-known, easy-to-implement and computationally cheap solution to this problem. However, being a stochastic gradient descent rule, the LMS algorithm may converge slowly. The recursive least squares (RLS) algorithm overcomes this problem, but its computational cost is quadratic in the problem dimension. In this paper we propose a two-timescale stochastic approximation algorithm which, on its slower timescale, behaves the same way as the RLS algorithm, while remaining as cheap as the LMS algorithm. In addition, the algorithm is easy to implement. The algorithm is shown to give estimates that converge to the best possible estimate with probability one. The performance of the algorithm is tested on two examples, where it is found that it may indeed offer some performance gain over the LMS algorithm.
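For context, the sketch below illustrates only the two baselines the abstract contrasts, not the paper's LMS-2 rule: a standard LMS step with O(d) cost per sample and a standard RLS step with O(d^2) cost per sample for linear prediction y ≈ wᵀx. The variable names (w, x, mu, lam, P) are assumed notation, not taken from the paper.

```python
# Minimal sketch of the LMS and RLS baselines described in the abstract.
# This is NOT the two-timescale LMS-2 algorithm proposed in the paper.
import numpy as np

def lms_step(w, x, y, mu=0.01):
    """One LMS update: stochastic gradient descent on the squared error, O(d)."""
    e = y - w @ x          # prediction error
    return w + mu * e * x

def rls_step(w, P, x, y, lam=0.99):
    """One RLS update with forgetting factor lam, O(d^2) per sample."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = y - w @ x                    # a priori error
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam  # inverse-covariance update
    return w, P

# Usage: track a fixed weight vector from noisy observations.
rng = np.random.default_rng(0)
d = 5
w_true = rng.standard_normal(d)
w_lms = np.zeros(d)
w_rls, P = np.zeros(d), 100.0 * np.eye(d)
for _ in range(2000):
    x = rng.standard_normal(d)
    y = w_true @ x + 0.1 * rng.standard_normal()
    w_lms = lms_step(w_lms, x, y)
    w_rls, P = rls_step(w_rls, P, x, y)
print(np.linalg.norm(w_lms - w_true), np.linalg.norm(w_rls - w_true))
```

The trade-off the paper targets is visible here: LMS touches each coordinate once per step, while RLS maintains and updates a d-by-d matrix, which is what makes its per-step cost quadratic.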
Date of Conference: 15-18 December 2009
Date Added to IEEE Xplore: 29 January 2010
Conference Location: Shanghai, China
