
Analyzing Step-Size Approximation for Fixed-Point Implementation of LMS and BLMS Algorithms



Abstract:

In this work, we analyze the step-size approximation for fixed-point least-mean-square (LMS) and block LMS (BLMS) algorithms. Our primary focus is on how step-size approximation impacts the convergence rate and steady-state mean square error (MSE) across varying block sizes and filter lengths. We consider three different fixed-point quantized LMS and BLMS algorithms. The results demonstrate that the algorithm with two quantizers in single precision behaves approximately the same as the one-quantizer algorithm with quantized weights, regardless of block size and filter length. Subsequently, we explore the effects of nearest power-of-two approximations, and their combinations with different design parameters, on convergence performance. Simulation results within the context of a system identification problem under these approximations reveal intriguing insights. For instance, the single-quantizer algorithm without quantized error is more robust than its counterpart under these approximations. Additionally, both single-quantizer algorithms with combined power-of-two approximations match the behavior of the actual step size.
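
To make the nearest power-of-two step-size approximation concrete: rounding the step size to a power of two replaces the multiplication by the step size with a simple arithmetic shift in fixed-point hardware, which is the practical motivation for the approximation studied here. The following minimal floating-point sketch compares an exact step size against its nearest power-of-two counterpart in an LMS system-identification experiment; the function names and parameters (nearest_pow2, lms_identify, mu = 0.05, 8 taps) are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def nearest_pow2(mu):
    """Round the step size to the nearest power of two; in fixed point,
    multiplying by 2**(-k) reduces to a right shift by k bits."""
    return 2.0 ** round(np.log2(mu))

def lms_identify(x, d, n_taps, mu):
    """Plain LMS adaptive filter identifying an unknown FIR system.
    x: input signal, d: desired signal (plant output plus noise)."""
    w = np.zeros(n_taps)
    sq_err = []
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]   # regression (tap-input) vector
        e = d[n] - w @ u            # a priori estimation error
        w = w + mu * e * u          # LMS weight update
        sq_err.append(e ** 2)
    return w, np.array(sq_err)

# System-identification experiment (illustrative parameters)
rng = np.random.default_rng(0)
h = rng.standard_normal(8)                     # unknown plant
x = rng.standard_normal(20000)
d = np.convolve(x, h)[:len(x)] + 1e-3 * rng.standard_normal(len(x))

mu = 0.05
w_exact, mse_exact = lms_identify(x, d, 8, mu)
w_pow2, mse_pow2 = lms_identify(x, d, 8, nearest_pow2(mu))
print("exact mu:", mu, " nearest pow-2 mu:", nearest_pow2(mu))
print("steady-state MSE (exact): ", mse_exact[-2000:].mean())
print("steady-state MSE (pow-2): ", mse_pow2[-2000:].mean())
```

Comparing the two steady-state MSE estimates gives a rough feel for how much the power-of-two rounding of the step size perturbs convergence behavior; the paper's analysis additionally quantizes the weights and/or the error signal, which this floating-point sketch omits.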
Date of Conference: 31 October 2023 - 01 November 2023
Date Added to IEEE Xplore: 06 November 2023
Conference Location: Aalborg, Denmark

