THE LMS, PNLMS, AND EXPONENTIATED GRADIENT ALGORITHMS (WedAmOR5)
Author(s):
Jacob Benesty (Universite du Quebec, INRS-EMT, Canada)
Yiteng Huang (Bell Labs, Lucent Technologies, USA)
Abstract: Sparse impulse responses are encountered in many applications (network and acoustic echo cancellation, feedback cancellation in hearing aids, etc.). Recently, a class of exponentiated gradient (EG) algorithms has been proposed. One of the algorithms belonging to this class, the so-called EG$\pm$ algorithm, converges and tracks much better than the classical stochastic gradient, or LMS, algorithm for sparse impulse responses. In this paper, we show how to derive the different algorithms. We analyze the EG$\pm$ algorithm and explain when it can be expected to behave like the LMS algorithm. It is also shown that the proportionate normalized LMS (PNLMS) algorithm, proposed by Duttweiler in the context of network echo cancellation, is an approximation of the EG$\pm$ algorithm.
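To make the comparison between the classical LMS update and Duttweiler's proportionate variant concrete, here is a minimal sketch (not the authors' code) of both adaptive filters run on a synthetic sparse impulse response. The filter length, step sizes mu, regularization delta, and the PNLMS parameters rho and delta_p are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 64                                      # adaptive filter length (assumed)
h = np.zeros(L); h[7] = 1.0; h[20] = -0.5   # sparse "echo path" used for illustration
x = rng.standard_normal(5000)               # white excitation signal
d = np.convolve(x, h)[:len(x)]              # desired (echo) signal, noise-free

def lms(x, d, L, mu=0.01):
    """Classical stochastic-gradient (LMS) filter."""
    w = np.zeros(L)
    for n in range(L, len(x)):
        xn = x[n - L + 1:n + 1][::-1]       # current input vector [x(n),...,x(n-L+1)]
        e = d[n] - w @ xn                   # a priori error
        w = w + mu * e * xn                 # LMS coefficient update
    return w

def pnlms(x, d, L, mu=0.2, rho=0.01, delta_p=0.01, delta=1e-2):
    """Proportionate NLMS: step sizes proportional to coefficient magnitudes."""
    w = np.zeros(L)
    for n in range(L, len(x)):
        xn = x[n - L + 1:n + 1][::-1]
        e = d[n] - w @ xn
        # proportionate gain control in the spirit of Duttweiler's PNLMS
        gamma = np.maximum(rho * max(delta_p, np.max(np.abs(w))), np.abs(w))
        g = gamma / np.mean(gamma)          # normalized per-coefficient gains
        w = w + mu * g * xn * e / (xn @ (g * xn) + delta)
    return w

print("LMS misalignment:  ", np.linalg.norm(h - lms(x, d, L)) / np.linalg.norm(h))
print("PNLMS misalignment:", np.linalg.norm(h - pnlms(x, d, L)) / np.linalg.norm(h))
```

On such a sparse path, the proportionate gains concentrate adaptation energy on the few active coefficients, which is the behavior the paper relates to the EG$\pm$ algorithm.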
