The Binormalized Data-Reusing LMS Algorithm with Optimized Step-Size Sequence
A new algorithm, the binormalized data-reusing least mean-squares (LMS) algorithm, is presented. The new algorithm has been found to converge faster than other LMS-like algorithms, such as the normalized LMS algorithm and several data-reusing LMS algorithms, when the input signal is strongly correlated. Its computational complexity is only slightly higher than that of a recently proposed normalized new data-reusing LMS algorithm. The superior convergence speed comes, however, at the cost of higher misadjustment when the step size is close to the value that yields the fastest convergence. An optimal step-size sequence for the algorithm is therefore proposed under a number of simplifying assumptions. Moreover, this work offers insight into how to balance the conflicting requirements of fast convergence and minimum steady-state mean-square error (MSE).
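To make the idea concrete, the sketch below implements a binormalized data-reusing update in the form commonly described in the literature: each iteration reuses the current and previous input/desired pairs and moves the weight vector toward the intersection of the two hyperplanes they define. This is an illustrative reconstruction under that assumption, not code from the paper; the filter order, step size, and the AR(1) test signal are choices made here for demonstration.

```python
import numpy as np

def bndr_lms(x, d, order, mu=1.0):
    """Sketch of a binormalized data-reusing LMS filter.

    Each update reuses the current pair (u, d[k]) and the previous pair
    (prev_u, prev_d), projecting w toward the intersection of the two
    hyperplanes w^T u = d[k] and w^T prev_u = prev_d, scaled by mu.
    """
    w = np.zeros(order)
    prev_u = np.zeros(order)          # previous regressor
    prev_d = 0.0                      # previous desired sample
    sq_err = []
    for k in range(order, len(x)):
        u = x[k - order + 1:k + 1][::-1]      # current regressor
        e1 = d[k] - w @ u                     # a priori error, current pair
        e2 = prev_d - w @ prev_u              # a priori error, previous pair
        n1, n2, c = u @ u, prev_u @ prev_u, u @ prev_u
        den = n1 * n2 - c * c
        if den > 1e-6 * n1 * n2:              # pairs not (nearly) collinear
            a = (n2 * e1 - c * e2) / den
            b = (n1 * e2 - c * e1) / den
            w = w + mu * (a * u + b * prev_u)
        else:                                 # fall back to a normalized LMS step
            w = w + mu * e1 * u / (n1 + 1e-12)
        prev_u, prev_d = u, d[k]
        sq_err.append(e1 ** 2)
    return w, np.array(sq_err)

# System identification with a strongly correlated AR(1) input,
# the scenario in which the abstract reports the fastest convergence.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])           # unknown FIR system (example)
white = rng.standard_normal(2000)
xin = np.zeros(2000)
for k in range(1, 2000):
    xin[k] = 0.95 * xin[k - 1] + white[k]     # correlated input
dout = np.convolve(xin, h)[:2000]             # noiseless desired signal
w_hat, err = bndr_lms(xin, dout, order=4, mu=1.0)
```

With mu = 1.0 the update is the full projection onto both hyperplanes (fastest convergence); a smaller or decaying mu trades speed for lower steady-state MSE, which is the conflict the proposed step-size sequence addresses.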