On almost-sure bounds for the LMS algorithm

M.A. Kouritzin
1994 IEEE Transactions on Information Theory  
Abstract—Almost-sure (a.s.) bounds for linear, constant-gain, adaptive-filtering algorithms are investigated. For instance, under general pseudo-stationarity and dependence conditions on the driving data {ψ_k, k = 1, 2, 3, ...}, {Y_k, k = 0, 1, 2, ...}, a.s. convergence and rates of a.s. convergence (as the algorithm gain ε → 0) are established for the LMS algorithm

    h^ε_{k+1} = h^ε_k + ε Y_k (ψ_{k+1} − Y_k^T h^ε_k),

subject to some nonrandom initial condition h^ε_0 = h_0. In particular, defining {g^ε_k}_{k=0}^∞ by g^ε_0 = h_0 and

    g^ε_{k+1} = g^ε_k + ε (E[Y_k ψ_{k+1}] − E[Y_k Y_k^T] g^ε_k)   for k = 0, 1, 2, ...,

we show that for any T > 0, max_{0 ≤ k ≤ Tε^{−1}} |h^ε_k − g^ε_k| → 0 as ε → 0 a.s., and, under a stronger dependency condition, that for any 0 < ζ ≤ 1 and T > 0, max_{0 ≤ k ≤ Tε^{−ζ}} |h^ε_k − g^ε_k| converges to zero (as ε → 0) a.s. at a rate marginally slower than O((ε^{2−ζ} log log(ε^{−1}))^{1/2}). Then, under a stronger pseudo-stationarity assumption, it is shown that similar results hold if the sequences {g^ε_k}_{k=0}^∞, ε > 0, above are replaced with the solution g^0(·) of a nonrandom linear ordinary differential equation, i.e., we have max_{0 ≤ k ≤ ⌊Tε^{−ζ}⌋} |h^ε_k − g^0(εk)| → 0 as ε → 0 a.s., where a rate can be attached to this convergence under the stronger dependency condition. The almost-sure bounds contained in this paper complement previously developed weak-convergence results in Kushner and Shwartz [IEEE Trans. Inform. Theory, IT-30(2), pp. 177–182, 1984] and, as will be seen, are "near optimal". Moreover, the proofs used to establish these bounds are quite elementary.

Index Terms—Adaptive filtering, almost-sure bounds, method of averaging.

I. INTRODUCTION

Suppose {ψ_k, k = 1, 2, 3, ...} and {Y_k, k = 0, 1, 2, ...} are second-order stochastic processes defined on a common probability space, with each Y_k taking values in R^d. A basic problem of adaptive filtering is to find the best mean-square linear approximation to ψ_{k+1} in terms of the components of Y_k, i.e., to find a deterministic sequence {f_k}_{k=0}^∞ in R^d which minimizes

    E{ψ_{k+1} − Y_k^T f_k}^2   for k = 0, 1, 2, ...   (1.1)

If E[Y_k Y_k^T] is nonsingular for each k ≥ 0, then it is immediately apparent that {f_k}_{k=0}^∞ is uniquely defined by

    f_k ≜ (E[Y_k Y_k^T])^{−1} E[ψ_{k+1} Y_k]   for k = 0, 1, 2, ...   (1.2)

However, in practice {E[Y_k Y_k^T]}_{k=0}^∞ and {E[ψ_{k+1} Y_k]}_{k=0}^∞ often are not readily discernible, so (1.2) is of no direct use. Consequently, stochastic estimates of {f_k}_{k=0}^∞ generated by adaptive algorithms, with either decreasing or constant gain, must suffice. The linear, decreasing-gain algorithm

    h_{k+1} = h_k + μ_k Y_k (ψ_{k+1} − Y_k^T h_k),   (1.3)

where {μ_k}_{k=0}^∞ is a sequence of real numbers converging to zero as k → ∞ and {h_k(ω), k = 0, 1, 2, ...} forms our parameter estimates, is well suited for ascertaining the minimum of (1.1) when f_k is independent of k. This algorithm has been studied extensively in the almost-sure case by, e.g., Eweda and Macchi [6] and Heunis [11].
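The closed-form solution (1.2) is easy to sanity-check numerically when the data are stationary. The sketch below assumes a synthetic i.i.d. regression model ψ_{k+1} = Y_k^T f* + noise (the model, `f_star`, and all variable names are illustrative, not from the paper); it forms the sample versions of E[Y_k Y_k^T] and E[ψ_{k+1} Y_k] and solves (1.2):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 200_000
f_star = np.array([0.5, -1.0, 2.0])        # "true" filter of the synthetic model

# Stationary driving data: i.i.d. regressors Y_k and observation noise.
Y = rng.normal(size=(n, d))
psi = Y @ f_star + 0.1 * rng.normal(size=n)    # psi_{k+1} = Y_k^T f* + noise

# Sample versions of E[Y_k Y_k^T] and E[psi_{k+1} Y_k], then solve (1.2).
R = Y.T @ Y / n
p = Y.T @ psi / n
f_hat = np.linalg.solve(R, p)              # f_k = (E[Y_k Y_k^T])^{-1} E[psi_{k+1} Y_k]
print(f_hat)                               # close to f_star
```

Because this toy model is stationary, f_k ≡ f* and the empirical solve recovers it; the setting of interest in the paper is precisely the one where these expectations are unavailable or time-varying, so (1.2) cannot be used directly.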
On the other hand, it is well known that constant-gain adaptive-filtering algorithms provide better tracking of {f_k}_{k=0}^∞ when {f_k}_{k=0}^∞ fluctuates with time; however, few almost-sure results have been developed for the constant-gain version of (1.3). In this paper we examine the linear stochastic recursion arising from basic linear constant-gain adaptive-filtering algorithms such as the following LMS algorithm:

    h^ε_{k+1} = h^ε_k + ε Y_k (ψ_{k+1} − Y_k^T h^ε_k),   (1.4)
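The closeness of the stochastic iterates h^ε_k of (1.4) to the averaged deterministic iterates g^ε_k can be illustrated by simulation. This is a minimal sketch, assuming i.i.d. standard normal regressors (so E[Y_k Y_k^T] = I and the averaged recursion is explicit); the function name, model, and constants are invented for illustration and are not from the paper:

```python
import numpy as np

def lms_vs_averaged(eps, T=1.0, seed=1):
    """Run the constant-gain LMS iterates h_k of (1.4) alongside the averaged
    deterministic iterates g_{k+1} = g_k + eps*(E[Y_k psi_{k+1}] - E[Y_k Y_k^T] g_k)
    and return max_{0 <= k <= T/eps} |h_k - g_k|."""
    rng = np.random.default_rng(seed)
    f_star = np.array([1.0, -0.5])        # synthetic "true" filter
    R = np.eye(2)                         # E[Y_k Y_k^T] for i.i.d. standard normal Y_k
    p = R @ f_star                        # E[psi_{k+1} Y_k]
    h = np.zeros(2)
    g = np.zeros(2)
    gap = 0.0
    for _ in range(int(T / eps)):
        Yk = rng.normal(size=2)
        psi = Yk @ f_star + 0.1 * rng.normal()
        h = h + eps * Yk * (psi - Yk @ h)      # stochastic iterate, (1.4)
        g = g + eps * (p - R @ g)              # averaged iterate
        gap = max(gap, np.linalg.norm(h - g))
    return gap

# The a.s. results say the worst-case gap over the horizon T/eps vanishes
# as eps -> 0; averaging a few runs shows smaller gains giving smaller gaps.
gaps = [float(np.mean([lms_vs_averaged(e, seed=s) for s in range(10)]))
        for e in (0.1, 0.01, 0.001)]
print(gaps)
```

The observed decay with ε is consistent with the paper's rate statement (for ζ = 1, marginally slower than order (ε log log ε^{−1})^{1/2}), though a ten-run average is of course only a heuristic check, not a verification of the bound.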
doi:10.1109/18.312160