An analysis of the exponentiated gradient descent algorithm

S.I. Hill, R.C. Williamson
ISSPA '99. Proceedings of the Fifth International Symposium on Signal Processing and its Applications (IEEE Cat. No.99EX359)  
This paper analyses three algorithms recently studied in the Computational Learning Theory community: the Gradient Descent (GD) Algorithm, the Exponentiated Gradient Algorithm with Positive and Negative weights (EG algorithm), and the Exponentiated Gradient Algorithm with Unnormalised Positive and Negative weights (EGU algorithm). The analysis is of the form used in the signal processing community and is in terms of the mean squared error (MSE). A relationship between the learning rate and the MSE of predictions is found for the family of algorithms. Trials involving simulated acoustic echo cancellation are conducted in which the learning rates of the algorithms are selected so that they converge to the same steady-state MSE. These trials demonstrate that, when the target is sparse, the EG algorithm typically converges more quickly than the GD and EGU algorithms, which themselves perform very similarly.
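
The abstract does not reproduce the update rules, so the following is a minimal sketch (not taken from the paper) contrasting the GD (LMS) update with the normalised EG± update in the standard Kivinen-Warmuth form on a sparse target. The filter length, learning rates, total-weight budget U, and target taps below are illustrative assumptions, not values from the paper's echo-cancellation trials.

```python
# Sketch: GD vs. EG+/- on an assumed sparse linear target (all constants are illustrative).
import numpy as np

rng = np.random.default_rng(0)

n = 50                               # filter length (assumed)
target = np.zeros(n)                 # sparse target: only a few non-zero taps
target[[3, 17, 40]] = [0.8, -0.5, 0.3]

eta_gd, eta_eg, U = 0.002, 0.05, 2.0  # assumed learning rates and total weight budget

# GD (LMS) weight vector
w_gd = np.zeros(n)
# EG+/- keeps separate positive and negative weights, initialised uniformly so they sum to U
w_pos = np.full(n, U / (2 * n))
w_neg = np.full(n, U / (2 * n))

for t in range(5000):
    x = rng.standard_normal(n)
    y = target @ x                   # noiseless linear target for simplicity

    # GD update: subtract the gradient of the squared error
    err_gd = w_gd @ x - y
    w_gd -= eta_gd * 2 * err_gd * x

    # EG+/- update: multiplicative step on each weight, then renormalise so the total stays U
    err_eg = (w_pos - w_neg) @ x - y
    r = np.exp(-2 * eta_eg * err_eg * x)
    w_pos, w_neg = w_pos * r, w_neg / r
    scale = U / (w_pos.sum() + w_neg.sum())
    w_pos *= scale
    w_neg *= scale

print("GD    weight-error MSE:", np.mean((w_gd - target) ** 2))
print("EG+/- weight-error MSE:", np.mean((w_pos - w_neg - target) ** 2))
```

Under these assumptions the EG± iterate tends to lock onto the few non-zero taps sooner than GD, which is the qualitative behaviour the abstract reports for sparse targets; the EGU variant would simply omit the renormalisation step.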
doi:10.1109/isspa.1999.818191 dblp:conf/isspa/HillW99