Competitive Normalized Least-Squares Regression

Waqas Jamil, Abdelhamid Bouchachia
2020 IEEE Transactions on Neural Networks and Learning Systems  
Online learning has attracted increasing interest in recent years due to its low computational requirements and its relevance to a broad range of streaming applications. In this brief, we focus on online regularized regression. We propose a novel, efficient online regression algorithm, called online normalized least-squares (ONLS). We perform a theoretical analysis by comparing the total loss of ONLS against that of the normalized gradient descent (NGD) algorithm and the best off-line LS predictor. We show, in particular, that ONLS allows for a better bias-variance tradeoff than state-of-the-art gradient descent-based LS algorithms, as well as better control over the level of shrinkage of the features toward zero. Finally, we conduct an empirical study illustrating the strong performance of ONLS against several state-of-the-art algorithms on real-world data.
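For context, the NGD baseline the abstract compares against can be sketched as follows. This is a generic normalized gradient descent update for online squared-loss regression, where the step is scaled by the squared norm of the current feature vector; the learning rate `eta` and the loss accounting are illustrative assumptions, and this is not the paper's ONLS algorithm.

```python
# Sketch of normalized gradient descent (NGD) for online least-squares
# regression. At each round the gradient step is divided by ||x_t||^2,
# making the update invariant to the scale of the features.
# NOTE: eta and the zero-vector guard are assumptions for illustration;
# this is the NGD baseline, not the paper's ONLS algorithm.

def ngd_regression(stream, dim, eta=0.5):
    """Run NGD over an iterable of (x, y) pairs.

    Returns the final weight vector and the cumulative squared loss,
    i.e. the 'total loss' quantity the theoretical analysis bounds.
    """
    w = [0.0] * dim
    total_loss = 0.0
    for x, y in stream:
        pred = sum(wi * xi for wi, xi in zip(w, x))
        err = pred - y
        total_loss += err * err
        norm_sq = sum(xi * xi for xi in x) or 1.0  # guard zero feature vector
        step = eta * err / norm_sq                 # normalized step size
        w = [wi - step * xi for wi, xi in zip(w, x)]
    return w, total_loss
```

For example, streaming examples generated by the target weights (1, -2) repeatedly drives `w` toward that vector, while `total_loss` records the online regret-style cumulative error.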
doi:10.1109/tnnls.2020.3009777 pmid:32755868