Communication-Efficient Online Federated Learning Framework for Nonlinear Regression

Vinay Chakravarthi Gogineni, Stefan Werner, Yih-Fang Huang, Anthony Kuh
2021, arXiv preprint
Federated learning (FL) literature typically assumes that each client has a fixed amount of data, which is unrealistic in many practical applications. Some recent works introduced a framework for online FL (Online-Fed) wherein clients perform model learning on streaming data and communicate the model to the server; however, they do not address the associated communication overhead. As a solution, this paper presents a partial-sharing-based online federated learning framework (PSO-Fed) that enables clients to update their local models using continuous streaming data and to share only portions of those updated models with the server. During a global iteration of PSO-Fed, non-participating clients can still update their local models with new data. Here, we consider a global task of kernel regression, where clients use a random Fourier features-based kernel LMS on their data for local learning. We examine the mean convergence of PSO-Fed for kernel regression. Experimental results show that PSO-Fed achieves competitive performance at a significantly lower communication overhead than Online-Fed.
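To make the abstract's two ingredients concrete, below is a minimal sketch of a random Fourier features (RFF)-based kernel LMS client update combined with a PSO-Fed-style partial model exchange. All parameters here (feature count D, shared-entry count M, step size mu, the participation rule, and the synthetic target) are illustrative assumptions for the sketch, not the paper's exact protocol or settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Random Fourier features approximating a Gaussian kernel (assumed setup) ---
D, d_in, sigma = 200, 5, 1.0                 # feature count, input dim, bandwidth
W = rng.normal(0.0, 1.0 / sigma, size=(D, d_in))  # spectral samples
b = rng.uniform(0.0, 2.0 * np.pi, size=D)         # random phases

def rff(x):
    """Map input x to D random Fourier features, z(x) = sqrt(2/D) cos(Wx + b)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

def klms_step(w, x, d, mu=0.1):
    """One kernel-LMS update in RFF space: w <- w + mu * (d - w^T z) * z."""
    z = rff(x)
    return w + mu * (d - w @ z) * z

# --- PSO-Fed-style loop (illustrative; selection rule is an assumption) ---
K, M = 10, 20                                # clients; model entries shared per round
global_w = np.zeros(D)
local_w = np.tile(global_w, (K, 1))

for t in range(100):                         # global iterations
    participants = set(rng.choice(K, size=K // 2, replace=False).tolist())
    shared_sum, shared_cnt = np.zeros(D), np.zeros(D)
    for k in range(K):
        # Every client, participant or not, learns from its new streaming sample.
        x = rng.normal(size=d_in)
        d = np.sin(x.sum())                  # synthetic nonlinear target (assumed)
        local_w[k] = klms_step(local_w[k], x, d)
        if k in participants:
            # Partial sharing: transmit only M randomly selected model entries.
            idx = rng.choice(D, size=M, replace=False)
            shared_sum[idx] += local_w[k, idx]
            shared_cnt[idx] += 1
    # Server averages whatever entries it received, then sends them back.
    received = shared_cnt > 0
    global_w[received] = shared_sum[received] / shared_cnt[received]
    for k in participants:
        local_w[k][received] = global_w[received]
```

Per round, each participating client uploads M of D coefficients instead of the full model, which is the source of the communication savings the abstract claims; non-participants still run the local kernel-LMS step on fresh data, matching the behavior described above.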
arXiv:2110.06556v1