Federated Learning of a Mixture of Global and Local Models
[article]
2021
arXiv
pre-print
We propose a new optimization formulation for training federated learning models. ...
Notably, our methods are similar but not identical to federated averaging / local SGD, thus shedding some light on the role of local steps in federated learning. ...
Instead of learning a single global model by solving (1), we propose to learn a mixture of the global model and the purely local models, which can be trained by each device i using its data D_i only (see the sketch after this entry). ...
arXiv:2002.05516v3
fatcat:mijaykh7rje55b5x2nr2cwxdxu
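For reference, the mixture formulation sketched in this abstract (arXiv:2002.05516) can be written roughly as follows, where f_i is the local loss of device i and lambda trades off the purely local models against their average; the exact constants and notation follow the paper, this is only a sketch:

    % hedged sketch of the global/local mixture objective
    \min_{x_1,\dots,x_n \in \mathbb{R}^d} \;
      \frac{1}{n}\sum_{i=1}^{n} f_i(x_i)
      \;+\; \frac{\lambda}{2n}\sum_{i=1}^{n} \bigl\| x_i - \bar{x} \bigr\|^2,
    \qquad
    \bar{x} := \frac{1}{n}\sum_{i=1}^{n} x_i .

Setting lambda = 0 recovers purely local training, while letting lambda grow forces all x_i toward the common average, recovering a single global model.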
Specialized federated learning using a mixture of experts
[article]
2021
arXiv
pre-print
To achieve this personalization, we propose a federated learning framework using a mixture of experts to combine the specialist nature of a locally trained model with the generalist knowledge of a global ...
In federated learning, clients share a global model that has been trained on decentralized local client data. ...
Figure 1. Federated mixtures of experts, consisting of a global model f_g and local specialist models f_s^k using local gating functions h_k (see the sketch after this entry). ...
arXiv:2010.02056v3
fatcat:6bku362pl5fjjf62mhgp5hajou
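A minimal sketch of the gated combination this entry describes (arXiv:2010.02056): each client k blends a local specialist f_s^k with the shared global model f_g through a learned gating function h_k. Class and variable names below are illustrative, not the paper's code.

    import torch
    import torch.nn as nn

    class ClientMixture(nn.Module):
        """Per-client mixture of a shared global model and a local specialist.

        The gate h_k(x) in [0, 1] decides, per example, how much to trust the
        local specialist f_s^k versus the global model f_g (sketch only).
        """

        def __init__(self, global_model: nn.Module, local_model: nn.Module, in_dim: int):
            super().__init__()
            self.global_model = global_model   # f_g, trained federatedly
            self.local_model = local_model     # f_s^k, trained on client k's data only
            self.gate = nn.Sequential(nn.Linear(in_dim, 1), nn.Sigmoid())  # h_k

        def forward(self, x):
            g = self.gate(x)  # per-example mixing weight in [0, 1]
            return g * self.local_model(x) + (1.0 - g) * self.global_model(x)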
Subspace Learning for Personalized Federated Optimization
[article]
2021
arXiv
pre-print
i.e. global model and local model). ...
Training federated learning systems usually focuses on optimizing a global model that is identically deployed to all client devices. ...
between a local and a global model (see the interpolation sketch after this entry). ...
arXiv:2109.07628v1
fatcat:aogce7pjavd33od6ka4ktc4h2y
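The fragments above describe personalization as lying between a local and a global model; one simple way to realize that idea (hedged: the paper's actual subspace construction may differ) is to interpolate the two sets of weights per client:

    from typing import Dict
    import torch

    def interpolate_models(global_state: Dict[str, torch.Tensor],
                           local_state: Dict[str, torch.Tensor],
                           alpha: float) -> Dict[str, torch.Tensor]:
        """Weights on the segment between a global and a local model.

        alpha = 0.0 gives the shared global model, alpha = 1.0 the purely
        local one; intermediate values give a personalized blend. Illustration
        only, not the paper's exact subspace-learning procedure.
        """
        return {name: (1.0 - alpha) * global_state[name] + alpha * local_state[name]
                for name in global_state}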
Certified Robustness in Federated Learning
[article]
2022
arXiv
pre-print
Finally, we explore the robustness of mixtures of global and local (i.e., personalized) models, and find that the robustness of local models degrades as they diverge from the global model ...
We show several advantages of personalization over both (that is, training only on local data and federated training) in building more robust models with faster training. ...
Lastly, we analyze a version of the recently proposed mixture between global and personalized models from federated learning [19, 20] through a certified robustness lens. ...
arXiv:2206.02535v1
fatcat:3tshzwpowfcypifr3rvvq2wklm
Survey of Personalization Techniques for Federated Learning
[article]
2020
arXiv
pre-print
The standard formulation of federated learning produces one shared model for all clients. ...
Federated learning enables machine learning models to learn from private decentralized data without compromising privacy. ...
Instead of learning a single global model, each device learns a mixture of the global model and its own local model. ...
arXiv:2003.08673v1
fatcat:6uqguovedvfghh4v6dwkvtyt6q
Separate but Together: Unsupervised Federated Learning for Speech Enhancement from Non-IID Data
[article]
2021
arXiv
pre-print
Each client trains their model in isolation using mixture invariant training while periodically providing updates to a central server. ...
We simulate a real-world scenario where each client only has access to a few noisy recordings from a limited and disjoint number of speakers (hence non-IID). ...
We analyze the convergence of the global model learned in a federated setup and compare it against training on each of the private datasets of the C individual nodes (see the sketch after this entry). ...
arXiv:2105.04727v3
fatcat:o5p4ig6fljaotacm66c5gmjdb4
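A minimal sketch of the training loop this abstract describes: each of the C clients trains locally in isolation (a placeholder local_loss stands in for mixture invariant training) and the server periodically averages the client models. All names here are illustrative, not the authors' code.

    import copy
    from typing import Callable, List
    import torch

    def federated_round(global_model: torch.nn.Module,
                        client_loaders: List[torch.utils.data.DataLoader],
                        local_loss: Callable,          # placeholder for the client objective
                        local_steps: int,
                        lr: float = 1e-3) -> torch.nn.Module:
        """One communication round: isolated local training, then server-side averaging."""
        client_states = []
        for loader in client_loaders:
            model = copy.deepcopy(global_model)        # each client starts from the global model
            opt = torch.optim.Adam(model.parameters(), lr=lr)
            for step, batch in enumerate(loader):      # local (e.g. mixture-invariant) training
                loss = local_loss(model, batch)
                opt.zero_grad(); loss.backward(); opt.step()
                if step + 1 >= local_steps:
                    break
            client_states.append(model.state_dict())
        # Server: average the client parameters into the new global model.
        avg = {k: sum(s[k] for s in client_states) / len(client_states)
               for k in client_states[0]}
        global_model.load_state_dict(avg)
        return global_model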
Federated Learning: A Signal Processing Perspective
[article]
2021
arXiv
pre-print
Federated learning is an emerging machine learning paradigm for training models across multiple edge devices holding local datasets, without explicitly exchanging the data. ...
We present a formulation for the federated learning paradigm from a signal processing perspective, and survey a set of candidate approaches for tackling its unique challenges. ...
set of global models, i.e., a mixture of models (also referred to as model interpolation [61]). ...
arXiv:2103.17150v2
fatcat:pktgiqowsjbklfnj753ehdbnhu
Collaborative Prognostics for Machine Fleets Using a Novel Federated Baseline Learner
2021
Proceedings of the Annual Conference of the Prognostics and Health Management Society, PHM
To demonstrate the ability of federated learning to enhance the robustness and reliability of PHM models, this paper proposes a novel federated Gaussian Mixture Model (GMM) algorithm to build universal ...
Federated collaborative learning can serve as a catalyst for the adaptation of business models based on the servitization of assets in the era of Industry 4.0. ...
Federated training of the global baseline GMM on the global server is performed by exchanging model parameters with each client, using the local baseline data as described in the proposed algorithm (see the sketch after this entry) ...
doi:10.36001/phmconf.2021.v13i1.2989
fatcat:yotbx3xk2jff3iak5dpvxfn32a
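As a rough illustration of the parameter exchange described above (not the paper's actual federated GMM algorithm, which must handle component alignment and client weighting), a server could combine locally fitted mixtures by naively averaging their parameters component-wise:

    import numpy as np

    def average_gmm_parameters(local_gmms):
        """Naively average weights, means and covariances of client GMMs
        (e.g. fitted with sklearn.mixture.GaussianMixture).

        Assumes every client fits the same number of components and that
        component j refers to the same mode on every client -- a strong
        assumption made here only to keep the sketch short.
        """
        return {
            "weights": np.mean([g.weights_ for g in local_gmms], axis=0),
            "means": np.mean([g.means_ for g in local_gmms], axis=0),
            "covariances": np.mean([g.covariances_ for g in local_gmms], axis=0),
        }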
Federated Learning for Open Banking
[article]
2021
arXiv
pre-print
The most attractive aspect of federated learning is its ability to decompose model training into a centralized server and distributed nodes without collecting private data. ...
This is a just-in-time technology that can learn intelligent models in a decentralized manner. ...
[9] proposes a mixture model of global and local models in a federated setting. The local model can preserve the personalized information and the global model provides common information. ...
arXiv:2108.10749v1
fatcat:qbn5q6poxbdyfnktnt6svavbxm
Model Fusion with Kullback–Leibler Divergence
[article]
2020
arXiv
pre-print
Our algorithm is easy to describe and implement, efficient, and competitive with state-of-the-art on motion capture analysis, topic modeling, and federated learning of Bayesian neural networks. ...
The components of the dataset posteriors are assigned to the proposed global model components by solving a regularized variant of the assignment problem (see the sketch after this entry). ...
FA9550-19-1-031, of National Science Foundation grant IIS-1838071, from the MIT-IBM Watson AI Laboratory, from the Toyota-CSAIL Joint Research Center, from a gift from Adobe Systems, and from the Skoltech-MIT ...
arXiv:2007.06168v1
fatcat:krycdvisjbfhrncolbckv5ggzi
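The assignment step mentioned above can be illustrated (without the regularization the paper adds) by matching local mixture components to global components with the Hungarian algorithm on a KL-divergence cost matrix; a simplified sketch, not the paper's fusion algorithm:

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def gaussian_kl(mu0, cov0, mu1, cov1):
        """KL( N(mu0, cov0) || N(mu1, cov1) ) for full-covariance Gaussians."""
        d = mu0.shape[0]
        cov1_inv = np.linalg.inv(cov1)
        diff = mu1 - mu0
        return 0.5 * (np.trace(cov1_inv @ cov0) + diff @ cov1_inv @ diff - d
                      + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

    def match_components(local_means, local_covs, global_means, global_covs):
        """Assign each local component to a global component (unregularized sketch)."""
        cost = np.array([[gaussian_kl(lm, lc, gm, gc)
                          for gm, gc in zip(global_means, global_covs)]
                         for lm, lc in zip(local_means, local_covs)])
        rows, cols = linear_sum_assignment(cost)   # Hungarian algorithm
        return dict(zip(rows, cols))               # local index -> global index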
Communication-Efficient Federated Learning over MIMO Multiple Access Channels
[article]
2022
arXiv
pre-print
We then analyze the reconstruction error of the proposed algorithm and its impact on the convergence rate of federated learning. ...
Communication efficiency is of importance for wireless federated learning systems. ...
In federated learning, each wireless device updates a local model based on its local training data set and then sends the local model update (e.g., a gradient vector of the local model) to the parameter server (PS); see the sketch after this entry. ...
arXiv:2206.05723v1
fatcat:pxvgotp6jvgodaders3grxv7vm
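To make the communication-cost point concrete, a generic way to shrink the local model update before sending it to the parameter server is top-k sparsification, sketched below; the paper itself studies analog transmission over MIMO multiple access channels rather than this digital compressor.

    import torch

    def top_k_sparsify(update: torch.Tensor, k: int):
        """Keep only the k largest-magnitude entries of a flattened model update.

        A client would transmit (indices, values) instead of the dense update,
        cutting communication from d numbers to roughly 2k. Generic illustration
        only, not the over-the-air scheme of arXiv:2206.05723.
        """
        flat = update.flatten()
        k = min(k, flat.numel())
        _, indices = torch.topk(flat.abs(), k)
        return indices, flat[indices]

    def densify(indices: torch.Tensor, values: torch.Tensor, numel: int) -> torch.Tensor:
        """Server-side reconstruction of the sparse update into a dense vector."""
        dense = torch.zeros(numel)
        dense[indices] = values
        return dense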
Delta-DAGMM: A Free Rider Attack Detection Model in Horizontal Federated Learning
2022
Security and Communication Networks
Federated learning is a machine learning framework proposed in recent years. In horizontal federated learning, multiple participants cooperate to train and obtain a common final model. ...
Participants only need to transmit the locally updated model instead of their local datasets. ...
Yu Haining for his guidance of this paper and the National Natural Science Foundation of China (62172123) for its support. ...
doi:10.1155/2022/8928790
fatcat:s5veaktnfndrjksvqgazisyzzy
Decentralised Person Re-Identification with Selective Knowledge Aggregation
[article]
2021
arXiv
pre-print
the global model aggregation in federated Re-ID learning. ...
To resolve this problem, two recent works have introduced decentralised (federated) Re-ID learning for constructing a globally generalised model (server) without any direct access to local training data ...
Federated Learning aims to learn a global model with the collaboration of multiple local models. ...
arXiv:2110.11384v1
fatcat:ihe4purgebd5rf5sp56hrjymfa
Resource-Constrained Federated Learning with Heterogeneous Labels and Models
[article]
2020
arXiv
pre-print
Particularly, on-device federated learning is an active area of research; however, there are a variety of challenges in addressing statistical (non-IID data) and model heterogeneities. ...
framework by experimenting with federated learning and inference across different iterations on a Raspberry Pi 2, a single-board computing platform. ...
Mohri et al. propose Agnostic Federated Learning [18], which addresses handling any target data distribution formed by a mixture of client distributions (see the formulation after this entry). ...
arXiv:2011.03206v1
fatcat:yk4hxhyd7vahfb7qvfi44id2ji
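For context, the Agnostic Federated Learning objective of Mohri et al. [18] cited above optimizes the model for the worst-case mixture of the p client distributions; schematically (notation simplified from the paper):

    \min_{h \in \mathcal{H}} \; \max_{\lambda \in \Lambda} \;
      \mathcal{L}_{\mathcal{D}_\lambda}(h),
    \qquad
    \mathcal{D}_\lambda = \sum_{k=1}^{p} \lambda_k \, \mathcal{D}_k,

where D_k is the data distribution of client k and Lambda is the set of admissible mixture weights.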
No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data
[article]
2021
arXiv
pre-print
A central challenge in training classification models in the real-world federated system is learning with non-IID data. ...
from an approximated Gaussian mixture model. ...
Acknowledgement: We would like to thank the anonymous reviewers for their insightful comments and suggestions. ...
arXiv:2106.05001v2
fatcat:fmbnjqpmdbc6dj3mlc6usg3nhe
Showing results 1 — 15 out of 29,333 results