Local Adaptivity in Federated Learning: Convergence and Consistency
[article]
2021
arXiv
pre-print
We show in both theory and practice that while local adaptive methods can accelerate convergence, they can cause a non-vanishing solution bias, where the final converged solution may be different from ...
Extensive experiments on realistic federated training tasks show that the proposed algorithms can achieve faster convergence and higher test accuracy than the baselines without local adaptivity. ...
This research was generously supported in part by NSF grants CCF-1850029 and CCF-2045694, and a Google Cloud Platform (GCP) credit grant. ...
arXiv:2106.02305v1
fatcat:hph4vle6ibeanksuvwgovwhl4i
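The snippet above points to a tension between local adaptive optimizers and consistency of the converged solution. Below is a minimal NumPy sketch of that setting, with hypothetical quadratic client objectives and illustrative hyperparameters: each client runs a few local Adam steps and the server simply averages the results, which is the structure in which such a solution bias can arise.

```python
import numpy as np

# Minimal sketch of FedAvg with *local Adam* steps on each client. The quadratic
# client objectives and all hyperparameters are hypothetical; the point is only to
# show the structure (adaptive local updates + plain averaging) in which the
# paper's non-vanishing solution bias can arise.

def local_adam(w, grad_fn, steps=5, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m, v = np.zeros_like(w), np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# Two clients with different (non-IID) quadratic optima.
optima = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
grads = [lambda w, c=c: w - c for c in optima]      # gradient of 0.5*||w - c||^2

w_global = np.zeros(2)
for _ in range(50):
    local_models = [local_adam(w_global, g) for g in grads]
    w_global = np.mean(local_models, axis=0)        # plain FedAvg aggregation

print("aggregated solution:", w_global)
print("average of client optima:", np.mean(optima, axis=0))
```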
Improving Federated Learning Personalization via Model Agnostic Meta Learning
[article]
2019
arXiv
pre-print
In this work, we point out that the setting of Model Agnostic Meta Learning (MAML), where one optimizes for a fast, gradient-based, few-shot adaptation to a heterogeneous distribution of tasks, has a number ...
We present FL as a natural source of practical applications for MAML algorithms, and make the following observations. 1) The popular FL algorithm, Federated Averaging, can be interpreted as a meta learning ...
Meta Learning optimizes the performance after adaptation given few-shot adaptation examples on heterogeneous tasks, and has increasing applications in the context of Supervised Learning and Reinforcement ...
arXiv:1909.12488v1
fatcat:ipwyst2j2jexdnaiip5x7oqsbq
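To make the meta-learning reading of Federated Averaging concrete, here is a small NumPy sketch of the Reptile-style view suggested by the snippet: each task (client) adapts with a few local SGD steps, and the outer update moves the shared initialization toward the average of the adapted models. With an outer step size of 1.0 this coincides with the FedAvg server update; the tasks and hyperparameters are illustrative, not taken from the paper.

```python
import numpy as np

# Reptile-style view of Federated Averaging: each client/task adapts with a few
# local SGD steps, and the outer update moves the shared initialization toward the
# average of the adapted models. outer_lr = 1.0 recovers the FedAvg server update.

def local_sgd(w, grad_fn, steps=5, lr=0.1):
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

task_optima = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([2.0, 2.0])]
grads = [lambda w, c=c: w - c for c in task_optima]

w = np.zeros(2)
outer_lr = 1.0                        # < 1.0 gives the usual Reptile meta-update
for _ in range(100):
    adapted = [local_sgd(w, g) for g in grads]
    w = w + outer_lr * (np.mean(adapted, axis=0) - w)

print("meta-initialization:", w)
```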
Multi-Center Federated Learning
[article]
2021
arXiv
pre-print
Unlike distributed machine learning, federated learning aims to tackle non-IID data from heterogeneous sources in various real-world applications, such as those on smartphones. ...
Federated learning has received great attention for its capability to train a large-scale model in a decentralized manner without needing to access user data directly. ...
FedDist: We adapt a distance-based objective function from Reptile meta-learning [24] to the federated setting. ...
arXiv:2005.01026v2
fatcat:ineaiwlty5hebigb4bh3l6of7e
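The FedDist fragment above mentions adapting a distance-based objective to a setting with multiple centers. The following sketch shows one generic clustered-aggregation step consistent with that idea, assuming L2 distance between parameter vectors and a hypothetical set of client models and centers; it is not the paper's exact algorithm.

```python
import numpy as np

# Generic multi-center aggregation step: each client model is assigned to its
# nearest global center (L2 distance in parameter space), and each center is then
# updated as the mean of the client models assigned to it.

def assign_and_update(client_models, centers):
    client_models = np.asarray(client_models)
    centers = np.asarray(centers)
    # distance of every client model to every center
    d = np.linalg.norm(client_models[:, None, :] - centers[None, :, :], axis=2)
    assignment = d.argmin(axis=1)
    new_centers = centers.copy()
    for k in range(len(centers)):
        members = client_models[assignment == k]
        if len(members) > 0:
            new_centers[k] = members.mean(axis=0)
    return new_centers, assignment

clients = [np.array([1.0, 0.1]), np.array([0.9, 0.0]), np.array([0.0, 1.1])]
centers = [np.array([0.5, 0.5]), np.array([0.2, 0.8])]
centers, assignment = assign_and_update(clients, centers)
print("assignment:", assignment, "updated centers:", centers)
```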
Fast-Convergent Federated Learning with Adaptive Weighting
[article]
2021
arXiv
pre-print
In this paper, we propose the Federated Adaptive Weighting (FedAdp) algorithm, which aims to accelerate model convergence in the presence of nodes with non-IID datasets. ...
Federated learning (FL) enables resource-constrained edge nodes to collaboratively learn a global model under the orchestration of a central server while keeping privacy-sensitive data locally. ...
FEDERATED ADAPTIVE WEIGHTING In this section, we develop our methodology for improving the convergence rate of federated learning. ...
arXiv:2012.00661v2
fatcat:b2kjmzkejbgtpiewwhghurbkha
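The abstract does not spell out FedAdp's weighting rule, so the sketch below shows one plausible form of adaptive aggregation weighting for non-IID nodes: each node's update is weighted by its cosine alignment with the mean update direction, with a softmax temperature as an illustrative knob. Treat it as an assumption-laden illustration, not the paper's method.

```python
import numpy as np

# Hedged sketch of adaptive aggregation weighting: nodes whose local updates align
# with the average update direction receive larger aggregation weights.

def adaptive_weights(local_updates, temperature=5.0):
    updates = np.asarray(local_updates)
    mean_dir = updates.mean(axis=0)
    cos = updates @ mean_dir / (
        np.linalg.norm(updates, axis=1) * np.linalg.norm(mean_dir) + 1e-12
    )
    logits = temperature * cos           # emphasize well-aligned nodes
    w = np.exp(logits - logits.max())
    return w / w.sum()

updates = [np.array([1.0, 0.0]), np.array([0.8, 0.2]), np.array([-0.5, 0.1])]
w = adaptive_weights(updates)
aggregated = (w[:, None] * np.asarray(updates)).sum(axis=0)
print("weights:", w, "aggregated update:", aggregated)
```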
Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning
[article]
2021
arXiv
pre-print
We conduct a focused survey of federated learning in conjunction with other learning algorithms. ...
Specifically, we explore various learning algorithms to improve the vanilla federated averaging algorithm and review model fusion methods such as adaptive aggregation, regularization, clustered methods ...
One category is to add local constraints for clients. FedProx [17] adds proximal terms to clients' objectives to regularize local training and ensure convergence in the non-IID setting. ...
arXiv:2102.12920v2
fatcat:5fcwfhxibbedbcbuzrfyqdedky
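The FedProx sentence above describes a concrete mechanism: each client's objective gains a proximal term that penalizes distance from the current global model. A minimal sketch, assuming a hypothetical quadratic client loss and illustrative values for the proximal weight mu:

```python
import numpy as np

# FedProx-style local update: the client minimizes its own loss plus
# (mu/2) * ||w - w_global||^2, which keeps the local model close to the global one.

def fedprox_local_update(w_global, grad_fn, mu=0.1, steps=10, lr=0.1):
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fn(w) + mu * (w - w_global)    # gradient of loss + proximal term
        w = w - lr * g
    return w

client_optimum = np.array([3.0, -1.0])
grad = lambda w: w - client_optimum             # gradient of 0.5*||w - optimum||^2

w_global = np.zeros(2)
w_local = fedprox_local_update(w_global, grad, mu=1.0)
print("local model pulled toward global:", w_local)
```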
Dynamic Federated Learning
[article]
2020
arXiv
pre-print
While many federated learning architectures process data in an online manner, and are hence adaptive by nature, most performance analyses assume static optimization problems and offer no guarantees in ...
We consider a federated learning model where at every iteration, a random subset of available agents perform local updates based on their data. ...
Related Works Several recent works have examined the convergence behavior of federated learning. ...
arXiv:2002.08782v2
fatcat:l5cvo2kynnf55lbbetrjsc7bxq
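The partial-participation setup described in the snippet, where a random subset of agents updates at every iteration, can be sketched in a few lines; the per-agent quadratic objectives, subset size, and step sizes below are illustrative.

```python
import numpy as np

# Partial participation: at each round only a random subset of agents performs a
# local update, and only those updates are averaged into the global model.

rng = np.random.default_rng(0)
num_agents, subset_size = 20, 5
optima = rng.normal(size=(num_agents, 2))        # per-agent quadratic optima

w = np.zeros(2)
for _ in range(200):
    active = rng.choice(num_agents, size=subset_size, replace=False)
    local = [w - 0.1 * (w - optima[i]) for i in active]   # one local GD step each
    w = np.mean(local, axis=0)

print("global model:", w, "mean of all optima:", optima.mean(axis=0))
```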
MetaGater: Fast Learning of Conditional Channel Gated Networks via Federated Meta-Learning
[article]
2020
arXiv
pre-print
The convergence of the proposed federated meta-learning algorithm is established under mild conditions. Experimental results corroborate the effectiveness of our method in comparison to related work. ...
Particularly, we develop a federated meta-learning approach to jointly learn good meta-initializations for both backbone networks and gating modules, by making use of the model similarity across learning ...
We randomly select 10 nodes for each round in federated meta-learning, and 10 target nodes for fast adaptation. • Small local dataset size. ...
arXiv:2011.12511v2
fatcat:bluvxd2flvefhb5rzek4a55hti
Semi-Synchronous Federated Learning
[article]
2021
arXiv
pre-print
Here we introduce a novel Semi-Synchronous Federated Learning protocol that mixes local models periodically with minimal idle time and fast convergence. ...
Federated Learning (FL) provides an approach to learn a joint model over all the available data across silos. ...
Wang et al. (2019) studied adaptive FL in mobile edge computing environments under resource budget constraints with arbitrary local updates between learners, while Li et al. (2020b) provide convergence ...
arXiv:2102.02849v1
fatcat:yupf7v2evraylkkbv63mn2hdma
Heterogeneous Federated Learning
[article]
2022
arXiv
pre-print
Federated learning learns from scattered data by fusing collaborative models from local nodes. ...
Eventually, this framework effectively enhances the applicability of federated learning to extensive heterogeneous settings, while providing excellent convergence speed, accuracy, and computation/communication ...
Introduction Federated Learning (FL) has drawn great attention for its outstanding collaborative training performance and support for data privacy [6]. ...
arXiv:2008.06767v2
fatcat:puw4zdefivefjcp5nszyybrt7q
Federated Learning for Keyword Spotting
[article]
2019
arXiv
pre-print
Additionally, the dataset used for these experiments is being open sourced with the aim of fostering further transparent research in the application of federated learning to speech data. ...
We empirically demonstrate that using an adaptive averaging strategy inspired by Adam in place of standard weighted model averaging greatly reduces the number of communication rounds required to reach ...
The input window consists of 32 stacked frames, symmetrically distributed in left and right contexts. ...
arXiv:1810.05512v4
fatcat:mjoswpvsrbactcsrroaf6lqo2q
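The snippet describes replacing standard weighted model averaging with an adaptive averaging strategy inspired by Adam. A common way to realize this is to treat the averaged client delta as a pseudo-gradient and apply an Adam-style update on the server; the sketch below follows that pattern with illustrative hyperparameters and is not necessarily the paper's exact formulation.

```python
import numpy as np

# Server-side Adam on the averaged client delta (pseudo-gradient), used in place
# of directly overwriting the global model with the weighted average.

class ServerAdam:
    def __init__(self, dim, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = np.zeros(dim)
        self.v = np.zeros(dim)
        self.t = 0

    def step(self, w_global, avg_client_delta):
        self.t += 1
        g = -avg_client_delta                    # pseudo-gradient
        self.m = self.b1 * self.m + (1 - self.b1) * g
        self.v = self.b2 * self.v + (1 - self.b2) * g * g
        m_hat = self.m / (1 - self.b1 ** self.t)
        v_hat = self.v / (1 - self.b2 ** self.t)
        return w_global - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

opt = ServerAdam(dim=2)
w = np.zeros(2)
avg_delta = np.array([0.5, -0.2])                # average of (local - global) models
w = opt.step(w, avg_delta)
print("updated global model:", w)
```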
Accelerated Federated Learning with Decoupled Adaptive Optimization
2022
International Conference on Machine Learning
However, there is still a paucity of theoretical principles on where to and how to design and utilize adaptive optimization methods in federated settings. ...
Last but not least, full batch gradients are utilized to mimic centralized optimization at the end of the training process to ensure convergence and overcome the possible inconsistency caused by adaptive ...
dblp:conf/icml/JinR0LLD22
fatcat:tkng2hnvf5gx7gn253mycoiiae
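The abstract mentions switching to full-batch gradients at the end of training to mimic centralized optimization and counteract inconsistency from adaptive updates. The schematic below illustrates only that phase switch, with a hypothetical pooled objective and an arbitrary switch point; it is not the paper's algorithm.

```python
import numpy as np

# Phase switch: ordinary federated rounds first, then full-batch gradient steps on
# the pooled objective for the final rounds.

optima = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
full_batch_grad = lambda w: np.mean([w - c for c in optima], axis=0)

w = np.zeros(2)
total_rounds, switch_round = 100, 80
for rnd in range(total_rounds):
    if rnd < switch_round:
        # regular federated round: one local step per client, then averaging
        w = np.mean([w - 0.1 * (w - c) for c in optima], axis=0)
    else:
        # final phase: full-batch gradient steps toward consistent convergence
        w = w - 0.1 * full_batch_grad(w)

print("final model:", w)
```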
Mixed Federated Learning: Joint Decentralized and Centralized Learning
[article]
2022
arXiv
pre-print
Federated learning (FL) enables learning from decentralized privacy-sensitive data, with computations on raw data confined to take place at edge clients. ...
For example, additional datacenter data can be leveraged to jointly learn from centralized (datacenter) and decentralized (federated) training data and better match an expected inference data distribution ...
Acknowledgements The authors wish to thank Zachary Charles, Keith Rush, Brendan McMahan, Om Thakkar, and Ananda Theertha Suresh for useful discussions and suggestions. ...
arXiv:2205.13655v2
fatcat:e5dposhnqbdtnnzno73bp5gtpq
Lithography Hotspot Detection via Heterogeneous Federated Learning with Local Adaptation
[article]
2021
arXiv
pre-print
In this paper, we propose a heterogeneous federated learning framework for lithography hotspot detection that can address the aforementioned issues. ...
On the other hand, the global sub-model can be combined with a local sub-model to better adapt to local data heterogeneity. ...
Global Aggregation and Local Adaptation Global aggregation and local adaptation are the key operations in the proposed Heterogeneous Federated Learning with Local Adaptation algorithm (HFL-LA). ...
arXiv:2107.04367v3
fatcat:jyaif55xmrar3gsxlzr345m32u
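The HFL-LA snippet describes combining a shared global sub-model with a per-client local sub-model. One generic way to express that split is to average only the "global" parameter block during aggregation and leave each client's "local" block untouched; the parameter partition and sizes below are illustrative, not the paper's architecture.

```python
import numpy as np

# Aggregate only the shared "global" block; keep each client's "local" block as-is.

def aggregate_heterogeneous(client_params):
    # client_params: list of dicts {"global": ndarray, "local": ndarray}
    global_avg = np.mean([p["global"] for p in client_params], axis=0)
    return [{"global": global_avg, "local": p["local"]} for p in client_params]

clients = [
    {"global": np.array([1.0, 2.0]), "local": np.array([0.1])},
    {"global": np.array([3.0, 0.0]), "local": np.array([-0.4])},
]
print(aggregate_heterogeneous(clients))
```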
User-Centric Federated Learning
[article]
2021
arXiv
pre-print
Data heterogeneity across participating devices poses one of the main challenges in federated learning as it has been shown to greatly hamper its convergence time and generalization capabilities. ...
learning scheme. ...
A straightforward solution consists in producing adapted models at a device scale by local fine-tuning procedures. ...
arXiv:2110.09869v1
fatcat:fi3lvfi44vbn3g5aoh3pl5kfzq
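The local fine-tuning baseline mentioned in the snippet is easy to make concrete: after federated training, each device takes the global model and runs a few gradient steps on its own data. The local objective below is a hypothetical quadratic with illustrative step sizes.

```python
import numpy as np

# Per-device personalization by local fine-tuning of the trained global model.

def fine_tune(w_global, grad_fn, steps=3, lr=0.1):
    w = w_global.copy()
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

w_global = np.array([0.5, 0.5])
device_optimum = np.array([1.0, 0.0])
personalized = fine_tune(w_global, lambda w: w - device_optimum)
print("personalized model:", personalized)
```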
Federated Learning: A Distributed Shared Machine Learning Method
2021
Complexity
Federated learning (FL) is a distributed machine learning (ML) framework. ...
On the basis of classical FL algorithms, several federated machine learning algorithms are briefly introduced, with emphasis on deep learning and classification, and comparisons of those algorithms are ...
Acknowledgments This research was supported by the National Natural Science Foundation of China (42075130, 61773219, and 61701244) and the Key Special Project of the National Key R&D Program (2018YFC1405703 ...
doi:10.1155/2021/8261663
fatcat:ahr2rpg2indqzg4h3zzda3co5a
Showing results 1 — 15 out of 49,297 results