Adaptive Distillation for Decentralized Learning from Heterogeneous Clients
[article]
2020
arXiv
pre-print
To this end, we propose a new decentralized learning method called Decentralized Learning via Adaptive Distillation (DLAD). ...
Learning and network co-distillation. ...
Decentralized Learning via Adaptive Distillation: We extend the distillation objective in Eq. (1) to make it applicable to our decentralized learning problem. ...
arXiv:2008.07948v1
fatcat:ofqrxnrbmnb3jfw4zrilws4z6m
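The DLAD snippet mentions extending a distillation objective (its Eq. (1)) to the decentralized setting. A minimal PyTorch sketch of an adaptively weighted distillation loss is below; the confidence-based weighting over teachers is an assumption for illustration, not the paper's exact aggregation rule:

```python
import torch
import torch.nn.functional as F

def adaptive_distillation_loss(student_logits, client_logits_list, T=2.0):
    """Distill a student into a confidence-weighted mixture of client
    (teacher) predictions. The softmax-over-confidence weighting is an
    assumption for illustration, not DLAD's exact rule."""
    with torch.no_grad():
        probs = [F.softmax(l / T, dim=-1) for l in client_logits_list]
        conf = torch.stack([p.max(dim=-1).values for p in probs])  # (K, B)
        weights = F.softmax(conf, dim=0)        # per-example teacher weights
        target = sum(w.unsqueeze(-1) * p for w, p in zip(weights, probs))
    log_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_student, target, reduction="batchmean") * T * T
```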
A Fairness-Aware Peer-to-Peer Decentralized Learning Framework with Heterogeneous Devices
2022
Future Internet
The proposed fairness-aware approach allows local clients to adaptively aggregate the received model based on the local learning performance. ...
By doing this, multiple learning objectives are optimized to advocate for learning fairness while avoiding small-group domination. ...
Lines 4 to 7 show that (i) a client requests the local learning model from its learning pair; (ii) the client evaluates the learning performance of the received learning model; and (iii) it adaptively ...
doi:10.3390/fi14050138
fatcat:mwbh5x6bkngmzjzjs7fnvxujwm
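This entry describes a client evaluating a peer's model and adaptively aggregating it based on local performance. A minimal sketch under that reading, with an illustrative accuracy-ratio weighting that is not the paper's formula:

```python
import copy
import torch

def adaptive_aggregate(local_model, peer_model, local_acc, peer_acc):
    """Merge a peer's model into the local one, weighting the peer by its
    accuracy on local validation data. The ratio rule is illustrative."""
    alpha = peer_acc / (local_acc + peer_acc + 1e-12)  # peer weight in [0, 1]
    merged = copy.deepcopy(local_model)
    with torch.no_grad():
        for p_m, p_l, p_p in zip(merged.parameters(),
                                 local_model.parameters(),
                                 peer_model.parameters()):
            p_m.copy_((1 - alpha) * p_l + alpha * p_p)
    return merged
```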
Survey of Personalization Techniques for Federated Learning
[article]
2020
arXiv
pre-print
Federated learning enables machine learning models to learn from private decentralized data without compromising privacy. ...
The standard formulation of federated learning produces one shared model for all clients. ...
Techniques: This section surveys different methods for adapting global models for individual clients. ...
arXiv:2003.08673v1
fatcat:6uqguovedvfghh4v6dwkvtyt6q
Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning
[article]
2021
arXiv
pre-print
Specifically, we explore various learning algorithms to improve the vanilla federated averaging algorithm and review model fusion methods such as adaptive aggregation, regularization, clustered methods ...
transfer learning, unsupervised learning, and reinforcement learning. ...
A centralized network architecture also makes the central server suffer from a heavy communication workload, calling for a decentralized server architecture [5]. ...
arXiv:2102.12920v2
fatcat:5fcwfhxibbedbcbuzrfyqdedky
No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices
[article]
2022
arXiv
pre-print
Federated learning (FL) is an important paradigm for training global models from decentralized data in a privacy-preserving way. ...
Extensive experiments on many real-world benchmark datasets demonstrate the effectiveness of the proposed method in learning accurate models from clients with heterogeneous devices under the FL framework. ...
We propose InclusiveFL for effective federated learning over heterogeneous client devices. ...
arXiv:2202.08036v1
fatcat:ob4coc6minhrlapdfl2ztuohhy
Class-Wise Adaptive Self Distillation for Federated Learning on Non-IID Data (Student Abstract)
2022
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-22)
Federated learning (FL) enables multiple clients to collaboratively train a globally generalized model while keeping local data decentralized. ...
A key challenge in FL is to handle the heterogeneity of data distributions among clients. ...
By adaptively controlling the distillation loss, local models learn more from the distilled knowledge when the global model's prediction on a class is credible. ...
doi:10.1609/aaai.v36i11.21620
fatcat:5a4isfha6ncnxhg5n354ql2kdi
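The abstract above describes scaling the distillation term by how credible the global model is on each class. A hedged sketch, where `class_credibility` is a hypothetical per-class score (for instance, the global model's per-class validation accuracy):

```python
import torch
import torch.nn.functional as F

def class_wise_kd_loss(local_logits, global_logits, labels,
                       class_credibility, T=2.0):
    """Scale each example's distillation term by the credibility of the
    global model on that example's class. `class_credibility` is a
    hypothetical (num_classes,) tensor of scores in [0, 1]."""
    log_p_local = F.log_softmax(local_logits / T, dim=-1)
    p_global = F.softmax(global_logits / T, dim=-1).detach()
    # Per-example KL divergence toward the global teacher.
    kd = F.kl_div(log_p_local, p_global, reduction="none").sum(dim=-1)
    weights = class_credibility[labels]  # look up each label's credibility
    return (weights * kd).mean() * T * T
```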
Federated Learning Meets Natural Language Processing: A Survey
[article]
2021
arXiv
pre-print
Federated Learning aims to learn machine learning models from multiple decentralized edge devices (e.g. mobiles) or servers without sacrificing local data privacy. ...
Since text data widely originates from end users, in this work we look into recent NLP models and techniques that use federated learning as the learning framework. ...
It then uses knowledge distillation to learn knowledge from the client devices instead of traditional ...
arXiv:2107.12603v1
fatcat:ebi4i6jnxbhihe7zuqx4uposbm
Federated Mutual Learning
[article]
2020
arXiv
pre-print
Federated learning (FL) enables collaboratively training deep learning models on decentralized data. ...
... where the center server seeks a generalized model whereas clients pursue personalized models, and clients may run different tasks; third, clients may need to design their customized models for various ...
Li and Wang (2019) proposes a decentralized framework based on knowledge distillation, which enables federated learning for independently designed models. ...
arXiv:2006.16765v3
fatcat:3kifsth6kng4zntkc3krp5sv7a
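Federated Mutual Learning builds on mutual distillation, where two models trained on the same data each distill from the other. A generic deep-mutual-learning-style sketch, illustrative rather than the paper's exact objective:

```python
import torch.nn.functional as F

def mutual_learning_losses(logits_a, logits_b, labels, alpha=0.5, T=1.0):
    """Each model combines a supervised loss with a KL term toward the
    other model's softened predictions (deep-mutual-learning style)."""
    def kd(student, teacher):
        return F.kl_div(F.log_softmax(student / T, dim=-1),
                        F.softmax(teacher / T, dim=-1).detach(),
                        reduction="batchmean") * T * T
    loss_a = (1 - alpha) * F.cross_entropy(logits_a, labels) + alpha * kd(logits_a, logits_b)
    loss_b = (1 - alpha) * F.cross_entropy(logits_b, labels) + alpha * kd(logits_b, logits_a)
    return loss_a, loss_b
```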
FedRAD: Federated Robust Adaptive Distillation
[article]
2021
arXiv
pre-print
The collaborative learning framework, which typically aggregates model updates, is vulnerable to model poisoning attacks from adversarial clients. ...
The robustness of federated learning (FL) is vital for the distributed training of an accurate global model that is shared among a large number of clients. ...
With increased heterogeneity, the learning starts to slow down for all aggregators. ...
arXiv:2112.01405v1
fatcat:ajksl6iiv5h3fff3kd2yxvd3ba
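The FedRAD snippet concerns robustness of aggregation to model poisoning. As a hedged illustration of the robust-aggregation idea only (FedRAD itself combines median statistics with distillation, which is not shown here), a coordinate-wise median aggregator might look like:

```python
import torch

def median_aggregate(client_states):
    """Coordinate-wise median over client state_dicts; bounds the influence
    of any single (possibly poisoned) update. Casts to float so the median
    is defined for integer buffers too."""
    return {k: torch.stack([s[k].float() for s in client_states])
                 .median(dim=0).values
            for k in client_states[0]}
```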
Federated Selective Aggregation for Knowledge Amalgamation
[article]
2022
arXiv
pre-print
Our motivation for investigating such a problem setup stems from a recent dilemma of model sharing. ...
The proposed FedSA offers a solution to this dilemma and takes it one step further since, again, the learned student may specialize in a new task different from all of the teachers. ...
Federated learning [19, 10, 3] develops a decentralized training schema for privacy-preserving learning, enabling multiple clients to learn a network collaboratively without sharing their private data ...
arXiv:2207.13309v1
fatcat:z4hxwg44tba5zeffrgkrxbzabe
Ensemble Distillation for Robust Model Fusion in Federated Learning
[article]
2021
arXiv
pre-print
Specifically, we propose ensemble distillation for model fusion, i.e. training the central classifier through unlabeled data on the outputs of the models from the clients. ...
Federated Learning (FL) is a machine learning setting where many devices collaboratively train a machine learning model while keeping the training data decentralized. ...
Communication-efficient learning of deep networks from decentralized data. arXiv preprint arXiv:1602.05629, 2016. ...
arXiv:2006.07242v3
fatcat:rdjzbuulnvh5repemjkolco4eu
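The ensemble-distillation entry trains the central classifier on unlabeled data against the outputs of the client models. A minimal sketch of one fusion step, assuming plain logit averaging over clients:

```python
import torch
import torch.nn.functional as F

def ensemble_distill_step(server_model, client_models, unlabeled_x,
                          optimizer, T=1.0):
    """One fusion step: fit the server model to the averaged soft
    predictions of the client models on an unlabeled batch."""
    with torch.no_grad():
        avg_logits = torch.stack(
            [m(unlabeled_x) for m in client_models]).mean(dim=0)
        target = F.softmax(avg_logits / T, dim=-1)
    optimizer.zero_grad()
    log_p = F.log_softmax(server_model(unlabeled_x) / T, dim=-1)
    loss = F.kl_div(log_p, target, reduction="batchmean") * T * T
    loss.backward()
    optimizer.step()
    return loss.item()
```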
Avoid Overfitting User Specific Information in Federated Keyword Spotting
[article]
2022
arXiv
pre-print
Keyword spotting (KWS) aims to discriminate a specific wake-up word from other signals precisely and efficiently for different users. ...
Furthermore, we propose an adaptive local training strategy, letting clients with more training data and more uniform class distributions undertake more local update steps. ...
Federated learning (FL) [10, 11] has been effectively applied for communication efficient decentralized training with basic privacy protection. ...
arXiv:2206.08864v1
fatcat:7hv2u5f5hbgijkmxwulegnubse
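The keyword-spotting entry lets clients with more training data and more uniform class distributions take more local update steps. A hedged sketch of such a rule, using dataset size and normalized label entropy; the exact scaling is an assumption:

```python
import numpy as np

def local_update_steps(class_counts, base_steps=10, max_steps=50):
    """More local steps for clients with more data and more uniform
    class distributions (normalized entropy). The scaling is illustrative."""
    counts = np.asarray(class_counts, dtype=float)
    n = counts.sum()
    p = counts / max(n, 1.0)
    p = p[p > 0]
    denom = np.log(len(counts)) if len(counts) > 1 else 1.0
    uniformity = -(p * np.log(p)).sum() / denom  # normalized entropy in [0, 1]
    steps = base_steps * (1.0 + np.log1p(n) / 10.0) * uniformity
    return int(np.clip(round(steps), 1, max_steps))
```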
FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
[article]
2022
arXiv
pre-print
This paper presents FedX, an unsupervised federated learning framework. Our model learns unbiased representation from decentralized and heterogeneous local data. ...
It employs a two-sided knowledge distillation with contrastive learning as a core component, allowing the federated system to function without requiring clients to share any data features. ...
Acknowledgements We thank Seungeon Lee and Xiting Wang for their insights and discussions on our work. ...
arXiv:2207.09158v1
fatcat:gyrgp3cf2be6bit5cjho2n4ake
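FedX centers on two-sided knowledge distillation with contrastive learning. A one-sided, InfoNCE-style sketch of cross-model contrastive distillation (illustrative only; FedX's actual objective is two-sided and more involved):

```python
import torch
import torch.nn.functional as F

def cross_model_contrastive_loss(local_emb, global_emb, tau=0.1):
    """InfoNCE-style distillation: each sample's local embedding should be
    closest to the global model's embedding of the same sample, with the
    rest of the batch serving as negatives."""
    z_l = F.normalize(local_emb, dim=-1)
    z_g = F.normalize(global_emb, dim=-1).detach()
    logits = z_l @ z_g.t() / tau  # (B, B) cosine similarities
    labels = torch.arange(z_l.size(0), device=z_l.device)  # diagonal positives
    return F.cross_entropy(logits, labels)
```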
CD^2-pFed: Cyclic Distillation-guided Channel Decoupling for Model Personalization in Federated Learning
[article]
2022
arXiv
pre-print
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to collaboratively learn a shared global model. ...
Different from previous works which establish layer-wise personalization to overcome the non-IID data across different clients, we make the first attempt at channel-wise assignment for model personalization ...
The work described in this paper is supported by grants from HKU Startup Fund and HKU Seed Fund for Basic Research (Project No. 202009185079). ...
arXiv:2204.03880v1
fatcat:3hwxltgenzhednu7e2ksfxmbd4
Challenges, Applications and Design Aspects of Federated Learning: A Survey
2021
IEEE Access
There are many application domains where large amounts of properly labeled and complete data are not available in a centralized location, for example, doctors' diagnosis from medical image analysis. ...
Federated Learning (FL) is a new technology that has been a hot research topic. ...
Statistical heterogeneity issues are dealt with by modeling heterogeneous data using methods like meta-learning, multi-task learning, adaptive selection between the global model and device-specific models, transfer ...
doi:10.1109/access.2021.3111118
fatcat:jsdaxx6mvjdrrhomeknujwsj7i