30,163 Hits in 5.3 sec

Optimal Client Sampling for Federated Learning [article]

Wenlin Chen, Samuel Horvath, Peter Richtarik
2021 arXiv   pre-print
It is well understood that client-master communication can be a primary bottleneck in Federated Learning.  ...  where participating clients are sampled uniformly.  ...  Acknowledgments We thank Jakub Konečný for helpful discussions and comments.  ... 
arXiv:2010.13723v2 fatcat:hlrdoe6idbaajj4jvpzliew6du
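The snippet above contrasts uniform client sampling with an "optimal" scheme. A minimal sketch of the general idea, with hypothetical function names and assuming norm-proportional sampling probabilities combined with inverse-probability weighting to keep the aggregated update unbiased (a common construction in this line of work, not necessarily the paper's exact estimator):

```python
import numpy as np

def sampling_probabilities(update_norms, m):
    """Probabilities proportional to client update norms, capped at 1.
    (Hypothetical simplification; the expected number of sampled
    clients is roughly m when no probability gets clipped.)"""
    norms = np.asarray(update_norms, dtype=float)
    return np.clip(m * norms / norms.sum(), 0.0, 1.0)

def unbiased_aggregate(updates, probs, sampled):
    """Inverse-probability weighting keeps the mean update unbiased:
    E[(1/n) * sum_i 1{i sampled} u_i / p_i] = (1/n) * sum_i u_i."""
    agg = np.zeros_like(updates[0], dtype=float)
    for u, p, s in zip(updates, probs, sampled):
        if s:
            agg += np.asarray(u, dtype=float) / p
    return agg / len(updates)
```

With `m = 2` and norms `[1, 2, 3, 4]`, clients with larger updates are sampled more often, yet the weighted aggregate matches the full-participation average in expectation.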

Client Selection in Nonconvex Federated Learning: Improved Convergence Analysis for Optimal Unbiased Sampling Strategy [article]

Lin Wang, YongXin Guo, Tao Lin, Xiaoying Tang
2022 arXiv   pre-print
Federated learning (FL) is a distributed machine learning paradigm that selects a subset of clients to participate in training to reduce communication burdens.  ...  FedSRC-D is provably the optimal unbiased sampling strategy in non-convex settings for non-IID FL with respect to the given bounds.  ...  3 Preliminaries Algorithm 1: Select representative clients for Federated Learning (FedSRC) Framework Require: initial weights x_0, global learning rate η, local learning rate η_l, number of training  ... 
arXiv:2205.13925v1 fatcat:m3tlh7oufjgzzan7galm2bqzoy

Resource Management and Model Personalization for Federated Learning over Wireless Edge Networks

Ravikumar Balakrishnan, Mustafa Akdeniz, Sagar Dhakal, Arjun Anand, Ariela Zeira, Nageen Himayat
2021 Journal of Sensor and Actuator Networks  
We also develop a federated meta-learning solution, based on task similarity, that serves as a sample efficient initialization for federated learning, as well as improves model personalization and generalization  ...  Distributed learning approaches such as federated learning that move ML training to end devices have emerged, promising lower latency and bandwidth costs and enhanced privacy of end users' data.  ...  Resource Management and Model Personalization for Federated Learning Resource Management through Importance Sampling The sampling of K out of N clients to optimize a performance objective such as maximizing  ... 
doi:10.3390/jsan10010017 fatcat:2j635xalqjgenkgxny5k7fvp7m

Evaluation of Hyperparameter-Optimization Approaches in an Industrial Federated Learning System [article]

Stephanie Holly, Thomas Hiessl, Safoura Rezapour Lakani, Daniel Schall, Clemens Heitzinger, Jana Kemnitz
2021 arXiv   pre-print
Federated Learning (FL) decouples model training from the need for direct access to the data and allows organizations to collaborate with industry partners to reach a satisfying level of performance without  ...  -- allows every client to have its own hyperparameter configuration.  ...  Then, we optimized the learning rate using the local approach, trained the federated model with local individual learning rates for each client in the cohort, and tested the resulting federated model on  ... 
arXiv:2110.08202v2 fatcat:rgirbdxqmreqjcnb6p7htbys4m

Real-time Federated Evolutionary Neural Architecture Search [article]

Hangyu Zhu, Yaochu Jin
2020 arXiv   pre-print
During the search, a double-sampling technique is introduced, in which for each individual, a randomly sampled sub-model of a master model is transmitted to a number of randomly sampled clients for training  ...  One is that federated learning raises high demands on communication, since a large number of model parameters must be transmitted between the server and the clients.  ...  A Double-Sampling for Objective Evaluations Offline evolutionary optimization is intrinsically not suited for federated learning.  ... 
arXiv:2003.02793v1 fatcat:cd5jo3xe45gvtflvqrxqtemrhq
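The double-sampling technique described in the snippet pairs a randomly sampled sub-model of the master model with a randomly sampled client subset. A hypothetical sketch of that pairing (names and the per-layer width choices are illustrative, not the paper's actual search space):

```python
import random

def double_sample(master_layers, clients, num_clients, seed=0):
    """Sketch of "double sampling": draw one random sub-model
    (here, one width choice per layer of a master supernet) and one
    random client subset to train/evaluate that sub-model on."""
    rng = random.Random(seed)
    sub_model = [rng.choice(choices) for choices in master_layers]  # model sample
    chosen = rng.sample(clients, num_clients)                       # client sample
    return sub_model, chosen
```

In the evolutionary setting, each individual would get its own (sub-model, client subset) pair, which bounds the per-round training and communication cost.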

Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning [article]

Shaoxiong Ji and Teemu Saravirta and Shirui Pan and Guodong Long and Anwar Walid
2021 arXiv   pre-print
Following the emerging trends, we also discuss federated learning at the intersection with other learning paradigms, termed federated x learning, where x includes multitask learning, meta-learning,  ...  As a flexible learning setting, federated learning has the potential to integrate with other learning frameworks.  ...  [23] suggested that valuing each sample without clear discrimination is inherently risky as it might result in sub-optimal performance for underrepresented clients and sought to good-intent fairness  ... 
arXiv:2102.12920v2 fatcat:5fcwfhxibbedbcbuzrfyqdedky

A Field Guide to Federated Optimization [article]

Jianyu Wang, Zachary Charles, Zheng Xu, Gauri Joshi, H. Brendan McMahan, Blaise Aguera y Arcas, Maruan Al-Shedivat, Galen Andrew, Salman Avestimehr, Katharine Daly, Deepesh Data, Suhas Diggavi (+41 others)
2021 arXiv   pre-print
Federated learning and analytics are a distributed approach for collaboratively learning models (or statistics) from decentralized data, motivated by and designed for privacy protection.  ...  The distributed learning process can be formulated as solving federated optimization problems, which emphasize communication efficiency, data heterogeneity, compatibility with privacy and system requirements  ...  During the discussion, a general consensus about the need for a guide about federated optimization is reached.  ... 
arXiv:2107.06917v1 fatcat:lfpi4c3s45gl7aezwulaczzev4

Fairness-aware Agnostic Federated Learning [article]

Wei Du, Depeng Xu, Xintao Wu, Hanghang Tong
2020 arXiv   pre-print
Federated learning is an emerging framework that builds centralized machine learning models with training data distributed across multiple devices.  ...  To the best of our knowledge, this is the first work to achieve fairness in federated learning.  ...  Most recently, the authors in [20] propose agnostic federated learning. However, the proposed framework solves the problem by optimizing the worst case for a single client.  ... 
arXiv:2010.05057v1 fatcat:cr4zyopz6vbhflatdlt55opzdq

Efficient Image Representation Learning with Federated Sampled Softmax [article]

Sagar M. Waghmare, Hang Qi, Huizhong Chen, Mikhail Sirotenko, Tomer Meron
2022 arXiv   pre-print
In this work we introduce federated sampled softmax (FedSS), a resource-efficient approach for learning image representation with Federated Learning.  ...  Specifically, the FL clients sample a set of classes and optimize only the corresponding model parameters with respect to a sampled softmax objective that approximates the global full softmax objective  ...  loss for federated learning.  ... 
arXiv:2203.04888v1 fatcat:rindqzo2pbg4bepb4ldm673x3a
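The snippet describes clients optimizing only the parameters of a sampled class subset against a sampled softmax objective. A minimal sketch of such an objective (hypothetical names; assumes the sampled subset contains every label in the batch, i.e. positives plus sampled negatives):

```python
import numpy as np

def sampled_softmax_loss(class_weights, features, labels, sampled_classes):
    """Cross-entropy restricted to a sampled subset of classes, which
    approximates the full softmax while touching only |subset| rows
    of the classification weight matrix."""
    classes = np.sort(np.asarray(sampled_classes))
    logits = features @ class_weights[classes].T         # (batch, |classes|)
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    local = np.searchsorted(classes, labels)             # label positions in subset
    return -log_probs[np.arange(len(labels)), local].mean()
```

Because each client only needs the weight rows for its sampled classes, both the download payload and the local optimization shrink with the subset size.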

From Federated Learning to Federated Neural Architecture Search: A Survey [article]

Hangyu Zhu, Haoyu Zhang, Yaochu Jin
2020 arXiv   pre-print
While both federated learning and neural architecture search are faced with many open challenges, searching for optimized neural architectures in the federated learning framework is particularly demanding  ...  Federated learning is a recently proposed distributed machine learning paradigm for privacy preservation, which has found a wide range of applications where data privacy is of primary concern.  ...  Horizontal Federated Learning Horizontal federated learning is proposed for the scenarios in which datasets on the participating clients share the same feature space but have different samples.  ... 
arXiv:2009.05868v1 fatcat:tvlftvamajh3fi4rag7u27vyve

What Do We Mean by Generalization in Federated Learning? [article]

Honglin Yuan, Warren Morningstar, Lin Ning, Karan Singhal
2022 arXiv   pre-print
Thus generalization studies in federated learning should separate performance gaps from unseen client data (out-of-sample gap) from performance gaps from unseen client distributions (participation gap)  ...  Informed by our findings, we call out community suggestions for future federated learning works.  ...  Acknowledgements We would like to thank Zachary Charles, Zachary Garrett, Zheng Xu, Keith Rush, Hang Qi, Brendan McMahan, Josh Dillon, and Sushant Prakash for helpful discussions at various stages of this  ... 
arXiv:2110.14216v2 fatcat:cyhc24plynb3tos7a6ncrhy6pq

ASFGNN: Automated Separated-Federated Graph Neural Network [article]

Longfei Zheng, Jun Zhou, Chaochao Chen, Bingzhe Wu, Li Wang, Benyu Zhang
2020 arXiv   pre-print
Meanwhile, considering the limited network status of data owners, hyper-parameters optimization for collaborative learning approaches is time-consuming in data isolation scenarios.  ...  clients separately, and the loss computing part that is learnt by clients federally.  ... 
arXiv:2011.03248v1 fatcat:gmrl2mbczvhq3oeeviw54wjbea

Training Speech Recognition Models with Federated Learning: A Quality/Cost Framework [article]

Dhruv Guliani, Francoise Beaufays, Giovanni Motta
2020 arXiv   pre-print
We propose using federated learning, a decentralized on-device learning paradigm, to train speech recognition models.  ...  Finally, we demonstrate that hyper-parameter optimization and appropriate use of variational noise are sufficient to compensate for the quality impact of non-IID distributions, while decreasing the cost  ...  ACKNOWLEDGEMENTS We would like to thank Khe Chai Sim, Lillian Zhou, Petr Zadrazil, Hang Qi, Harry Zhang, Yuxin Ding, and Tien-Ju Yang for providing valuable insights on its structure and contents.  ... 
arXiv:2010.15965v1 fatcat:lybqdjdglrcghozwuahxjttl2i

Decentralized federated learning of deep neural networks on non-iid data [article]

Noa Onoszko, Gustav Karlsson, Olof Mogren, Edvin Listo Zec
2021 arXiv   pre-print
to learn a model suitable for the local data distribution.  ...  More specifically, we study decentralized federated learning, a peer-to-peer setting where data is distributed among many clients and where there is no central server to orchestrate the training.  ...  The goal is to learn weights w for a model by optimizing some loss over data.  ... 
arXiv:2107.08517v2 fatcat:3erw3bohgjalvf4fgnm62msm2i

Clustered Scheduling and Communication Pipelining For Efficient Resource Management Of Wireless Federated Learning [article]

Cihat Keçeci, Mohammad Shaqfeh, Fawaz Al-Qahtani, Muhammad Ismail, Erchin Serpedin
2022 arXiv   pre-print
We provide a generic formulation for optimal client clustering under different settings, and we analytically derive an efficient algorithm for obtaining the optimal solution.  ...  Due to limited wireless sub-channels, a subset of the total clients is scheduled in each iteration of federated learning algorithms.  ...  PIPELINED FEDERATED LEARNING (PFL) A typical federated learning problem is considered with a total of M clients participating in the optimization algorithm without sharing their own data sets.  ... 
arXiv:2206.07631v1 fatcat:wrhg5vjcfnhfpel3seypho55ku
Showing results 1 — 15 out of 30,163 results