Showing results 1–15 of 90

Private and Communication-Efficient Edge Learning: A Sparse Differential Gaussian-Masking Distributed SGD Approach [article]

Xin Zhang, Minghong Fang, Jia Liu, Zhengyuan Zhu
2020 arXiv   pre-print
Toward this end, we propose a new decentralized stochastic gradient method with sparse differential Gaussian-masked stochastic gradients (SDM-DSGD) for non-convex distributed edge learning.  ...  In this paper, we consider the problem of jointly improving data privacy and communication efficiency of distributed edge learning, both of which are critical performance metrics in wireless edge network  ...  A SPARSE DIFFERENTIAL GAUSSIAN-MASKING SGD APPROACH In this section, we first present the problem formulation of edge ML training in Section 4.1.  ... 
arXiv:2001.03836v4 fatcat:aldo5jy3lfegxbqswah5lq7csq
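
The SDM-DSGD snippet above combines gradient sparsification with Gaussian masking for private, communication-efficient edge learning. The paper's exact algorithm and privacy analysis are in the preprint; what follows is only a minimal numpy sketch of the general idea (top-k sparsification of a gradient difference plus additive Gaussian noise). The function name `sparse_gaussian_masked_update` and the values of `k_ratio` and `sigma` are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def sparse_gaussian_masked_update(grad, prev_sent, k_ratio=0.01, sigma=0.1, rng=None):
    """Illustrative sketch: transmit a sparse, Gaussian-masked gradient difference.

    grad      : current local stochastic gradient (1-D array)
    prev_sent : reconstruction of what has been transmitted so far (1-D array)
    Returns the sparse noisy message and the updated reference vector.
    """
    rng = np.random.default_rng() if rng is None else rng
    diff = grad - prev_sent                       # only communicate what changed
    k = max(1, int(k_ratio * diff.size))          # number of coordinates to keep
    idx = np.argpartition(np.abs(diff), -k)[-k:]  # top-k coordinates by magnitude
    sparse = np.zeros_like(diff)
    sparse[idx] = diff[idx] + rng.normal(0.0, sigma, size=k)  # Gaussian masking
    return sparse, prev_sent + sparse             # receiver adds the sparse update

# toy usage: one worker over a few iterations
rng = np.random.default_rng(0)
ref = np.zeros(1000)
for _ in range(3):
    g = rng.normal(size=1000)                     # stand-in for a stochastic gradient
    msg, ref = sparse_gaussian_masked_update(g, ref, rng=rng)
    print("nonzeros sent:", np.count_nonzero(msg))
```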

A Review of Privacy-preserving Federated Learning for the Internet-of-Things [article]

Christopher Briggs, Zhong Fan, Peter Andras
2020 arXiv   pre-print
We survey a wide variety of papers covering communication-efficiency, client heterogeneity and privacy preserving methods that are crucial for federated learning in the context of the IoT.  ...  This work reviews federated learning as an approach for performing machine learning on distributed data with the goal of protecting the privacy of user-generated data as well as reducing communication  ...  Acknowledgements This work is partly supported by the SEND project (grant ref. 32R16P00706) funded by ERDF and BEIS.  ... 
arXiv:2004.11794v2 fatcat:2cir7oiwyfevjfw7ymnnonbf5e

Privacy-Preserving Machine Learning: Methods, Challenges and Directions [article]

Runhua Xu, Nathalie Baracaldo, James Joshi
2021 arXiv   pre-print
We discuss the unique characteristics and challenges of PPML and outline possible research directions that leverage as well as benefit multiple research communities such as ML, distributed systems, security  ...  Machine learning (ML) is increasingly being adopted in a wide variety of application domains.  ...  Federated Learning (FL) Architecture The FL [4, 5] is also a distributed machine learning framework with a similar architecture to the distributed selective SGD approach [177] , in which each participant  ... 
arXiv:2108.04417v2 fatcat:pmxmsbs2gvh6nd4jadcz4dnsrq
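
The snippet above compares the FL architecture to distributed selective SGD [177], in which each participant uploads only a selected subset of its gradient coordinates to a shared parameter server. As a rough illustration of that selective-sharing idea only (not the cited paper's exact protocol), here is a numpy sketch; the helper names `select_fraction`/`server_apply` and the sharing fraction `frac` are assumptions for the example.

```python
import numpy as np

def select_fraction(grad, frac=0.1):
    """Sketch of selective gradient sharing: keep only the largest-magnitude
    fraction of coordinates and upload those (index, value) pairs."""
    k = max(1, int(frac * grad.size))
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def server_apply(params, idx, values, lr=0.1):
    """Parameter server applies only the uploaded coordinates."""
    params = params.copy()
    params[idx] -= lr * values
    return params

rng = np.random.default_rng(1)
theta = np.zeros(500)                    # shared model held by the server
for participant in range(3):
    g = rng.normal(size=500)             # stand-in for a participant's local gradient
    idx, vals = select_fraction(g, frac=0.05)
    theta = server_apply(theta, idx, vals)
print("coordinates ever updated:", np.count_nonzero(theta))
```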

Federated Learning: A Signal Processing Perspective [article]

Tomer Gafni, Nir Shlezinger, Kobi Cohen, Yonina C. Eldar, H. Vincent Poor
2021 arXiv   pre-print
We present a formulation for the federated learning paradigm from a signal processing perspective, and survey a set of candidate approaches for tackling its unique challenges.  ...  in the areas of signal processing and communications.  ...  At each federated learning iteration, the edge devices train a local model using their possibly private data, and transmit the updated model to the central server.  ... 
arXiv:2103.17150v2 fatcat:pktgiqowsjbklfnj753ehdbnhu
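
The last fragment above describes the basic federated iteration: edge devices train a local model on their own data and transmit the updated model to a central server for aggregation. Below is a minimal sketch of one such round, assuming FedAvg-style averaging weighted by local data size on a toy linear-regression task; all names and hyperparameters are illustrative, not taken from the survey.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.01, epochs=1):
    """One client's local training: plain SGD on squared loss for a linear model."""
    w = w.copy()
    for _ in range(epochs):
        for i in range(len(y)):
            grad = (X[i] @ w - y[i]) * X[i]
            w -= lr * grad
    return w

def fedavg_round(w_global, clients):
    """One round: each client trains locally; server averages weighted by data size."""
    n_total = sum(len(y) for _, y in clients)
    return sum(len(y) / n_total * local_sgd(w_global, X, y) for X, y in clients)

rng = np.random.default_rng(2)
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(4):                                   # simulate 4 edge devices
    X = rng.normal(size=(30, 3))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=30)))

w = np.zeros(3)
for _ in range(20):
    w = fedavg_round(w, clients)
print("estimated weights:", np.round(w, 2))
```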

Distributed Machine Learning for Wireless Communication Networks: Techniques, Architectures, and Applications [article]

S. Hu, X. Chen, W. Ni, E. Hossain, X. Wang
2020 arXiv   pre-print
Distributed machine learning (DML) techniques, such as federated learning, partitioned learning, and distributed reinforcement learning, have been increasingly applied to wireless communications.  ...  Specifically, we review the latest applications of DML in power control, spectrum management, user association, and edge cloud computing.  ...  In this section, we discuss the approaches to improve computation and communication efficiency, and achieve a balance between computation and communication, with a guaranteed model performance. A.  ... 
arXiv:2012.01489v1 fatcat:pdauhq4xbbepvf26clhpqnc2ci

Federated Learning with Sparsified Model Perturbation: Improving Accuracy under Client-Level Differential Privacy [article]

Rui Hu, Yanmin Gong, Yuanxiong Guo
2022 arXiv   pre-print
privacy and communication efficiency in comparison with traditional centralized machine learning paradigm.  ...  Federated learning (FL) that enables distributed clients to collaboratively learn a shared statistical model while keeping their training data locally has received great attention recently and can improve  ...  [43] propose cpSGD by making modifications to distributed SGD to make the method both private and communication-efficient.  ... 
arXiv:2202.07178v1 fatcat:gaex4h4wu5cjfkcbr7ol5k3ckq
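
This entry studies client-level differential privacy obtained by perturbing a sparsified model update, and cites cpSGD as related work that combines privacy with communication efficiency. The sketch below shows only the general pattern (clip the update, sparsify it with a random mask, add Gaussian noise) and is not the paper's algorithm; the noise scale `sigma`, the keep probability, and the function name are assumptions, and a real deployment would calibrate the noise with a privacy accountant.

```python
import numpy as np

def sparsified_private_update(delta, clip=1.0, keep_prob=0.1, sigma=0.5, rng=None):
    """Illustrative client-level DP upload: clip the model update in L2 norm,
    zero out coordinates with a random mask, and add Gaussian noise to the rest.
    sigma is a placeholder; mapping it to an (epsilon, delta) guarantee requires
    a proper privacy accountant (not shown)."""
    rng = np.random.default_rng() if rng is None else rng
    delta = delta * min(1.0, clip / (np.linalg.norm(delta) + 1e-12))   # L2 clipping
    mask = rng.random(delta.size) < keep_prob                          # random sparsity
    noisy = np.where(mask, delta + rng.normal(0.0, sigma * clip, size=delta.size), 0.0)
    return noisy

rng = np.random.default_rng(3)
update = rng.normal(size=1000)        # stand-in for (local model - global model)
upload = sparsified_private_update(update, rng=rng)
print("coordinates uploaded:", int(np.count_nonzero(upload)))
```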

Advances and Open Problems in Federated Learning [article]

Peter Kairouz, H. Brendan McMahan, Brendan Avent, Aurélien Bellet, Mehdi Bennis, Arjun Nitin Bhagoji, Kallista Bonawitz, Zachary Charles, Graham Cormode, Rachel Cummings, Rafael G.L. D'Oliveira, Hubert Eichner (+47 others)
2021 arXiv   pre-print
approaches.  ...  Federated learning (FL) is a machine learning setting where many clients (e.g. mobile devices or whole organizations) collaboratively train a model under the orchestration of a central server (e.g. service  ...  Acknowledgments The authors would like to thank Alex Ingerman and David Petrou for their useful suggestions and insightful comments during the review process.  ... 
arXiv:1912.04977v3 fatcat:efkbqh4lwfacfeuxpe5pp7mk6a

Trusted AI in Multi-agent Systems: An Overview of Privacy and Security for Distributed Learning [article]

Chuan Ma, Jun Li, Kang Wei, Bo Liu, Ming Ding, Long Yuan, Zhu Han, H. Vincent Poor
2022 arXiv   pre-print
machine learning (ML) and artificial intelligence (AI) that can be processed on distributed UEs.  ...  a centralized ML process into a distributed one, and brings about significant benefits.  ...  Several differentially private machine learning algorithms [157] have been developed in the community, where a trusted data curator is introduced to gather data from individual owners and honestly runs  ... 
arXiv:2202.09027v2 fatcat:hlu7bopcjrc6zjn2pct57utufy

Federated Learning in Mobile Edge Networks: A Comprehensive Survey [article]

Wei Yang Bryan Lim, Nguyen Cong Luong, Dinh Thai Hoang, Yutao Jiao, Ying-Chang Liang, Qiang Yang, Dusit Niyato, Chunyan Miao
2020 arXiv   pre-print
Traditional cloud-based Machine Learning (ML) approaches require the data to be centralized in a cloud server or data center.  ...  However, in a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved.  ...  In particular, at the gradient averaging step of a normal FL participant, a Gaussian distribution is used to approximate the differentially private stochastic gradient descent.  ... 
arXiv:1909.11875v2 fatcat:a2yxlq672needkejenu4j3izyu
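
The last fragment above refers to adding Gaussian noise at the gradient-averaging step, i.e., the standard differentially private SGD recipe of per-example clipping plus the Gaussian mechanism. A minimal sketch on a toy linear model follows; `sigma`, `clip`, and the learning rate are illustrative, and relating them to a concrete (epsilon, delta) budget requires a privacy accountant not shown here.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, sigma=1.0, rng=None):
    """One differentially private SGD step (Gaussian mechanism sketch):
    clip each per-example gradient in L2 norm, sum, add Gaussian noise, average."""
    rng = np.random.default_rng() if rng is None else rng
    total = np.zeros_like(w)
    for i in range(len(y)):
        g = (X[i] @ w - y[i]) * X[i]                       # per-example gradient
        g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # clip to norm <= clip
        total += g
    noisy_mean = (total + rng.normal(0.0, sigma * clip, size=w.shape)) / len(y)
    return w - lr * noisy_mean

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -1.0, 2.0]) + 0.1 * rng.normal(size=200)
w = np.zeros(3)
for _ in range(200):
    w = dp_sgd_step(w, X, y, rng=rng)
print("noisy estimate:", np.round(w, 2))
```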

Stochastic, Distributed and Federated Optimization for Machine Learning [article]

Jakub Konečný
2017 arXiv   pre-print
We propose a communication-efficient framework which iteratively forms local subproblems that can be solved with arbitrary local optimization algorithms.  ...  Second, we study distributed setting, in which the data describing the optimization problem does not fit into a single computing node.  ...  However, in the interesting case when we consider small communication budget, the sparse communication protocols are the most efficient.  ... 
arXiv:1707.01155v1 fatcat:t6uqrmnssrafze6l6c7gk5vcyu

Federated Learning for Internet of Things: A Comprehensive Survey [article]

Dinh C. Nguyen, Ming Ding, Pubudu N. Pathirana, Aruna Seneviratne, Jun Li, H. Vincent Poor
2021 arXiv   pre-print
Federated Learning (FL) has emerged as a distributed collaborative AI approach that can enable many intelligent IoT applications, by allowing for AI training at distributed IoT devices without the need  ...  The important lessons learned from this review of the FL-IoT services and applications are also highlighted.  ...  [174] Communication-efficient FL / HFL / NN / Edge devices / Cloud server: A communication-efficient FL approach called CE-FedAvg for optimizing communication rounds.  ... 
arXiv:2104.07914v1 fatcat:b5wsrfcbynel7jqdxpfw4ftwh4

Federated Learning for Internet of Things: A Comprehensive Survey

Dinh C. Nguyen, Ming Ding, Pubudu N. Pathirana, Aruna Seneviratne, Jun Li, H. Vincent Poor
2021 IEEE Communications Surveys and Tutorials  
Federated Learning (FL) has emerged as a distributed collaborative AI approach that can enable many intelligent IoT applications, by allowing for AI training at distributed IoT devices without the need  ...  The important lessons learned from this review of the FL-IoT services and applications are also highlighted.  ...  -Communication-Efficient FL for Industrial Edge-Based IoT: The authors in [172] consider a communication-efficient FL approach called CE-FedAvg which is able to reduce the number of rounds to convergence  ... 
doi:10.1109/comst.2021.3075439 fatcat:ycq2zydqrzhibfqyo4vzloeoqy

Secure and Robust Machine Learning for Healthcare: A Survey [article]

Adnan Qayyum, Junaid Qadir, Muhammad Bilal, Ala Al-Fuqaha
2020 arXiv   pre-print
Recent years have witnessed widespread adoption of machine learning (ML)/deep learning (DL) techniques due to their superior performance for a variety of healthcare applications ranging from the prediction  ...  In this paper, we present an overview of various application areas in healthcare that leverage such techniques from security and privacy point of view and present associated challenges.  ...  Various approaches for differential privacy have been proposed in the literature, e.g., private aggregation of teacher ensembles (PATE) for private ML [129] , differentially private stochastic gradient  ... 
arXiv:2001.08103v1 fatcat:u6obszbeajcp5asciz5z5unmlq

Secure and Robust Machine Learning for Healthcare: A Survey

Adnan Qayyum, Junaid Qadir, Muhammad Bilal, Ala Al-Fuqaha
2020 IEEE Reviews in Biomedical Engineering  
Recent years have witnessed widespread adoption of machine learning (ML)/deep learning (DL) techniques due to their superior performance for a variety of healthcare applications ranging from the prediction  ...  In this paper, we present an overview of various application areas in healthcare that leverage such techniques from security and privacy point of view and present associated challenges.  ...  Various approaches for differential privacy have been proposed in the literature, e.g., private aggregation of teacher ensembles (PATE) for private ML [156] , differentially private stochastic gradient  ... 
doi:10.1109/rbme.2020.3013489 pmid:32746371 fatcat:wd2flezcjng4jjsn46t24c5yb4
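
Both versions of this survey mention PATE (private aggregation of teacher ensembles) alongside differentially private SGD. As a rough sketch of PATE's noisy aggregation step only (teacher training and the student's semi-supervised learning are omitted), assuming Laplace noise on the per-class vote counts; `gamma` and the vote counts below are illustrative values, not figures from the survey.

```python
import numpy as np

def pate_label(teacher_votes, n_classes, gamma=0.1, rng=None):
    """Sketch of noisy-max aggregation: count teacher votes per class, add
    Laplace noise of scale 1/gamma to each count, and return the argmax.
    gamma trades off privacy and accuracy; the formal analysis is in the PATE paper."""
    rng = np.random.default_rng() if rng is None else rng
    counts = np.bincount(teacher_votes, minlength=n_classes).astype(float)
    counts += rng.laplace(0.0, 1.0 / gamma, size=n_classes)
    return int(np.argmax(counts))

rng = np.random.default_rng(5)
# stand-in: 50 teachers, most of which vote for class 2 out of 4 classes
votes = np.array([2] * 40 + [0] * 5 + [3] * 5)
print("aggregated label:", pate_label(votes, n_classes=4, rng=rng))
```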

The Internet of Federated Things (IoFT): A Vision for the Future and In-depth Survey of Data-driven Approaches for Federated Learning [article]

Raed Kontar, Naichen Shi, Xubo Yue, Seokhyun Chung, Eunshin Byon, Mosharaf Chowdhury, Judy Jin, Wissam Kontar, Neda Masoud, Maher Noueihed, Chinedum E. Okwudire, Garvesh Raskutti (+3 others)
2021 arXiv   pre-print
Specifically, we first introduce the defining characteristics of IoFT and discuss FL data-driven approaches, opportunities, and challenges that allow decentralized inference within three dimensions: (i  ...  ) a global model that maximizes utility across all IoT devices, (ii) a personalized model that borrows strengths across all devices yet retains its own model, (iii) a meta-learning model that quickly adapts  ...  Meta-learning with latent embedding optimization; communication-efficient distributed optimization.  ... 
arXiv:2111.05326v1 fatcat:bbgdhtuqcrhstgakt2vxuve2ca