2,407 Hits in 4.1 sec

Generative Models for Effective ML on Private, Decentralized Datasets [article]

Sean Augenstein, H. Brendan McMahan, Daniel Ramage, Swaroop Ramaswamy, Peter Kairouz, Mingqing Chen, Rajiv Mathews, Blaise Aguera y Arcas
2020 arXiv   pre-print
This paper demonstrates that generative models - trained using federated methods and with formal differential privacy guarantees - can be used effectively to debug many commonly occurring data issues even  ...  We explore these methods in applications to text with differentially private federated RNNs and to images using a novel algorithm for differentially private federated GANs.  ...  These generated results provide actual information on the tokenized words being fed to the model from the decentralized dataset.  ... 
arXiv:1911.06679v2 fatcat:qdupc7zyh5gwpgu5yj2fim2kdu

Towards Collaborative Intelligence: Routability Estimation based on Decentralized Private Data [article]

Jingyu Pan, Chen-Chia Chang, Zhiyao Xie, Ang Li, Minxue Tang, Tunhou Zhang, Jiang Hu, Yiran Chen
2022 arXiv   pre-print
Although one can commission ML model training to a design company, the data of a single company might still be inadequate or biased, especially for small companies.  ...  To further strengthen the results, we co-design a customized ML model, FLNet, and its personalization under the decentralized training scenario.  ...  Training Method Evaluation Table 3 shows the accuracy of FLNet, using various model training algorithms based on decentralized private data.  ... 
arXiv:2203.16009v1 fatcat:cpk6aodmtnhrtc6ciwiegenodi

Federated Learning for Wireless Communications: Motivation, Opportunities and Challenges [article]

Solmaz Niknam, Harpreet S. Dhillon, Jeffery H. Reed
2020 arXiv   pre-print
As a result, decentralized ML approaches that keep the data where it is generated are much more appealing.  ...  for future research on federated learning in the context of wireless communications.  ...  It utilizes the on-device processing power and untapped private data by performing the model training in a decentralized manner and keeping the data where it is generated.  ... 
arXiv:1908.06847v4 fatcat:plfaupfexzd5bb3o72f3z5kskm

Survey on the Convergence of Machine Learning and Blockchain [article]

Shengwen Ding, Chenhui Hu
2022 arXiv   pre-print
For instance, training of traditional ML models is limited by the access of data sets, which are generally proprietary; published ML models may soon be out of date without an update of new data and continuous  ...  Machine learning (ML) has been pervasively researched nowadays and has been applied in many aspects of real life. Nevertheless, issues of model and data still accompany the development of ML.  ...  A more complete marketplace for both ML model and data sharing is built on the architecture of DInEMMo [14], a convergence of decentralized AI and blockchain.  ... 
arXiv:2201.00976v2 fatcat:exenjf2dmzfqfbnfpx3c3v5bmi

BEAS: Blockchain Enabled Asynchronous Secure Federated Machine Learning [article]

Arup Mondal, Harpreet Virk, Debayan Gupta
2022 arXiv   pre-print
Federated Learning (FL) enables multiple parties to distributively train an ML model without revealing their private datasets.  ...  We perform extensive experiments on multiple datasets with promising results: BEAS successfully prevents privacy leakage from dataset reconstruction attacks, and minimizes the efficacy of poisoning attacks  ...  Definition 0.1 (Differential Privacy (Dwork 2006)) A randomized function K gives ε-differential privacy if for all model gradients G_1 and G_2, generated by training on datasets D_1 and D_2 differing  ... 
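The definition quoted in this snippet is the standard ε-differential privacy guarantee of Dwork (2006). Written out in full: a randomized function K is ε-differentially private if, for all datasets D_1 and D_2 differing in a single record and all sets of outcomes S,

```latex
\Pr[\,\mathcal{K}(D_1) \in S\,] \;\le\; e^{\varepsilon} \cdot \Pr[\,\mathcal{K}(D_2) \in S\,]
```

Smaller ε makes the two output distributions harder to distinguish, so any single participant's data has correspondingly little influence on the released gradients.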
arXiv:2202.02817v1 fatcat:lzwiv3bysrgyvmff2tqxxmm4lm

Jointly Learning from Decentralized (Federated) and Centralized Data to Mitigate Distribution Shift [article]

Sean Augenstein, Andrew Hard, Kurt Partridge, Rajiv Mathews
2021 arXiv   pre-print
models while still meeting the private training data access constraints imposed by FL.  ...  By mixing decentralized (federated) and centralized (datacenter) data, we can form an effective training data distribution that better matches the inference data distribution, resulting in more useful  ...  Two tools for doing this, which could be combined with mixed learning, are federated analytics [Ramage and Mazzocchi, 2020] and federated private generative models [Augenstein et al., 2019] .  ... 
arXiv:2111.12150v1 fatcat:f2sdddzqprff7eplutwucjpuam

Blockchain-Enabled: Multi-Layered Security Federated Learning Platform for Preserving Data Privacy

Zeba Mahmood, Vacius Jusas
2022 Electronics  
Providing local but private data to the server and using ML apps, performing ML operations on the devices without benefiting from other users' data, and preventing direct access to raw data and local training of ML models.  ...  Users are more likely to choose the latter option: performing ML operations on devices without benefiting from other users' data, rather than providing local but private data to the server.  ... 
doi:10.3390/electronics11101624 fatcat:inuotik6uzcqvnremxqbmqca5m

FairVFL: A Fair Vertical Federated Learning Framework with Contrastive Adversarial Learning [article]

Tao Qi, Fangzhao Wu, Chuhan Wu, Lingjuan Lyu, Tong Xu, Zhongliang Yang, Yongfeng Huang, Xing Xie
2022 arXiv   pre-print
Experiments on two real-world datasets validate that our method can effectively improve model fairness with user privacy well-protected.  ...  However, existing fair ML methods usually rely on the centralized storage of fairness-sensitive features to achieve model fairness, which are usually inapplicable in federated scenarios.  ...  Thus, to compare their effectiveness, we combine them with several basic ML models on the two datasets.  ... 
arXiv:2206.03200v1 fatcat:ox6doodd2vgkzgwxuc7nfpbbam

Differential Privacy: What is all the noise about? [article]

Roxana Danger
2022 arXiv   pre-print
DP has been actively researched during the last 15 years, but it is still hard to master for many Machine Learning (ML) practitioners.  ...  This paper aims to provide an overview of the most important ideas, concepts and uses of DP in ML, with special focus on its intersection with Federated Learning (FL).  ...  Let θ_T be the trained parameters of P_uM. This algorithm showed another direction in private ML; instead of creating differentially private models, we could produce differentially private datasets that  ... 
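As a concrete illustration of the "noise" in this paper's title, here is a minimal sketch of the Laplace mechanism, the classic way to answer a counting query with ε-differential privacy (the function names and the toy query are my own, not taken from the paper):

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via an inverse-CDF transform of a
    # single uniform draw on (-0.5, 0.5).
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one record
    # changes the count by at most 1, so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy.
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Lower ε injects more noise into the answer, which is exactly the privacy/utility trade-off the survey discusses.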
arXiv:2205.09453v1 fatcat:5z3nqsh7qbbwfhbrc6hmzt43ya

SCOTCH: An Efficient Secure Computation Framework for Secure Aggregation [article]

Yash More, Prashanthi Ramachandran, Priyam Panda, Arup Mondal, Harpreet Virk, Debayan Gupta
2022 arXiv   pre-print
Federated learning enables multiple data owners to jointly train a machine learning model without revealing their private datasets.  ...  To mitigate this centralization of power, we propose SCOTCH, a decentralized m-party secure-computation framework for federated aggregation that deploys MPC primitives, such as secret sharing.  ...  Hence, there arises a need for a secure, decentralized FL framework that protects user privacy, while allowing seamless training of ML models.  ... 
arXiv:2201.07730v2 fatcat:ldgyvtvu65duzfaqsjssvvul6e

Distributed Machine Learning for Wireless Communication Networks: Techniques, Architectures, and Applications [article]

S. Hu, X. Chen, W. Ni, E. Hossain, X. Wang
2020 arXiv   pre-print
This survey bridges the gap by providing a contemporary and comprehensive survey of DML techniques with a focus on wireless networks.  ...  There is a clear gap in the existing literature in that the DML techniques are yet to be systematically reviewed for their applicability to wireless systems.  ...  A tremendous amount of effort has been devoted to decentralizing ML models, for example, FL.  ... 
arXiv:2012.01489v1 fatcat:pdauhq4xbbepvf26clhpqnc2ci

FedGNN: Federated Graph Neural Network for Privacy-Preserving Recommendation [article]

Chuhan Wu, Fangzhao Wu, Yang Cao, Yongfeng Huang, Xing Xie
2021 arXiv   pre-print
Extensive experiments on six benchmark datasets validate that our approach can achieve competitive results with existing centralized GNN-based recommendation methods and meanwhile effectively protect user  ...  In this paper, we propose a federated framework for privacy-preserving GNN-based recommendation, which can collectively train GNN models from decentralized user data and meanwhile exploit high-order user-item  ...  Model Effectiveness Then, we validate the effectiveness of incorporating high-order information of the user-item graphs as well as the generality of our approach.  ... 
arXiv:2102.04925v2 fatcat:cucnmoawcfhtvkz4gtcd6ovdoi

Communication Efficiency in Federated Learning: Achievements and Challenges [article]

Osama Shahid, Seyedamin Pouriyeh, Reza M. Parizi, Quan Z. Sheng, Gautam Srivastava, Liang Zhao
2021 arXiv   pre-print
IoT devices or mobile phones; (2) the model is then downloaded by the devices and trained locally on-device on the private dataset generated by those devices; (3) the ML model is uploaded back to  ...  has their private dataset.  ... 
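The numbered steps in this snippet are the federated averaging loop. A minimal scalar sketch (a toy model with names of my own choosing, not the paper's code) looks like:

```python
def local_update(w, data, lr=0.1):
    # Step (2): one gradient step on the device's private data, here
    # minimizing the squared distance of a scalar weight to each sample.
    grad = sum(2.0 * (w - x) for x in data) / len(data)
    return w - lr * grad

def federated_round(global_w, client_datasets):
    # Steps (3)-(4): clients upload their locally trained weights and the
    # server averages them into the next global model (equal weighting
    # here, for simplicity).
    updates = [local_update(global_w, d) for d in client_datasets]
    return sum(updates) / len(updates)
```

Only model weights cross the network; the raw data never leaves the device. Repeating rounds drives the global weight toward the minimizer of the combined objective, which is why communication cost per round is the bottleneck this paper surveys.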
arXiv:2107.10996v1 fatcat:7cyelxjnbjhczm27gknbwcmlyy

Decentralized Word2Vec Using Gossip Learning

Abdul Aziz Alkathiri, Lodovico Giaretta, Sarunas Girdzijauskas, Magnus Sahlgren
2021 Zenodo  
Therefore, for this specific scenario, we investigate how gossip learning, a massively-parallel, data-private, decentralized protocol, compares to a shared-dataset solution.  ...  It is useful then for a few large public and private organizations to join their corpora during training.  ...  Word2Vec could be implemented on top of gossip learning, a massively-parallel, decentralized, data-private framework.  ... 
doi:10.5281/zenodo.4679361 fatcat:6gmu7zdi6bgnle42bkzmurz7xu

Customized Video QoE Estimation with Algorithm-Agnostic Transfer Learning [article]

Selim Ickin and Markus Fiedler and Konstantinos Vandikas
2020 arXiv   pre-print
In this paper, we present a transfer learning-based ML model training approach, which allows decentralized local models to share generic indicators on MOS to learn a generic base model, and then customize  ...  This makes a decentralized learning-based framework appealing for sharing and aggregating learned knowledge between the local models that map the obtained metrics to the user QoE, such as Mean Opinion  ...  From a privacy perspective, techniques such as differential privacy [6] and secure aggregation [1] can be utilized for sharing private information for the purposes of training ML models without revealing  ... 
arXiv:2003.08730v1 fatcat:pxv6fqmwwbeljgaxprka7waqpa
Showing results 1 — 15 out of 2,407 results