Q-GADMM: Quantized Group ADMM for Communication Efficient Decentralized Machine Learning
[article]
2020
arXiv
pre-print
In this article, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM). ...
However, due to the lack of a centralized entity in decentralized ML, spatial sparsity and payload compression may incur error propagation, hindering model training convergence. ...
Exploiting the sheer amount of these user-generated private data is instrumental in training high-accuracy machine learning (ML) models in various domains, ranging from medical diagnosis and disaster/epidemic ...
arXiv:1910.10453v6
fatcat:bgyuvk367beibpu47bkfr3zlf4
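The Q-GADMM snippet above hinges on quantizing model updates before exchanging them with neighbors. As a rough illustration only (not the paper's exact quantizer), the sketch below applies unbiased stochastic uniform quantization to a model-difference vector; the 4-bit width and helper names are assumptions.

```python
import numpy as np

def stochastic_quantize(delta, num_bits=4):
    """Stochastically quantize a model-difference vector to num_bits per entry.

    Returns the dequantized vector; in a decentralized scheme only the integer
    codes, scale, and offset would actually be transmitted to neighbors.
    """
    lo, hi = delta.min(), delta.max()
    levels = 2 ** num_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    # Map to [0, levels] and round up or down at random so the quantizer is unbiased.
    normalized = (delta - lo) / scale
    floor = np.floor(normalized)
    prob_up = normalized - floor
    codes = floor + (np.random.rand(*delta.shape) < prob_up)
    return lo + codes * scale

# Toy usage: quantize the change in a model between two consensus rounds.
rng = np.random.default_rng(0)
theta_old = rng.normal(size=1000)
theta_new = theta_old + 0.01 * rng.normal(size=1000)
q_delta = stochastic_quantize(theta_new - theta_old, num_bits=4)
print("mean quantization error:", np.abs(q_delta - (theta_new - theta_old)).mean())
```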
Overmind: A Collaborative Decentralized Machine Learning Framework
2020
Advances in Science, Technology and Engineering Systems
data and associated attributes for assigning machine learning tasks in a collaborative decentralized manner. ...
This paper introduces "Overmind", a solution that governs and builds the network of decentralized machine learning as a prediction framework named after its functionality: it aims to discover a set of ...
Overmind: A Framework for Decentralized Machine Learning. Overmind, a collaborative decentralized machine learning framework, is proposed to deal with anonymous datasets before building a machine ...
doi:10.25046/aj050634
fatcat:gkuh6hqzqvdjpi4uu3s7fr2ufq
Swarm Learning for decentralized and confidential clinical machine learning
2021
Nature
Here, to facilitate the integration of any medical data from any data owner worldwide without violating privacy laws, we introduce Swarm Learning—a decentralized machine-learning approach that unites edge ...
Patients with leukaemia can be identified using machine learning on the basis of their blood transcriptomes. ...
The SLL is a framework to enable decentralized training of machine learning models without sharing the data. ...
doi:10.1038/s41586-021-03583-3
pmid:34040261
fatcat:5ule2vsgbngltmi6b7ubr24yga
On the Privacy of Decentralized Machine Learning
[article]
2022
arXiv
pre-print
In this work, we carry out the first, in-depth, privacy analysis of Decentralized Learning -- a collaborative machine learning framework aimed at circumventing the main limitations of federated learning ...
We demonstrate that, contrary to what is claimed by proponents of decentralized learning, decentralized learning does not offer any security advantages over more practical approaches such as federated learning ...
Decentralized machine learning, also known as fully decentralized machine learning, peer-to-peer machine learning, or gossip learning, aims to address these limitations by performing the optimization learning ...
arXiv:2205.08443v1
fatcat:we2manartzgsbgzt3qc54jmpa4
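The last snippet equates decentralized learning with peer-to-peer or gossip learning, where nodes average models directly with their neighbors instead of through a server. A minimal sketch of one synchronous gossip-averaging round on an assumed ring topology with uniform mixing weights:

```python
import numpy as np

def gossip_round(models, neighbors):
    """One synchronous gossip round: each node averages its model with its neighbors'.

    models:    list of parameter vectors, one per node
    neighbors: dict mapping node id -> list of neighbor ids
    """
    new_models = []
    for i, theta in enumerate(models):
        group = [theta] + [models[j] for j in neighbors[i]]
        new_models.append(np.mean(group, axis=0))  # uniform mixing weights
    return new_models

# Toy usage: 5 nodes on a ring converge toward the global average.
n = 5
models = [np.full(3, float(i)) for i in range(n)]
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
for _ in range(20):
    models = gossip_round(models, ring)
print(models[0])  # close to [2., 2., 2.], the average of 0..4
```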
Decentralized Multi-Task Learning Based on Extreme Learning Machines
[article]
2019
arXiv
pre-print
Because many data sets of different tasks are geo-distributed, decentralized machine learning is studied. ...
To exploit the high learning speed of extreme learning machines (ELMs), we apply the ELM framework to the MTL problem, where the output weights of ELMs for all the tasks are learned collaboratively. ...
DECENTRALIZED MULTI-TASK LEARNING WITH ELM
A. Motivation and basics. In many real-world applications, the data of different tasks may be geo-distributed over different machines. ...
arXiv:1904.11366v1
fatcat:2z3u5d7cynam3k3dliygx72xlm
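The ELM framework referenced above fixes a random hidden layer and solves only the output weights in closed form, which is what makes collaborative output-weight learning tractable. Below is a minimal single-task ELM fit via ridge regression; the hidden width, regularizer, and tanh activation are illustrative choices, and the paper's multi-task coupling is not reproduced.

```python
import numpy as np

def elm_fit(X, y, hidden=200, reg=1e-2, seed=0):
    """Fit an extreme learning machine: random hidden layer, ridge-solved output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # random input weights (kept fixed)
    b = rng.normal(size=hidden)                 # random biases
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    # Output weights solve the regularized least-squares problem in closed form.
    beta = np.linalg.solve(H.T @ H + reg * np.eye(hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: regress a noisy sine.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
W, b, beta = elm_fit(X, y)
print("train MSE:", np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```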
OmniLytics: A Blockchain-based Secure Data Market for Decentralized Machine Learning
[article]
2021
arXiv
pre-print
We propose OmniLytics, a blockchain-based secure data trading marketplace for machine learning applications. ...
Conclusion. We develop OmniLytics, the first Ethereum smart contract implementation of a secure data market for decentralized machine learning. ...
Secure Data Market for Decentralized Machine Learning. We consider a network of many compute nodes (e.g., mobile devices like smartphones, or institutions like hospitals, banks, and companies), each of ...
arXiv:2107.05252v4
fatcat:u2uaa4fbrvdb3jqbipttcmtvq4
The Non-IID Data Quagmire of Decentralized Machine Learning
[article]
2020
arXiv
pre-print
Many large-scale machine learning (ML) applications need to perform decentralized learning over datasets generated at different devices and locations. ...
Such datasets pose a significant challenge to decentralized learning because their different contexts result in significant data distribution skew across devices/locations. ...
Decentralized learning. ...
arXiv:1910.00189v2
fatcat:vtlj6cznunavrk57swfedpqdgm
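The data-distribution skew described above is commonly simulated in decentralized-learning experiments with a Dirichlet label partition: a small concentration parameter gives each device only a few classes. The sketch below is a generic version of that recipe, not the paper's exact setup.

```python
import numpy as np

def dirichlet_label_partition(labels, num_clients=10, alpha=0.1, seed=0):
    """Split sample indices across clients with label-distribution skew.

    Smaller alpha -> more skew (each client sees few classes); larger alpha -> closer to IID.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        # Draw per-client proportions for this class and cut the index list accordingly.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, cuts)):
            client_indices[client].extend(part.tolist())
    return client_indices

# Toy usage: 10 classes, 10 clients, strong skew.
labels = np.repeat(np.arange(10), 100)
parts = dirichlet_label_partition(labels, num_clients=10, alpha=0.1)
print([len(p) for p in parts])
```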
Stochastic Distributed Optimization for Machine Learning from Decentralized Features
[article]
2019
arXiv
pre-print
We propose an asynchronous stochastic gradient descent (SGD) algorithm for such a feature distributed machine learning (FDML) problem, to jointly learn from decentralized features, with theoretical convergence ...
We study distributed machine learning from another perspective, where the information about the same training samples is inherently decentralized and located on different parties. ...
RELATED WORK. Distributed Machine Learning. ...
arXiv:1812.06415v2
fatcat:r2co4bpg4vfk3mbybe2khcl4gy
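In the feature-distributed setting above, parties hold different feature blocks of the same samples and combine only partial prediction scores, never raw features. A minimal synchronous sketch with two parties and a linear model (the paper's algorithm is asynchronous; the split sizes and learning rate here are assumptions):

```python
import numpy as np

# Feature-distributed learning sketch: two parties hold disjoint feature blocks
# of the same samples; the prediction is the sum of their local linear scores.
rng = np.random.default_rng(0)
n, d1, d2 = 1000, 5, 3
X1, X2 = rng.normal(size=(n, d1)), rng.normal(size=(n, d2))   # party 1 and party 2 features
true_w = rng.normal(size=d1 + d2)
y = np.concatenate([X1, X2], axis=1) @ true_w + 0.1 * rng.normal(size=n)

w1, w2, lr = np.zeros(d1), np.zeros(d2), 0.05
for _ in range(2000):
    i = rng.integers(n)                # sample one example (plain SGD for simplicity)
    pred = X1[i] @ w1 + X2[i] @ w2     # only partial scores are combined, not raw features
    err = pred - y[i]
    w1 -= lr * err * X1[i]             # each party updates only its own weights
    w2 -= lr * err * X2[i]

print("distance from true weights:", np.linalg.norm(np.concatenate([w1, w2]) - true_w))
```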
Communication-efficient Decentralized Machine Learning over Heterogeneous Networks
[article]
2020
arXiv
pre-print
In the last few years, distributed machine learning has usually been executed over heterogeneous networks such as a local area network within a multi-tenant cluster or a wide area network connecting data ...
In these heterogeneous networks, the link speeds among worker nodes vary significantly, making it challenging for state-of-the-art machine learning approaches to perform efficient training. ...
INTRODUCTION. Recently, distributed machine learning has become increasingly popular. ...
arXiv:2009.05766v2
fatcat:jpe64nhtwjeaff636kojlxlvla
Adversary-resilient Distributed and Decentralized Statistical Inference and Machine Learning
[article]
2020
arXiv
pre-print
While the last few decades have witnessed a huge body of work devoted to inference and learning in distributed and decentralized setups, much of this work assumes a non-adversarial setting in which individual ...
As a result, we now have a plethora of algorithmic approaches that guarantee robustness of distributed and/or decentralized inference and learning under different adversarial threat models. ...
Decentralized Machine Learning. Decentralized machine learning algorithms, which can be considered a combination of consensus and distributed learning frameworks, approximately solve (2) by minimizing a ...
arXiv:1908.08649v2
fatcat:de356dvwinfv5g5njo64qmzpvi
Decentralized machine learning using compressed push-pull averaging
2020
Proceedings of the 1st International Workshop on Distributed Infrastructure for Common Good
For decentralized learning algorithms, communication efficiency is a central issue. On the one hand, good machine learning models require more and more parameters. ...
Here, we propose a novel compression mechanism for P2P machine learning that is based on the application of stateful codecs over P2P links. ...
For example, when it is used as part of a decentralized machine learning platform that runs different learning tasks continuously. Only the second 24 hours are shown in the plots. ...
doi:10.1145/3428662.3428792
fatcat:docwsqcrmfgjzdohq7pgnbrthm
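A stateful codec over a P2P link keeps shared per-link state so that only a small difference has to be transmitted each round. The sketch below is a generic delta codec with uniform quantization, intended only to illustrate the idea of per-link state; it is not the compression mechanism proposed in the paper, and the step size is an assumption.

```python
import numpy as np

class StatefulDeltaCodec:
    """Per-link stateful compressor: transmit a coarsely quantized difference from
    the last reconstructed model, so both endpoints keep identical link state."""

    def __init__(self, dim, step=0.05):
        self.state = np.zeros(dim)   # last model both ends agree on
        self.step = step             # quantization step size

    def encode(self, model):
        codes = np.round((model - self.state) / self.step).astype(np.int16)
        self.state = self.state + codes * self.step   # sender mirrors the receiver's update
        return codes

    def decode(self, codes):
        self.state = self.state + codes * self.step
        return self.state.copy()

# Toy usage: one directed P2P link; sender and receiver keep matching codecs.
rng = np.random.default_rng(0)
tx, rx = StatefulDeltaCodec(4), StatefulDeltaCodec(4)
model = np.zeros(4)
for _ in range(10):
    model = model + 0.1 * rng.normal(size=4)    # model drifts a little each round
    recovered = rx.decode(tx.encode(model))
print("max per-entry error:", np.abs(recovered - model).max())  # bounded by step / 2
```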
Consensus-Based Transfer Linear Support Vector Machines for Decentralized Multi-Task Multi-Agent Learning
[article]
2018
arXiv
pre-print
Transfer learning has been developed to improve the performance of different but related tasks in machine learning. ...
We propose a consensus-based distributed transfer learning framework, where several tasks aim to find the best linear support vector machine (SVM) classifiers in a distributed network. ...
The proposed framework is a generalization of both the centralized transfer learning scheme and distributed machine learning. ...
arXiv:1706.05039v2
fatcat:nznequoyhrad3a7s3dfyoiltfe
Learning to Act in Decentralized Partially Observable MDPs
2018
International Conference on Machine Learning
We address a long-standing open problem of reinforcement learning in decentralized partially observable Markov decision processes. ...
Experiments show our approach can learn to act near-optimally in many finite domains from the literature. ...
Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, PMLR 80, 2018. Copyright 2018 by the author(s). ...
dblp:conf/icml/DibangoyeB18
fatcat:b7wvrytlqjdrhpehyxewkgnziq
Drynx: Decentralized, Secure, Verifiable System for Statistical Queries and Machine Learning on Distributed Datasets
[article]
2020
arXiv
pre-print
Drynx relies on a set of computing nodes to enable the computation of statistics such as standard deviation or extrema, and the training and evaluation of machine-learning models on sensitive and distributed ...
In this paper, we propose Drynx, a decentralized system for privacy-conscious statistical analysis on distributed datasets. ...
evaluate machine-learning models on data hosted at different sources, i.e., on distributed datasets. ...
arXiv:1902.03785v3
fatcat:jhs2wwxf3jgpxnc7hio7c5xcf4
An Improved Analysis of Gradient Tracking for Decentralized Machine Learning
[article]
2022
arXiv
pre-print
We consider decentralized machine learning over a network where the training data is distributed across n agents, each of which can compute stochastic model updates on their local data. ...
[24] on decentralized stochastic gradient descent (D-SGD) has spurred research on decentralized training methods for machine learning models. ...
Introduction. Methods that train machine learning models on decentralized data offer many advantages over traditional centralized approaches in core aspects such as data ownership, privacy, fault tolerance ...
arXiv:2202.03836v1
fatcat:p3oterj35vhpdkshtdodx45udu
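Gradient tracking augments decentralized SGD with a per-node tracker that follows the average gradient, removing the bias plain D-SGD suffers under heterogeneous local objectives. A minimal sketch of the standard recursion on toy quadratics, with an assumed doubly stochastic ring mixing matrix:

```python
import numpy as np

# Gradient tracking sketch: each node keeps a model x_i and a tracker y_i that
# estimates the global average gradient. Mixing matrix and objectives are toy choices.
n, dim, lr = 4, 2, 0.1
centers = np.array([[0., 0.], [2., 0.], [0., 2.], [2., 2.]])
grad = lambda x: 2 * (x - centers)            # per-node gradients of local quadratics

W = np.array([[.50, .25, .00, .25],
              [.25, .50, .25, .00],
              [.00, .25, .50, .25],
              [.25, .00, .25, .50]])          # doubly stochastic ring topology

x = np.zeros((n, dim))
y = grad(x)                                   # trackers initialized to local gradients
for _ in range(100):
    x_next = W @ x - lr * y                   # consensus step along the tracked direction
    y = W @ y + grad(x_next) - grad(x)        # update tracker with the change in local gradients
    x = x_next
print(x)  # every node approaches [1., 1.], the minimizer of the average objective
```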
Showing results 1 — 15 out of 58,231 results