
Federated Neural Architecture Search [article]

Mengwei Xu, Yuxin Zhao, Kaigui Bian, Gang Huang, Qiaozhu Mei, Xuanzhe Liu
2020 arXiv   pre-print
However, training over decentralized data makes the design of neural architectures, already a difficult task, even harder.  ...  In this work, we bring automatic neural architecture search into decentralized training as a new DNN training paradigm called Federated Neural Architecture Search (federated NAS).  ...  To address this major challenge, this paper proposes a new paradigm for DNN training that enables automatic neural architecture search (NAS) on decentralized data, called federated NAS.  ...
arXiv:2002.06352v4 fatcat:wmnsezxtc5hmnaj2tjqx4ii3iu

Asynchronous Decentralized Learning of a Neural Network [article]

Xinyue Liang, Alireza M. Javid, Mikael Skoglund, Saikat Chatterjee
2020 arXiv   pre-print
In this work, we exploit an asynchronous computing framework, namely ARock, to learn a deep neural network called the self-size estimating feedforward neural network (SSFN) in a decentralized scenario.  ...  Using this algorithm, namely asynchronous decentralized SSFN (dSSFN), we provide the centralized-equivalent solution under certain technical assumptions.  ...  Gradient descent for a neural network can be realized in a decentralized setup where the training data is distributed over nodes.  ...
arXiv:2004.05082v1 fatcat:hwosehepqnekvdfsncn2c2ltey
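
A minimal sketch of the decentralized gradient-descent idea mentioned in the snippet above, using gossip averaging over a ring of nodes; the linear least-squares model, ring topology, step size, and variable names are illustrative stand-ins, not the paper's dSSFN/ARock algorithm:

import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 4, 5
X = [rng.normal(size=(32, dim)) for _ in range(n_nodes)]       # local data shards
y = [x @ np.ones(dim) + 0.1 * rng.normal(size=32) for x in X]  # local targets
w = [np.zeros(dim) for _ in range(n_nodes)]                    # per-node parameters

# Ring topology: each node only talks to its two neighbours.
nbrs = {i: [(i - 1) % n_nodes, (i + 1) % n_nodes] for i in range(n_nodes)}

for step in range(200):
    # 1) local gradient step on each node's own shard (least-squares loss)
    grads = [2 * X[i].T @ (X[i] @ w[i] - y[i]) / len(y[i]) for i in range(n_nodes)]
    w = [w[i] - 0.05 * grads[i] for i in range(n_nodes)]
    # 2) gossip step: average parameters with neighbours (no central server)
    w = [np.mean([w[i]] + [w[j] for j in nbrs[i]], axis=0) for i in range(n_nodes)]

print("node-0 weights (should approach all-ones):", np.round(w[0], 2))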

A Low Complexity Decentralized Neural Net with Centralized Equivalence using Layer-wise Learning [article]

Xinyue Liang, Alireza M. Javid, Mikael Skoglund, Saikat Chatterjee
2020 arXiv   pre-print
We design a low-complexity decentralized learning algorithm to train a recently proposed large neural network on distributed processing nodes (workers).  ...  We show that it is possible to achieve learning performance equivalent to having all the data available in a single place.  ...  Comparison with decentralized gradient search: We now present a comparison with distributed gradient descent for neural networks.  ...
arXiv:2009.13982v1 fatcat:zg2p7vyunvbo3g4ekwfhzzqcqy

Graph Neural Networks for Decentralized Multi-Robot Submodular Action Selection [article]

Lifeng Zhou, Vishnu D. Sharma, Qingbiao Li, Amanda Prorok, Alejandro Ribeiro, Vijay Kumar
2021 arXiv   pre-print
In particular, our learning architecture leverages a graph neural network (GNN) to capture local interactions of the robots and learns decentralized decision-making for the robots.  ...  In this work, we propose a general-purpose learning architecture towards submodular maximization at scale, with decentralized communications.  ...  Thus, the training set D can be constructed as a collection of these data, i.e., D := {({Z_i}_{i∈V}, G, U^g)}.  ...
arXiv:2105.08601v2 fatcat:vqr5jvrou5e3jbase4uwzfresy
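
A hedged sketch of the GNN idea in the snippet above: each robot mixes its own features Z_i with the mean of its neighbours' features through shared weights, so the decision rule can run in a decentralized way. The ring communication graph, layer sizes, and random weights are made-up placeholders, not the authors' architecture:

import numpy as np

rng = np.random.default_rng(1)
n_robots, feat_dim, hid_dim = 5, 8, 16
A = np.zeros((n_robots, n_robots))          # communication graph (a ring)
for i in range(n_robots):
    A[i, (i + 1) % n_robots] = A[i, (i - 1) % n_robots] = 1.0

Z = rng.normal(size=(n_robots, feat_dim))   # local observations Z_i
W_self = 0.1 * rng.normal(size=(feat_dim, hid_dim))
W_nbr = 0.1 * rng.normal(size=(feat_dim, hid_dim))

# One round of message passing: each robot combines its own features with the
# mean of its neighbours' features using weights shared by every robot.
deg = A.sum(axis=1, keepdims=True)
H = np.tanh(Z @ W_self + (A @ Z / deg) @ W_nbr)   # shape (n_robots, hid_dim)
scores = H.sum(axis=1)                            # per-robot action score
print("per-robot scores:", np.round(scores, 2))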

Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts [article]

Max Ryabinin, Anton Gusev
2020 arXiv   pre-print
We analyze the performance, reliability, and architectural constraints of this paradigm and compare it against existing distributed training techniques.  ...  For instance, the cluster used to train GPT-3 costs over $250 million. As a result, most researchers cannot afford to train state-of-the-art models and contribute to their development.  ...  Selecting the best experts is then reduced to a beam search over this tree, which scales logarithmically in the number of experts. A more recent study by Lample et al.  ...
arXiv:2002.04013v3 fatcat:c5u36uzdenannmgtj6m3bpq2sm
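
A hedged sketch of beam search over a tree of experts, the selection step described in the snippet above; the binary bit-string addressing, random key vectors, and dot-product scoring are assumptions for illustration, not the authors' exact gating:

import numpy as np

rng = np.random.default_rng(2)
depth, beam_width, dim = 4, 2, 8            # 2**depth = 16 experts at the leaves

# Every tree node is addressed by its bit-string path and carries a random key.
keys = {}
frontier = [""]
for _ in range(depth):
    frontier = [path + bit for path in frontier for bit in "01"]
    for path in frontier:
        keys[path] = rng.normal(size=dim)

def select_experts(query, beam_width):
    # Root-to-leaf beam search: cost O(depth * beam_width), logarithmic in #experts.
    beam = [("", 0.0)]                      # (path, cumulative score)
    for _ in range(depth):
        cands = [(p + bit, s + float(query @ keys[p + bit]))
                 for p, s in beam for bit in "01"]
        beam = sorted(cands, key=lambda c: -c[1])[:beam_width]
    return beam                             # best leaf paths = chosen experts

query = rng.normal(size=dim)
print(select_experts(query, beam_width))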

Differentially-private Federated Neural Architecture Search [article]

Ishika Singh, Haoyi Zhou, Kunlin Yang, Meng Ding, Bill Lin, Pengtao Xie
2020 arXiv   pre-print
In many application scenarios, several parties would like to collaboratively search for a shared neural architecture by leveraging data from all parties.  ...  Neural architecture search, which aims to automatically search for architectures (e.g., convolution, max pooling) of neural networks that maximize validation performance, has achieved remarkable progress  ...  search of neural architectures.  ... 
arXiv:2006.10559v2 fatcat:7xpjs4bc6rf5nj6ljle4mnzdpm

Using neural search approach for resource discovery in P2P networks

Hesam Yousefipour, Zahra Jafari
2011 Procedia Computer Science  
In this paper, we describe a method that benefits from the flexibility of neural networks and not only takes into account most of the measures previously used to optimize the search process, but also  ...  Previously, to search for a requested resource, algorithms were devised that exploited local knowledge about the network.  ...  Related Work: P2P networks are, in essence, categorized into three main architectures: centralized, decentralized but structured, and decentralized and unstructured.  ...
doi:10.1016/j.procs.2011.01.040 fatcat:fspqxjbdynh5fa3iobpjor6f2u

Blockchain and Artificial Intelligence [article]

Tshilidzi Marwala, Bo Xing
2018 arXiv   pre-print
However, a common misunderstanding about the blockchain concept in particular is that a blockchain is decentralized and not controlled by anyone.  ...  Take smart contracts as an example: a smart contract is essentially a collection of code (functions) and data (state) that is programmed and deployed on a blockchain (say, Ethereum) by different human programmers  ...  Since roughly half of the total funding spent on software projects goes to software testing [8], it is not surprising that over half of SBSE publications are also dedicated to search-based software  ...
arXiv:1802.04451v2 fatcat:qh5eb66amjcqnnazlyx3gr7f3y

Scalable Deep Learning on Distributed Infrastructures: Challenges, Techniques and Tools [article]

Ruben Mayer, Hans-Arno Jacobsen
2019 arXiv   pre-print
One of the reasons for this success is the increasing size of DL models and the availability of vast amounts of training data.  ...  This incorporates infrastructures for DL, methods for parallel DL training, multi-tenant resource scheduling, and the management of training and model data.  ...  When a node in the decentralized architecture fails, other nodes can easily take over its workload and training proceeds without interruption.  ...
arXiv:1903.11314v2 fatcat:y62z7mteyzeq5kenb7srwtlg7q

Predicting digital asset market based on blockchain activity data [article]

Zvezdin Besarabov, Todor Kolev
2018 arXiv   pre-print
The standard approach for asset value prediction is based on market analysis with an LSTM neural network.  ...  By utilizing blockchain account distribution histograms, spatial dataset modeling, and a convolutional architecture, the error was reduced further by 26%.  ...  The process is called neural architecture search (NAS). The authors emphasize the enormous computational power required for achieving these results, which was on the order of 10^20 computations.  ...
arXiv:1810.06696v1 fatcat:42fjftctnnfwfpj5l5qe63pzii
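
A hedged illustration of the "account distribution histogram" features mentioned above: balances are bucketed on a log scale per time window, giving a 2D array that a convolutional model could consume. The bucket edges, window count, and synthetic balances are made up:

import numpy as np

rng = np.random.default_rng(3)
n_windows, n_accounts = 6, 1000
bins = np.logspace(-2, 4, 13)                # 12 log-spaced balance buckets

histograms = []
for _ in range(n_windows):
    balances = rng.lognormal(mean=1.0, sigma=2.0, size=n_accounts)  # fake balances
    hist, _ = np.histogram(balances, bins=bins)
    histograms.append(hist / n_accounts)     # normalise per time window

features = np.stack(histograms)              # shape (6, 12): input for a conv model
print(features.shape)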

Fully Decentralized Reinforcement Learning-based Control of Photovoltaics in Distribution Grids for Joint Provision of Real and Reactive Power [article]

Rayan El Helou, Dileep Kalathil, Le Xie
2021 arXiv   pre-print
By representing the system in a quasi-steady state manner, and by carefully formulating the Markov decision process, we reduce the complexity of the problem and allow for fully decentralized (communication-free  ...  We demonstrate that by reducing the search space, for the simpler 16-bus subsystem, from over 8000 parameters for a centralized setting to under 700 parameters for a decentralized setting, we still achieve  ...  To replace the centralized policy, used in standard PPO, with a decentralized policy, we propose a neural network architecture for π that connects input to output only at the same bus, rendering it equivalent  ...
arXiv:2008.01231v3 fatcat:jx7a2doc5rdv5pcs7tkgd3kwsy
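
A minimal sketch of a per-bus, communication-free policy as described in the snippet above: weights connect each bus's own observation only to its own action, i.e. a block-diagonal network. All sizes, weights, and names are illustrative, not the authors' PPO implementation:

import numpy as np

rng = np.random.default_rng(4)
n_buses, obs_per_bus, act_per_bus, hidden = 16, 3, 2, 8

# One tiny MLP per bus, with no cross-bus connections: each bus can evaluate
# its own slice locally, so the policy needs no communication at run time.
W1 = 0.1 * rng.normal(size=(n_buses, obs_per_bus, hidden))
W2 = 0.1 * rng.normal(size=(n_buses, hidden, act_per_bus))

def decentralized_policy(obs):               # obs: (n_buses, obs_per_bus)
    h = np.tanh(np.einsum("bi,bih->bh", obs, W1))
    return np.tanh(np.einsum("bh,bha->ba", h, W2))  # per-bus actions

obs = rng.normal(size=(n_buses, obs_per_bus))
print(decentralized_policy(obs).shape)       # (16, 2)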

Fully Decentralized Reinforcement Learning-based Control of Photovoltaics in Distribution Grids for Joint Provision of Real and Reactive Power

Rayan El Helou, Dileep Kalathil, Le Xie
2021 IEEE Open Access Journal of Power and Energy  
By representing the system in a quasi-steady state manner, and by carefully formulating the Markov decision process, we reduce the complexity of the problem and allow for fully decentralized (communication-free  ...  We demonstrate that by reducing the search space, for the simpler 16-bus subsystem, from over 8000 parameters for a centralized setting to under 700 parameters for a decentralized setting, we still achieve  ...  To replace the centralized policy, used in standard PPO, with a decentralized policy, we propose a neural network architecture for π that connects input to output only at the same bus, rendering it equivalent  ...
doi:10.1109/oajpe.2021.3077218 fatcat:wfhzoxvkufajlabvckhodvsfhu

FedML: A Research Library and Benchmark for Federated Machine Learning [article]

Chaoyang He, Songze Li, Jinhyun So, Xiao Zeng, Mi Zhang, Hongyi Wang, Xiaoyang Wang, Praneeth Vepakomma, Abhishek Singh, Hang Qiu, Xinghua Zhu, Jianzong Wang (+8 others)
2020 arXiv   pre-print
Federated Neural Architecture Search (FedNAS).  ...  in neural architecture search-based FL [67, 68, 69].  ...
arXiv:2007.13518v4 fatcat:tyoav4xm3bgqbdy2gctnjfeb5i

Coin.AI: A Proof-of-Useful-Work Scheme for Blockchain-based Distributed Deep Learning [article]

Alejandro Baldominos, Yago Saez
2019 arXiv   pre-print
As of today, there are many different implementations of cryptocurrencies working over a blockchain, with different approaches and philosophies.  ...  In this paper, we present a theoretical proposal that introduces a proof-of-useful-work scheme to support a cryptocurrency running over a blockchain, which we named Coin.AI.  ...  More recently, improvements in hardware have enabled the automation of this search process, leading to the rise of "neural architecture search", where techniques such as reinforcement learning [36] or  ...
arXiv:1903.09800v1 fatcat:gvrq5ugjozbtbct6nsctm3r4ny

Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis [article]

Tal Ben-Nun, Torsten Hoefler
2018 arXiv   pre-print
We discuss asynchronous stochastic optimization, distributed system architectures, communication schemes, and neural architecture search.  ...  Deep Neural Networks (DNNs) are becoming an important tool in modern computing applications.  ...  This limitation promoted a recent rise of research into automated neural architecture search.  ... 
arXiv:1802.09941v2 fatcat:ne2wiplln5eavjvjwf5to7nwsu
Showing results 1 — 15 out of 5,446 results