Speeding Up Distributed Machine Learning Using Codes

Kangwook Lee, Maximilian Lam, Ramtin Pedarsani, Dimitris Papailiopoulos, Kannan Ramchandran
2018 IEEE Transactions on Information Theory  
can speed up distributed matrix multiplication by a factor of n.  ...  there has been little interaction cutting across codes, machine learning, and distributed systems.  ...  In this work, we explore a new research agenda driven by the question: Can codes speed up distributed machine learning?  ...
doi:10.1109/tit.2017.2736066 fatcat:vnllavqx3vforgbrorcypq235e

Speeding up distributed machine learning using codes

Kangwook Lee, Maximilian Lam, Ramtin Pedarsani, Dimitris Papailiopoulos, Kannan Ramchandran
2016 IEEE International Symposium on Information Theory (ISIT)  
We show how codes can be used to speed up two of the most basic building blocks of distributed ML algorithms: data shuffling and matrix multiplication.  ...  However, there has been little interaction between Codes, Machine Learning, and Distributed Systems. In this work, we scratch the tip of the "Coding for Distributed ML" iceberg.  ...  In this work, we explore a new research agenda driven by the question: Can codes speed up distributed machine learning?  ...
doi:10.1109/isit.2016.7541478 dblp:conf/isit/LeeLPPR16 fatcat:zrzj7v7dxrh5novlu2qsh4gj2m
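
Both versions of this paper make the same core claim: with an (n, k) MDS code, any k of n worker results suffice to recover a matrix-vector product, so up to n − k stragglers can be ignored. The following is a minimal numpy sketch of that idea; the random generator matrix G is a stand-in (MDS with high probability over the reals) rather than the paper's specific code construction.

```python
# Minimal sketch of (n, k)-coded matrix multiplication for straggler tolerance.
import numpy as np

rng = np.random.default_rng(0)
k, n = 3, 5                       # k data blocks, n workers; tolerates n - k stragglers
A = rng.standard_normal((6, 4))   # row count divisible by k
x = rng.standard_normal(4)

blocks = np.split(A, k)                        # A_1 .. A_k
G = rng.standard_normal((n, k))                # encoding matrix (assumed MDS, w.h.p.)
coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]

# Worker i computes coded[i] @ x; suppose only workers {0, 2, 4} respond in time.
survivors = [0, 2, 4]
received = np.stack([coded[i] @ x for i in survivors])
decoded = np.linalg.solve(G[survivors], received)   # invert the k x k sub-generator
assert np.allclose(np.concatenate(decoded), A @ x)  # full product recovered
```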

Predictive search distributions

Edwin V. Bonilla, Christopher K. I. Williams, Felix V. Agakov, John Cavazos, John Thomson, Michael F. P. O'Boyle
2006 Proceedings of the 23rd International Conference on Machine Learning - ICML '06  
This predictive distribution is then used to focus the search.  ...  Now we can define a machine learning problem to predict the distribution of good solutions q(s|x) for a new problem with features x, where s denotes a solution.  ...  Acknowledgements This work is supported under EPSRC grant GR/S71118/01 Compilers that Learn to Optimize and in part by the IST Programme of the European Community, under the PASCAL Network of Excellence  ... 
doi:10.1145/1143844.1143860 dblp:conf/icml/BonillaWACTO06 fatcat:fpc7k42hqrd25iufmqd3lw3w3q
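
The snippet states the setup precisely enough for a toy illustration: learn q(s|x), the probability that solution s is good for a problem with features x, then evaluate candidates in order of predicted quality. The encoding below (random features, a logistic model) is a hypothetical stand-in, not the paper's compiler-tuning setup.

```python
# Toy sketch of a predictive search distribution: score candidate solutions
# for a new problem with a model trained on (problem, solution, outcome) data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Each training row: problem features (4) concatenated with a solution encoding (4).
X_train = rng.standard_normal((200, 8))
y_train = rng.integers(0, 2, 200)            # 1 = that solution was "good"

model = LogisticRegression().fit(X_train, y_train)

x_new = rng.standard_normal(4)               # features of the new problem
candidates = rng.standard_normal((50, 4))    # pool of candidate solutions
pool = np.hstack([np.tile(x_new, (50, 1)), candidates])
scores = model.predict_proba(pool)[:, 1]     # estimated q(s | x)
search_order = np.argsort(-scores)           # try the most promising candidates first
```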

Training Large Scale Deep Neural Networks on the Intel Xeon Phi Many-Core Coprocessor

Lei Jin, Zhaokang Wang, Rong Gu, Chunfeng Yuan, Yihua Huang
2014 IEEE International Parallel & Distributed Processing Symposium Workshops  
In this paper, we propose a many-core algorithm, based on a parallel method and used on Intel Xeon Phi many-core systems, to speed up the unsupervised training process of a Sparse Autoencoder  ...  As a new area of machine learning research, the deep learning algorithm has attracted a lot of attention from the research community. It may bring human beings to a higher cognitive level of data.  ...  Google has distributed a very large deep network to hundreds of computing nodes and uses lock-less asynchronous updates to speed up the procedure.  ...
doi:10.1109/ipdpsw.2014.194 dblp:conf/ipps/JinWGYH14 fatcat:tpmcupt4indklfoj3dtwm65rze
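
The snippet names two parallelization patterns: a many-core parallel method and Google's lock-less asynchronous updates. Below is a minimal sketch of the simpler, synchronous variant (shard the batch, average per-shard gradients); the loss is a toy tied-weight reconstruction objective, not the paper's sparse autoencoder.

```python
# Synchronous data-parallel gradient step: each worker handles one data shard.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(2)
X = rng.standard_normal((1024, 16))
W = 0.01 * rng.standard_normal((16, 16))     # toy reconstruction weights

def shard_grad(shard):
    # Gradient of ||shard @ W - shard||^2 with respect to W, averaged over rows.
    residual = shard @ W - shard
    return 2.0 * shard.T @ residual / len(shard)

shards = np.split(X, 4)                      # 4 "workers"
with ThreadPoolExecutor(max_workers=4) as pool:
    grads = list(pool.map(shard_grad, shards))

W -= 0.01 * np.mean(grads, axis=0)           # apply the averaged update
```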

Reproducible Experiment Platform

Tatiana Likhomanenko, Alex Rogozhnikov, Alexander Baranov, Egor Khairullin, Andrey Ustyuzhanin
2015 Journal of Physics: Conference Series  
can see that the complexity of those analyses increases fast due to a) enormous volumes of datasets being analyzed, b) the variety of techniques and algorithms one has to check inside a single analysis, c) distributed  ...  A typical analysis task, the search for an optimal predictive model, can be sped up using a parallel computational system. In Figure 1 they are presented as a GRID/cluster system.  ...  To speed up training operations, a parallel system (IPython cluster, threads) is available during factory and grid-search fitting.  ...
doi:10.1088/1742-6596/664/5/052022 fatcat:odobiradtjc5hpy7birb77ig7u
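
The speed-up the snippet describes, parallelizing the search for an optimal predictive model, is easy to illustrate with scikit-learn's built-in parallel grid search; REP's own factory/IPython-cluster machinery is not reproduced here.

```python
# Parallel model search: parameter combinations are evaluated across cores.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, random_state=0)
grid = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 3]},
    n_jobs=-1,        # run candidate fits in parallel
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_)
```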

shapr: An R-package for explaining machine learning models with dependence-aware Shapley values

Nikolai Sellereite, Martin Jullum
2019 Journal of Open Source Software  
(Eddelbuettel & Sanderson, 2014) for computational speed-up.  ...  Summary: A common task within machine learning is to train a model to predict an unknown outcome (response variable) based on a set of known input variables/features.  ...
doi:10.21105/joss.02027 fatcat:q63df27izvbwljvz4fj3t23nju
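
shapr is an R package, so the sketch below is only a language-agnostic illustration of Monte Carlo Shapley estimation for one prediction. Note that it samples features marginally (an independence assumption); shapr's actual contribution is dependence-aware conditional sampling, which this sketch does not implement.

```python
# Permutation-sampling Shapley values for a single prediction of a toy model.
import numpy as np

rng = np.random.default_rng(3)

def predict(x):                        # stand-in model: fixed linear function
    return x @ np.array([1.0, -2.0, 0.5])

X_background = rng.standard_normal((500, 3))   # reference dataset
x_explain = np.array([0.3, 1.2, -0.7])         # observation to explain

def shapley(j, n_samples=2000):
    total = 0.0
    for _ in range(n_samples):
        perm = rng.permutation(3)
        pos = int(np.where(perm == j)[0][0])
        z = X_background[rng.integers(len(X_background))].copy()
        z[perm[:pos]] = x_explain[perm[:pos]]  # features "before" j come from x
        with_j = z.copy()
        with_j[j] = x_explain[j]
        total += predict(with_j) - predict(z)  # marginal contribution of j
    return total / n_samples

print([round(shapley(j), 3) for j in range(3)])
```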

Kinematic Frequencies of Rotating Equipment Identified with Sparse Coding and Dictionary Learning

Sergio Martin-del-Campo, Fredrik Sandin, Stephan Schnabel
2019 Proceedings of the Annual Conference of the Prognostics and Health Management Society, PHM  
Unsupervised machine learning methods, like sparse coding with dictionary learning, enable automatic modeling and characterization of repeating signal structures in the time domain, which are naturally  ...  Sparse coding with dictionary learning offers a possibility for self-learning of the kinematic frequencies of a bearing, which can be useful for the further improvement of automated anomaly detection methods  ...  Another unsupervised machine learning approach is sparse coding.  ...
doi:10.36001/phmconf.2019.v11i1.837 fatcat:f76j2hls6zfd5pbifkbpqrzvxa
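
As a rough illustration of the method named in the snippet, the sketch below learns a small dictionary over windowed segments of a synthetic repeating transient, using scikit-learn; the paper's vibration data and analysis pipeline are not reproduced.

```python
# Sparse coding with dictionary learning on overlapping signal windows.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(4)
t = np.arange(20000)
signal = np.sin(2 * np.pi * t / 50) * (t % 400 < 60)   # repeating transient
signal = signal + 0.1 * rng.standard_normal(t.shape)   # additive noise

win = 100   # window length; atoms will learn the repeating waveform shape
windows = np.stack([signal[i:i + win] for i in range(0, len(signal) - win, 25)])

dico = MiniBatchDictionaryLearning(n_components=8, transform_algorithm="omp",
                                   transform_n_nonzero_coefs=3, random_state=0)
codes = dico.fit(windows).transform(windows)    # sparse activations per window
print(codes.shape, float((codes != 0).mean()))  # code sparsity
```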

MeCP2 deletion impaired layer 2/3-dominant dynamic reorganization of cortical circuit during motor skill learning [article]

Yuanlei Yue, Pan Xu, Zhichao Liu, Zekai Chen, Juntao Su, Ai Chen, Ryan T. Ash, Emre Barut, Rahul Simha, Stelios Smirnakis, Chen Zeng, Hui Lu
2019 bioRxiv   pre-print
the motor cortex while mice learned to run on a speed-controlled treadmill.  ...  Machine learning-based analysis of the neuronal events recorded with in vivo two-photon calcium imaging revealed procedural learning-induced circuit reorganization in superficial but not deep layers of  ...  day 1, 3 and 7 at speed-up mode.  ... 
doi:10.1101/786822 fatcat:smnegycaurfddlewefdcvszx2a

Different Tools for Implementing ML Algorithms in Data Mining Tasks

W. Sarada, P. V. Kumar
2020 International Journal of Modern Trends in Science and Technology  
In this paper I discuss the different tools that can be used for implementing machine learning algorithms for data mining tasks, using Java, etc.  ...  There are various tools available as open source or free to use for executing or implementing various algorithms, depending upon our project requirements.  ...  Everybody can learn machine learning nowadays, whether or not they know coding, as there are various apps like the DataScience101 app for learning different ML algorithms like KNN, linear regression, SVM and  ...
doi:10.46501/ijmtstciet06 fatcat:7hpnl7r5trdyzaaabg4tc3gujq

Milepost GCC: Machine Learning Enabled Self-tuning Compiler

Grigori Fursin, Yuriy Kashnikov, Abdul Wahid Memon, Zbigniew Chamski, Olivier Temam, Mircea Namolaru, Elad Yom-Tov, Bilha Mendelson, Ayal Zaks, Eric Courtois, Francois Bodin, Phil Barnard (+5 others)
2011 International journal of parallel programming  
In this paper we describe Milepost GCC, the first publicly-available open-source machine learning-based compiler.  ...  We developed machine learning plugins based on probabilistic and transductive approaches to predict good combinations of optimizations.  ...  Hence it is important to speed up iterative compilation. In the next section, we present the Milepost framework which speeds up program optimization through machine learning.  ... 
doi:10.1007/s10766-010-0161-2 fatcat:r6s7qcgunzf5hcgllcorvkir4m
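
The prediction step the snippet describes, mapping program features to promising optimization settings, can be caricatured in a few lines; the features and flag sets below are synthetic stand-ins for Milepost GCC's feature extractor and transformation space.

```python
# Toy Milepost-style predictor: nearest neighbors over static program features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(5)
features = rng.standard_normal((120, 10))     # per-program static features
best_flags = rng.integers(0, 4, 120)          # index of best flag combination found

model = KNeighborsClassifier(n_neighbors=5).fit(features, best_flags)

new_program = rng.standard_normal((1, 10))
print("predicted flag set:", int(model.predict(new_program)[0]))
```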

Hyper: Distributed Cloud Processing for Large-Scale Deep Learning Tasks [article]

Davit Buniatyan
2019 arXiv   pre-print
The system implements a distributed file system and failure-tolerant task processing scheduler, independent of the language and Deep Learning framework used.  ...  We introduce a hybrid distributed cloud framework with a unified view to multiple clouds and an on-premise infrastructure for processing tasks using both CPU and GPU compute instances at scale.  ...  Fig. 2: Hyper File System on a single machine (AWS p3.2xlarge) can achieve up to 875 MB/s download speed with multithreading T and multiprocessing P enabled.  ...
arXiv:1910.07172v1 fatcat:5qlpul5yqzfyrhfgallvnyj4ne
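
Figure 2's throughput claim rests on a standard pattern: split an object into byte ranges and fetch them concurrently. A generic sketch of that pattern follows; the URL and chunk size are hypothetical, and this is not Hyper's file-system code.

```python
# Chunked multithreaded download via HTTP Range requests.
import concurrent.futures
import requests

URL = "https://example.com/large.bin"    # hypothetical object URL
CHUNK = 8 * 1024 * 1024                  # 8 MiB per range

def fetch_range(byte_range):
    start, end = byte_range
    r = requests.get(URL, headers={"Range": f"bytes={start}-{end}"}, timeout=60)
    r.raise_for_status()
    return start, r.content

size = int(requests.head(URL, timeout=60).headers["Content-Length"])
ranges = [(s, min(s + CHUNK, size) - 1) for s in range(0, size, CHUNK)]

with concurrent.futures.ThreadPoolExecutor(max_workers=16) as pool:
    parts = dict(pool.map(fetch_range, ranges))

data = b"".join(parts[s] for s, _ in ranges)   # reassemble in order
```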

Page 488 of The American Psychologist Vol. 1, Issue 11 [page]

1946 The American Psychologist  
When the military requirements for code operators were high, some code schools tried to force a speed-up by keeping men at code learning for 7 hours a day in spite of fatigue and boredom.  ...  These contributions were new in the area of military psychology, and they were officially distributed and extensively used in the Navy.  ... 

Hard-Coded Gaussian Attention for Neural Machine Translation [article]

Weiqiu You, Simeng Sun, Mohit Iyyer
2020 arXiv   pre-print
We push further in this direction by developing a "hard-coded" attention variant without any learned parameters.  ...  Much of this BLEU drop can be recovered by adding just a single learned cross attention head to an otherwise hard-coded Transformer.  ...  Single-headed cross attention speeds up decoding: Despite removing learned self-attention from both the encoder and decoder, we did not observe huge efficiency or speed gains.  ... 
arXiv:2005.00742v1 fatcat:jmkieyd4qnfodezmhmwnpvrkau
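
The paper's variant is simple enough to sketch directly: attention weights are fixed to a Gaussian over positions centered on (or adjacent to) each query position, with no learned parameters. Sigma and the center offset below are illustrative choices.

```python
# Hard-coded Gaussian attention: fixed positional weights, nothing learned.
import numpy as np

def hard_coded_attention(values, center_offset=0, sigma=1.0):
    # values: (seq_len, d_model). w[i, j] ~ exp(-(j - (i + offset))^2 / (2 sigma^2)).
    n = len(values)
    pos = np.arange(n)
    logits = -((pos[None, :] - (pos[:, None] + center_offset)) ** 2) / (2 * sigma ** 2)
    weights = np.exp(logits)
    weights /= weights.sum(axis=1, keepdims=True)   # normalize each row
    return weights @ values

x = np.random.default_rng(6).standard_normal((5, 4))
out = hard_coded_attention(x, center_offset=1)      # attend mostly to the next token
print(out.shape)                                    # (5, 4)
```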

Integrating Human Patterns of Qualitative Coding with Machine Learning: A Pilot Study Involving Technology-Induced Error Incident Reports [chapter]

Elizabeth M. Borycki, Amr Farghali, Andre W. Kushniruk
2022 Studies in Health Technology and Informatics  
The objective of this research was to develop a reproducible method of integrating human patterns of qualitative coding with machine learning.  ...  The application of qualitative codes from the technology-induced error and safety literatures to the analysis of incident reports was done successfully, helping to identify the factors that lead to an  ...  The manual portion of coding took several days, but this was sped up, with the automated portion being coded in close to real time.  ...
doi:10.3233/shti220716 pmid:35773862 fatcat:zefai433ffatth6xkq24ph76jq

Lowering post‐construction yield assessment uncertainty through better wind plant power curves

Nicola Bodini, Mike Optis, Jordan Perr‐Sauer, Eric Simley, M. Jason Fields
2021 Wind Energy  
Adding input variables to the machine-learning model at daily resolution can further reduce regression uncertainty, with up to a −10% relative change.  ...  Here, we evaluate the benefits of augmenting this conventional approach by testing alternative regressions performed with multiple inputs, at a finer time resolution, and using nonlinear machine-learning  ...  Data availability statement: The wind power plant data used in this work are proprietary and cannot be shared with the public.  ...
doi:10.1002/we.2645 fatcat:x62ch3a2jbhy7nspawdmkow6wy
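
The comparison the abstract describes, a conventional single-input power curve versus a multi-input machine-learning regression, can be mocked up on synthetic data (the plant data are proprietary); the toy power curve below is illustrative only.

```python
# Single-input vs. multi-input power-curve regression on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
wind_speed = rng.uniform(3, 25, 2000)
turbulence = rng.uniform(0.05, 0.25, 2000)
density = rng.normal(1.225, 0.03, 2000)
power = np.clip(wind_speed, 0, 12) ** 3 * density * (1 - turbulence)  # toy curve
power = power + 20 * rng.standard_normal(2000)                        # noise

X1 = wind_speed.reshape(-1, 1)                           # wind speed only
X3 = np.column_stack([wind_speed, turbulence, density])  # added inputs

for name, X, model in [("speed-only linear", X1, LinearRegression()),
                       ("multi-input RF", X3, RandomForestRegressor(random_state=0))]:
    print(name, round(cross_val_score(model, X, power, cv=3).mean(), 3))
```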