2,398 Hits in 4.6 sec

Scheduling optimization of parallel linear algebra algorithms using Supervised Learning [article]

G. Laberge, S. Shirzad, P. Diehl, H. Kaiser, S. Prudhomme, A. Lemoine
2019 arXiv   pre-print
Linear algebra algorithms are used widely in a variety of domains, e.g. machine learning, numerical physics, and video game graphics.  ...  In this paper, we study the applications of supervised learning models to predict the chunk-size which yields maximum performance on multiple parallel linear algebra operations using the HPX backend of  ...  Any opinions, findings, conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the Department of Energy.  ... 
arXiv:1909.03947v2 fatcat:6dnzgsegkbf2lhqnzdxlhxkepy

Large-scale deep unsupervised learning using graphics processors

Rajat Raina, Anand Madhavan, Andrew Y. Ng
2009 Proceedings of the 26th Annual International Conference on Machine Learning - ICML '09  
The promise of unsupervised learning methods lies in their potential to use vast amounts of unlabeled data to learn complex, highly nonlinear models with millions of free parameters.  ...  Unfortunately, current learning algorithms for both models are too slow for large-scale applications, forcing researchers to focus on smaller-scale models, or to use fewer training examples.  ...  This work was supported by the DARPA transfer learning program under contract number FA8750-  ... 
doi:10.1145/1553374.1553486 dblp:conf/icml/RainaMN09 fatcat:ixd3cwfbc5h4rfhjcgxswpuukm

Book reports

1993 Computers and Mathematics with Applications  
Supervised learning (training) using parametric and nonparametric approaches. 4. Linear discriminant functions and the discrete and binary feature cases. 5. Unsupervised learning and clustering. III.  ...  Implementation details of the PMF stereo algorithm. 4. Binocular stereo algorithm based on the disparity-gradient limit and using optimization theory. 5.  ... 
doi:10.1016/0898-1221(93)90315-m fatcat:tnmkz3axfnekvgr4srsgsivuvy

Book reports

2005 Computers and Mathematics with Applications  
Review of Linear Algebra. 2.5. Exercises. 2.6. References. 3. Linear Block Codes. 3.1. Basic Definitions. 3.2. The Generator Matrix Description of Linear Block Codes. 3.2.1.  ...  Which of Them? 3.7. Summary. References. 4. Metaheuristics and Parallelism. 4.1. Introduction. 4.2. Parallel LSMs. 4.3. Case Studies of Parallel LSMs. 4.4. Parallel Evolutionary Algorithms. 4.5.  ... 
doi:10.1016/j.camwa.2005.10.001 fatcat:smjkjmzbkfdbvezerczxaktjfy

Declarative Data Analytics: a Survey [article]

Nantia Makrynioti (Athens University of Economics and Business)
2019 arXiv   pre-print
The area of declarative data analytics explores the application of the declarative paradigm to data science and machine learning.  ...  The survey explores a wide range of declarative data analysis frameworks by examining both the programming model and the optimization techniques used, in order to provide conclusions on the current state  ...  ACKNOWLEDGMENTS We thank Panagiotis-Ioannis Betchavas for the implementation of Linear Regression using DML in section 4.4.1.  ... 
arXiv:1902.01304v1 fatcat:mixepfprkjc5xayhz76bwu3px4

Page 5442 of Mathematical Reviews, Issue 88j [page]

1988 Mathematical Reviews  
Wachter, Optimization of special permutation networks using simple algebraic relations (pp. 169-175); Clark D. Thomborson [Clark D. Thompson], Linda L. Deneen and Gary M.  ...  Mirenkov, Parallel algorithms and static analysis of parallel programs (pp. 48-59); B. Monien and O. Vornberger, Parallel processing of combinatorial search trees (pp. 60-69); A. Apostolico, C. S.  ... 

Data Management in Machine Learning

Arun Kumar, Matthias Boehm, Jun Yang
2017 Proceedings of the 2017 ACM International Conference on Management of Data - SIGMOD '17  
We focus on three complementary lines of work: (1) integrating ML algorithms and languages with existing data systems such as RDBMSs, (2) adapting data management-inspired techniques such as query optimization  ...  Large-scale data analytics using statistical machine learning (ML), popularly called advanced analytics, underpins many modern data-driven applications.  ...  Second, systems like SAP HANA aim to integrate linear algebra for a wide range of ML algorithms.  ... 
doi:10.1145/3035918.3054775 dblp:conf/sigmod/Kumar0017 fatcat:zph46pk72vc5nn3axspvvuxpx4

Prospects and challenges of quantum finance [article]

Adam Bouland, Wim van Dam, Hamed Joorati, Iordanis Kerenidis, Anupam Prakash
2020 arXiv   pre-print
We consider quantum speedups for Monte Carlo methods, portfolio optimization, and machine learning.  ...  The near-term relevance of these quantum finance algorithms varies widely across applications - some of them are heuristic algorithms designed to be amenable to near-term prototype quantum computers, while  ...  Machine learning algorithms usually work with real-valued data and make extensive use of linear algebra subroutines.  ... 
arXiv:2011.06492v1 fatcat:mqzj2a2pzzaz5pdcllxgkr73oq

TuPAQ: An Efficient Planner for Large-scale Predictive Analytic Queries [article]

Evan R. Sparks, Ameet Talwalkar, Michael J. Franklin, Michael I. Jordan, Tim Kraska
2015 arXiv   pre-print
, and physical optimization via batching.  ...  In this work, we build upon these recent efforts and propose an integrated PAQ planning architecture that combines advanced model search techniques, bandit resource allocation via runtime algorithm introspection  ...  Thanks to Trevor Darrell, Yangqing Jia, and Sergey Karayev who provided featurized imagenet data, Ben Recht who provided valuable ideas about derivative-free optimization and feedback, and Shivaram Venkataraman  ... 
arXiv:1502.00068v2 fatcat:l5ane47jazgq7cm3w7wh5cmho4

Analyzing Analytics

Rajesh Bordawekar, Bob Blainey, Ruchir Puri
2015 Synthesis Lectures on Computer Architecture  
We then use this information to characterize and recommend suitable parallelization strategies for these algorithms, specifically when used in data management workloads.  ...  In this survey paper, we identify some of the key techniques employed in analytics both to serve as an introduction for the non-specialist and to explore the opportunity for greater optimizations for parallelization  ...  These hardware components can be then used to optimize re-usable software kernel functions (e.g., numerical linear algebra, distance functions, etc.), which themselves can be parallelized by a variety  ... 
doi:10.2200/s00678ed1v01y201511cac035 fatcat:jkjywe5rzzaupjwq5rjyavqxi4

Scaling Distributed Machine Learning with the Parameter Server

Mu Li
2014 Proceedings of the 2014 International Conference on Big Data Science and Computing - BigDataScience '14  
We propose a parameter server framework for distributed machine learning problems.  ...  To demonstrate the scalability of the proposed framework, we show experimental results on petabytes of real data with billions of examples and parameters on problems ranging from Sparse Logistic Regression  ...  We use the Wikipedia (and other Wiki projects) page view statistics as a benchmark. Each entry is a unique key of a webpage with the corresponding number of requests served in an hour.  ... 
doi:10.1145/2640087.2644155 fatcat:l2lwr5t6undvroygn6ohatx3de

Load Balancing Optimization Based on Deep Learning Approach in Cloud Environment

Amanpreet Kaur (Associate Professor, Chandigarh Engineering College, Landran, Mohali), Bikrampal Kaur, Parminder Singh, Mandeep Singh Devgan, Harpreet Kaur Toor
2020 International Journal of Information Technology and Computer Science  
An optimal schedule for VMs has been generated using a Deep Learning based technique. The Genome workflow tasks have been taken as input to the suggested framework.  ...  In cloud computing, Deep Learning (DL) techniques can be used to achieve QoS such as improved resource utilization and throughput, while reducing latency, response time, and cost, balancing load across machines  ...  This algorithm is best suited to compute the extracted features (cost and time) of the workflow tasks, since these features cannot be computed analytically using linear algebra. The objective is to compute the  ... 
doi:10.5815/ijitcs.2020.03.02 fatcat:ncsbiikxw5aldkrqgjr47q6s6e

Adaptive Elastic Training for Sparse Deep Learning on Heterogeneous Multi-GPU Servers [article]

Yujing Ma, Florin Rusu, Kesheng Wu, Alexander Sim
2021 arXiv   pre-print
We address these challenges with Adaptive SGD, an adaptive elastic model averaging stochastic gradient descent algorithm for heterogeneous multi-GPUs that is characterized by dynamic scheduling, adaptive  ...  Motivated by extreme multi-label classification applications, we consider training deep learning models over sparse data in multi-GPU servers.  ...  TensorFlow [1] and Omnivore [20] support deep learning on heterogeneous CPU+GPU architectures. They schedule linear algebra primitives across CPU and GPU.  ... 
arXiv:2110.07029v1 fatcat:b6qigugo6za7xfjr5abmy3k5m4

Automating model search for large scale machine learning

Evan R. Sparks, Ameet Talwalkar, Daniel Haas, Michael J. Franklin, Michael I. Jordan, Tim Kraska
2015 Proceedings of the Sixth ACM Symposium on Cloud Computing - SoCC '15  
tuning techniques, bandit resource allocation via runtime algorithm introspection, and physical optimization via batching and optimal resource allocation.  ...  In this work, we build upon these recent efforts and propose an architecture for automatic machine learning at scale comprised of a cost-based cluster resource allocation estimator, advanced hyperparameter  ...  Thanks to Trevor Darrell, Yangqing Jia, and Sergey Karayev who provided featurized ImageNet data, Ben Recht who provided valuable ideas about derivative-free optimization and feedback, and Shivaram Venkataraman  ... 
doi:10.1145/2806777.2806945 dblp:conf/cloud/SparksTHFJK15 fatcat:y4mzheh2ejf5fmimcauahaif5i

Julia language in machine learning: Algorithms, applications, and open issues

Kaifeng Gao, Gang Mei, Francesco Piccialli, Salvatore Cuomo, Jingzhi Tu, Zenan Huo
2020 Computer Science Review  
Currently, the programming languages most commonly used to develop machine learning algorithms include Python, MATLAB, and C/C++.  ...  Then, it investigates applications of the machine learning algorithms implemented with the Julia language.  ...  Acknowledgments This research was jointly supported by the National Natural Science Foundation of China (11602235), the Fundamental Research Funds for China Central Universities (2652018091), and the Major  ... 
doi:10.1016/j.cosrev.2020.100254 fatcat:gdt66djfvjfqpjou3lvemxsxfy
Showing results 1 — 15 out of 2,398 results