19 Hits in 11.4 sec

No MCMC for me: Amortized sampling for fast and stable training of energy-based models [article]

Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud
2021 arXiv   pre-print
In this work, we present a simple method for training EBMs at scale which uses an entropy-regularized generator to amortize the MCMC sampling typically used in EBM training.  ...  Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty.  ... 
arXiv:2010.04230v3 fatcat:a3n7ekjmybhbdh2kedl2z7cvmu

Hamiltonian Dynamics with Non-Newtonian Momentum for Rapid Sampling [article]

Greg Ver Steeg, Aram Galstyan
2021 arXiv   pre-print
ESH dynamics converge faster than their MCMC competitors enabling faster, more stable training of neural network energy models.  ...  Sampling from an unnormalized probability distribution is a fundamental problem in machine learning with applications including Bayesian modeling, latent factor inference, and energy-based model training  ...  No mcmc for me: Amortized sampling for fast and stable training of energy-based models. arXiv preprint arXiv:2010.04230, 2020. [14] Rithesh Kumar, Sherjil Ozair, Anirudh Goyal, Aaron Courville, and  ... 
arXiv:2111.02434v3 fatcat:7kec7hrvkvf3fjovt7tilrkqsm

Neural Density Estimation and Likelihood-free Inference [article]

George Papamakarios
2019 arXiv   pre-print
The contribution of the thesis is a set of new methods for addressing these problems that are based on recent advances in neural networks and deep learning.  ...  I consider two problems in machine learning and statistics: the problem of estimating the joint probability density of a collection of random variables, known as density estimation, and the problem of  ...  George Papamakarios and Theo Pavlakou were supported by the Centre for Doctoral Training in Data Science, funded by EPSRC (grant EP/L016427/1) and the University of Edinburgh.  ... 
arXiv:1910.13233v1 fatcat:ftbqx3mno5e4bdxiufizdcglhq

The Internet of Federated Things (IoFT): A Vision for the Future and In-depth Survey of Data-driven Approaches for Federated Learning [article]

Raed Kontar, Naichen Shi, Xubo Yue, Seokhyun Chung, Eunshin Byon, Mosharaf Chowdhury, Judy Jin, Wissam Kontar, Neda Masoud, Maher Noueihed, Chinedum E. Okwudire, Garvesh Raskutti (+3 others)
2021 arXiv   pre-print
This article provides a vision for IoFT and a systematic overview of current efforts towards realizing this vision.  ...  In the IoT system of the future, IoFT, the cloud will be substituted by the crowd where model training is brought to the edge, allowing IoT devices to collaboratively extract knowledge and build smart  ...  Assessment of demand response and advanced metering. [79] EIA. U.S. Energy Information Administration.  ... 
arXiv:2111.05326v1 fatcat:bbgdhtuqcrhstgakt2vxuve2ca

Truncated Log-concave Sampling with Reflective Hamiltonian Monte Carlo [article]

Apostolos Chalkis, Vissarion Fisikopoulos, Marios Papachristou, Elias Tsigaridas
2021 arXiv   pre-print
steps for a well-rounded convex body, where κ = L/m is the condition number of the negative log-density, d is the dimension, ℓ is an upper bound on the number of reflections, and ε is the accuracy parameter  ...  We introduce Reflective Hamiltonian Monte Carlo (ReHMC), an HMC-based algorithm, to sample from a log-concave distribution restricted to a convex body.  ...  provide MCMC methods for sampling, which in turn allow the creation of powerful Bayesian models.  ... 
arXiv:2102.13068v3 fatcat:ypgddukc4rgenksjhvs4xniwdy

Applications of the Free Energy Principle to Machine Learning and Neuroscience [article]

Beren Millidge
2021 arXiv   pre-print
We go on to propose novel and simpler algorithms which allow for backprop to be implemented in purely local, biologically plausible computations.  ...  In this PhD thesis, we explore and apply methods inspired by the free energy principle to two important areas in machine learning and neuroscience.  ...  approaches as well as being more stable and easy to train.  ... 
arXiv:2107.00140v1 fatcat:c6phd65xwfc2rcyq7pnth5a3pq

Knowledge Augmented Machine Learning with Applications in Autonomous Driving: A Survey [article]

Julian Wörmann, Daniel Bogdoll, Etienne Bührle, Han Chen, Evaristus Fuh Chuo, Kostadin Cvejoski, Ludger van Elst, Tobias Gleißner, Philip Gottschall, Stefan Griesche, Christian Hellert, Christian Hesels (+34 others)
2022 arXiv   pre-print
However, the subsequent application of these models often involves scenarios that are inadequately represented in the data used for training.  ...  This work provides an overview of existing techniques and methods in the literature that combine data-based models with existing knowledge.  ...  MCMC-based: Common methods for computing an empirical approximation of the posterior distribution are Hamiltonian Monte Carlo (HMC) [179, 508] and its extension, No-U-Turn Sampling (NUTS) [313].  ... 
arXiv:2205.04712v1 fatcat:u2bgxr2ctnfdjcdbruzrtjwot4
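The HMC method named in the snippet above is straightforward to sketch. Below is a minimal textbook leapfrog HMC step with a Metropolis accept/reject correction, run against an assumed standard Gaussian target for illustration; it is not the NUTS variant cited in the survey, nor code from any of the listed works:

```python
import numpy as np

def hmc_step(x, log_p, grad_log_p, eps=0.1, L=20, rng=None):
    """One HMC step: resample momentum, integrate with leapfrog, Metropolis-correct."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(x.shape)              # fresh Gaussian momentum
    x_new, p = x.copy(), p0.copy()
    p = p + 0.5 * eps * grad_log_p(x_new)          # half step for momentum
    for _ in range(L - 1):
        x_new = x_new + eps * p                    # full step for position
        p = p + eps * grad_log_p(x_new)            # full step for momentum
    x_new = x_new + eps * p                        # final position step
    p = p + 0.5 * eps * grad_log_p(x_new)          # final momentum half step
    # Hamiltonian H(x, p) = -log P(x) + |p|^2 / 2; accept with Metropolis rule.
    h0 = -log_p(x) + 0.5 * p0 @ p0
    h1 = -log_p(x_new) + 0.5 * p @ p
    return x_new if np.log(rng.uniform()) < h0 - h1 else x

# Assumed target for this sketch: standard 2-D Gaussian.
log_p = lambda x: -0.5 * x @ x
grad = lambda x: -x

rng = np.random.default_rng(0)
x = np.zeros(2)
draws = []
for _ in range(2000):
    x = hmc_step(x, log_p, grad, rng=rng)
    draws.append(x)
draws = np.array(draws)
```

With a well-tuned step size the chain mixes rapidly; `draws` should have mean near 0 and standard deviation near 1 for this target.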

Scalable and Reliable Inference for Probabilistic Modeling

Ruqi Zhang
2021
In the era of big and complex data, there is an urgent need for new inference methods in probabilistic modeling to extract information from data effectively and efficiently.  ...  Probabilistic modeling, also known as probabilistic machine learning, provides a principled framework for learning from data, with the key advantage of offering rigorous solutions for uncertainty quantification  ...  Tempering for SG-MCMC was first used by [85] as a practical technique for neural network training for fast convergence in limited time.  ... 
doi:10.7298/d364-gz12 fatcat:lsixokv3ofgtnlg3qlf6qaiqci

Finding signals in the void: Improving deep latent variable generative models via supervisory signals present within data

Jason Emmanuel Ramapuram, Alexandros Kalousis, Stéphane Marchand-Maillet
2021
Improving deep latent variable generative models via supervisory signals present within data.  ...  K++ involves two forms of reading (Figure 4-5): iterative reading and a simpler and more stable read model used for training.  ...  To generate samples, score-based energy models rely on Langevin MCMC: x_{k+1} ← x_k + (ε²/2) ∇_x log P_θ(x_k) + ε z_k, for k = 0, …, K−1 (5.1). Given an initial prior sample, e.g. x_0 ∼  ... 
doi:10.13097/archive-ouverte/unige:160342 fatcat:kd23ch53gbekjabkfv7qxnypfu
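The Langevin MCMC update quoted in the snippet above can be sketched directly. A minimal NumPy version follows, assuming a standard Gaussian target whose score is simply −x (a stand-in for a learned ∇_x log P_θ; the function and parameter names are illustrative, not from the thesis):

```python
import numpy as np

def grad_log_p(x):
    # Score of a standard Gaussian target: grad_x log P(x) = -x.
    # (Assumed stand-in for a learned energy model's score.)
    return -x

def langevin_sample(x0, eps=0.1, K=1000, rng=None):
    """Unadjusted Langevin MCMC: x_{k+1} = x_k + (eps^2 / 2) * score(x_k) + eps * z_k."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(K):
        z = rng.standard_normal(x.shape)       # z_k ~ N(0, I)
        x = x + 0.5 * eps**2 * grad_log_p(x) + eps * z
    return x

# Chains started from a broad prior drift toward the target distribution.
rng = np.random.default_rng(0)
inits = rng.normal(0.0, 5.0, size=(200, 2))
samples = np.stack([langevin_sample(x0, rng=rng) for x0 in inits])
```

Note this omits the Metropolis correction, so the stationary distribution carries a small O(ε²) discretization bias; with ε = 0.1 the samples are close to the target.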

Point-of-Interest Recommendation [chapter]

2017 Encyclopedia of GIS  
Computational Epidemiology is the development and use of computer models for the spatio-temporal diffusion of disease through populations.  ...  and insect feeding patterns for mosquito-borne diseases.  ...  If no other users were present at the location and the position and time of the sighting match the GPS sample, the user can be identified.  ... 
doi:10.1007/978-3-319-17885-1_100975 fatcat:myyebmb3hrhgnpqmobyyvm2xum

Variational Methods for Energy Systems

Henning Lange
2019
Due to resource constraints and global climate change, there is an increased need for technological solutions to improve the efficiency and reduce waste of our energy systems.  ...  observations and so far the existing solutions either require supervised training or make assumptions that limit their applicability and performance in real conditions.  ...  However, Gibbs-based MCMC techniques in the context of energy disaggregation have drawbacks.  ... 
doi:10.1184/r1/7970504.v1 fatcat:igqk3eodg5bbhcimgc6rqsfd7i

Reservoir modeling and inversion using generative adversarial network priors

Lukas J. Mosser, Olivier Dubrule, Martin Blunt
2020
After training, the GAN generator is used to sample large high-fidelity realizations that follow the same statistical and physical properties as represented in the training images.  ...  A GAN can be trained to represent pore-scale micro-CT images of segmented and grayscale porous media.  ...  Definition of GAN training objectives compatible with high-diversity samples showing no mode-collapse and stable training remains an open problem.  ... 
doi:10.25560/80165 fatcat:g4geuhu2nvfj3dzof4hjlcveke

Probabilistic Programming for Deep Learning

Dustin Tran
2020
arbitrary manipulation of probabilistic programs for flexible inference and model criticism.  ...  We propose the idea of deep probabilistic programming, a synthesis of advances for systems at the intersection of probabilistic modeling and deep learning.  ...  Another compelling example is energy-based models p(x) ∝ exp{f(x)}, where sampling is not even available in closed form; in contrast, the unnormalized density is. In principle, one can reify any model  ... 
doi:10.7916/d8-95c9-sj96 fatcat:ujiuu2ryx5gzxlj5zckk5k6d6m

Principles of Learning in Multitask Settings: A Probabilistic Perspective

Maruan Al-shedivat
2022
So, is there a set of principles we could follow when designing models and algorithms for such settings?  ...  Starting with a general definition of a learning task, we show how multiple related tasks can be assembled into and represented by a joint probabilistic model.  ...  Acknowledgments: I find myself extremely fortunate and privileged to have so many fantastic people to thank, without whom the work behind this thesis would not have been possible.  ... 
doi:10.1184/r1/19540438 fatcat:jifbve43dnhelgdoaaz54tnrru

Comparative analysis of the frequentist and Bayesian approaches to stress testing [article]

Zheqi Wang, University Of Edinburgh, Jonathan Crook, Galina Andreeva
2021
Based on U.S. mortgage loan data, we model the probability of default at the account level using discrete time hazard analysis.  ...  Stress testing is necessary for banks as it is required by the Basel Accords for loss predictions and regulatory and economic capital computations.  ...  I am forever grateful for all the time and energy they spent in helping me and I deeply appreciate the opportunity to work with them.  ... 
doi:10.7488/era/1120 fatcat:2j7pxxnqqzbsvilw54rz6f2oqu
Showing results 1 — 15 out of 19 results