
Sampling from a log-concave distribution with compact support with proximal Langevin Monte Carlo [article]

Nicolas Brosse, Alain Durmus, Éric Moulines, Marcelo Pereyra
2017 arXiv   pre-print
This paper presents a detailed theoretical analysis of the Langevin Monte Carlo sampling algorithm recently introduced in Durmus et al. (Efficient Bayesian computation by proximal Markov chain Monte Carlo: when Langevin meets Moreau, 2016) when applied to log-concave probability distributions that are restricted to a convex body K.  ...  Acknowledgments The authors wish to express their thanks to the anonymous referees for several helpful remarks, in particular concerning a simplified proof of Proposition 4.  ... 
arXiv:1705.08964v1 fatcat:fkorykad4nerpeflucd6a76uci

Proximal Markov chain Monte Carlo algorithms

Marcelo Pereyra
2015 Statistics and computing  
This paper presents a new Metropolis-adjusted Langevin algorithm (MALA) that uses convex analysis to simulate efficiently from high-dimensional densities that are log-concave, a class of probability distributions  ...  The method is based on a new first-order approximation for Langevin diffusions that exploits log-concavity to construct Markov chains with favourable convergence properties.  ...  This work was supported in part by the SuSTaIn program (EPSRC grant EP/D063485/1) at the Department of Mathematics, University of Bristol, and by a French Ministry of Defence postdoctoral fellowship.  ... 
doi:10.1007/s11222-015-9567-4 fatcat:vwlzo3unv5gkfmqppmfb5x6uhe
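The first-order proximal approximation behind this sampler can be illustrated with a minimal 1-D sketch. This is not the paper's implementation: the standard-Gaussian target (chosen so the proximal map has a closed form), the step size `delta`, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # negative log-density; standard Gaussian chosen so the prox is closed-form
    return 0.5 * x * x

def prox_g(x, lam):
    # prox_{lam*g}(x) = argmin_u { g(u) + (u - x)^2 / (2*lam) } = x / (1 + lam)
    return x / (1.0 + lam)

def pmala(n_iter, delta=0.5, x0=0.0):
    x, out = x0, []
    for _ in range(n_iter):
        # proposal centred at the proximal point instead of a plain gradient step
        mean_x = prox_g(x, delta / 2)
        y = mean_x + np.sqrt(delta) * rng.standard_normal()
        mean_y = prox_g(y, delta / 2)
        # Metropolis-Hastings correction: target ratio times proposal ratio
        log_alpha = (g(x) - g(y)
                     + ((y - mean_x) ** 2 - (x - mean_y) ** 2) / (2 * delta))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        out.append(x)
    return np.array(out)

samples = pmala(20000)
```

Because of the accept/reject step, the chain is exactly invariant for the target even though the proposal uses the proximal approximation.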

Efficient Bayesian computation by proximal Markov chain Monte Carlo: when Langevin meets Moreau [article]

Alain Durmus, Eric Moulines, Marcelo Pereyra
2016 arXiv   pre-print
This paper presents a new and highly efficient Markov chain Monte Carlo methodology to perform Bayesian computation for high dimensional models that are log-concave and non-smooth, a class of models that  ...  The methodology is based on a regularised unadjusted Langevin algorithm that exploits tools from convex analysis, namely Moreau-Yoshida envelopes and proximal operators, to construct Markov chains with  ...  Generating the Monte Carlo samples and computing the HPD threshold values required 15 minutes. Comparison with proximal MALA.  ... 
arXiv:1612.07471v1 fatcat:5yhkbbe6qzev3hl5gox7obftu4
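The Moreau-Yoshida regularisation idea described above can be sketched in 1-D for a composite target ∝ exp(−x²/2 − |x|): the non-smooth term is replaced by the gradient of its Moreau-Yoshida envelope, computed from a proximal operator. The target, step size, and smoothing parameter are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(1)

def prox_abs(v, t):
    # prox of t*|x|: soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def myula(n_iter, eta=0.01, lam=0.1, x0=0.0):
    # target ∝ exp(-f(x) - g(x)) with smooth f(x) = x^2/2 and non-smooth g(x) = |x|.
    # MYULA replaces ∇g by the gradient of its Moreau-Yoshida envelope:
    #   ∇g^lam(x) = (x - prox_{lam*g}(x)) / lam
    x, out = x0, []
    for _ in range(n_iter):
        grad = x + (x - prox_abs(x, lam)) / lam
        x = x - eta * grad + np.sqrt(2 * eta) * rng.standard_normal()
        out.append(x)
    return np.array(out)

samples = myula(50000)
```

Smaller `lam` tracks the non-smooth target more closely but makes the regularised gradient stiffer, forcing a smaller step size `eta`.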

Fast Convergence of Langevin Dynamics on Manifold: Geodesics meet Log-Sobolev [article]

Xiao Wang, Qi Lei, Ioannis Panageas
2020 arXiv   pre-print
One approach to sample from a high-dimensional distribution e^−f for some function f is the Langevin Algorithm (LA).  ...  From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution e^−f satisfies a log-Sobolev inequality on M.  ...  Introduction We focus on the problem of sampling from a distribution e^−f(x) supported on a Riemannian manifold M with standard volume measure.  ... 
arXiv:2010.05263v2 fatcat:nu2cmgr3wfa3dddeqyita6npma
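In the Euclidean special case (M = ℝ), the Langevin Algorithm referred to above is the Euler-Maruyama discretisation of the overdamped Langevin diffusion. A minimal sketch, with an assumed quadratic f and illustrative step size:

```python
import numpy as np

rng = np.random.default_rng(2)

def ula(grad_f, n_iter, eta=0.05, x0=0.0):
    # Unadjusted Langevin Algorithm: Euler-Maruyama discretisation of
    #   dX_t = -∇f(X_t) dt + sqrt(2) dB_t,
    # whose invariant law is ∝ exp(-f)
    x, out = x0, []
    for _ in range(n_iter):
        x = x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal()
        out.append(x)
    return np.array(out)

samples = ula(lambda x: x, 20000)   # f(x) = x^2/2, so the target is N(0, 1)
```

The discretisation introduces a small step-size-dependent bias in the stationary law; the log-Sobolev analysis in the entry above controls how fast the KL divergence to the target contracts.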

The split Gibbs sampler revisited: improvements to its algorithmic structure and augmented target distribution [article]

Marcelo Pereyra, Luis A. Vargas-Mieles, Konstantinos C. Zygalakis
2022 arXiv   pre-print
The proposed methodology is derived from the Langevin diffusion process and stems from tightly integrating two state-of-the-art proximal Langevin MCMC samplers, SK-ROCK and split Gibbs sampling (SGS),  ...  This paper proposes a new accelerated proximal Markov chain Monte Carlo (MCMC) methodology to perform Bayesian computation efficiently in imaging inverse problems.  ...  If E_{x|y,θ,ρ²}[h(x)] is not available analytically, we would draw samples from the conditional x|y, Z_i, θ, ρ² and apply a standard Monte Carlo estimator.  ... 
arXiv:2206.13894v1 fatcat:sxcqjwdi5bhc3fxfpbc6fazmc4

Efficient Bayesian computation for low-photon imaging problems [article]

Savvas Melidonis, Paul Dobson, Yoann Altmann, Marcelo Pereyra, Konstantinos C. Zygalakis
2022 arXiv   pre-print
This paper studies a new and highly efficient Markov chain Monte Carlo (MCMC) methodology to perform Bayesian inference in low-photon imaging problems, with particular attention to situations involving  ...  hard non-negativity constraints, non-smooth priors, and log-likelihood terms with exploding gradients.  ...  The resulting proximal Markov chain Monte Carlo algorithms are presented in Section 4.  ... 
arXiv:2206.05350v1 fatcat:bbcsdj4g5jddfmctpin44dmlk4

Composite Logconcave Sampling with a Restricted Gaussian Oracle [article]

Ruoqi Shen, Kevin Tian, Yin Tat Lee
2020 arXiv   pre-print
The restricted Gaussian oracle, which draws samples from a distribution whose negative log-likelihood sums a quadratic and g, has been previously studied and is a natural extension of the proximal oracle  ...  For f with condition number κ, our algorithm runs in O(κ² d log²(κd/ε)) iterations, each querying a gradient of f and a restricted Gaussian oracle, to achieve total variation distance ε.  ...  Sampling from a log-concave distribution with compact support with proximal langevin monte carlo.  ... 
arXiv:2006.05976v1 fatcat:3wqsaa627vgjxnnc5m7pgmto7i

Stochastic Gradient MCMC for Nonlinear State Space Models [article]

Christopher Aicher, Srshti Putcha, Christopher Nemeth, Paul Fearnhead, Emily B. Fox
2019 arXiv   pre-print
We present error bounds that account for both buffering error and particle error in the case of nonlinear SSMs that are log-concave in the latent process.  ...  The challenge is two-fold: not only do computations scale linearly with time, as in the linear case, but particle filters additionally suffer from increasing particle degeneracy with longer series.  ...  This work was supported in part by ONR Grants N00014-15-1-2380 and N00014-18-1-2862, NSF CAREER Award IIS-1350133, and EPSRC  ... 
arXiv:1901.10568v2 fatcat:g76uz7rdhremfasffbsdzzy3vy

Efficient stochastic optimisation by unadjusted Langevin Monte Carlo

Valentin De Bortoli, Alain Durmus, Marcelo Pereyra, Ana F. Vidal
2021 Statistics and computing  
Combined with Markov chain Monte Carlo algorithms, these stochastic optimisation methods have been successfully applied to a wide range of problems in science and industry.  ...  However, this strategy scales poorly to large problems because of methodological and theoretical difficulties related to using high-dimensional Markov chain Monte Carlo algorithms within a stochastic approximation  ...  Langevin Markov chain Monte Carlo methods: Langevin MCMC schemes to sample from p(x|y, θ) are based on stochastic continuous dynamics (X_t^θ)_{t≥0} for which the target distribution p(x|y, θ) is invariant  ... 
doi:10.1007/s11222-020-09986-y fatcat:qdesu72pofarpdmimiejbbx3xy

Implicit Langevin Algorithms for Sampling From Log-concave Densities [article]

Liam Hodgkinson, Robert Salomone, Fred Roosta
2021 arXiv   pre-print
For sampling from a log-concave density, we study implicit integrators resulting from θ-method discretization of the overdamped Langevin diffusion stochastic differential equation.  ...  Numerical examples supporting our theoretical analysis are also presented.  ...  Fred Roosta was partially supported by the Australian Research Council through a Discovery Early Career Researcher Award (DE180100923).  ... 
arXiv:1903.12322v2 fatcat:taby5mkhf5cjpcdkgc6h4op6wq
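The θ-method discretisation studied above interpolates between the explicit Euler scheme (θ = 0) and the fully implicit one (θ = 1). A minimal 1-D sketch, assuming a quadratic potential and using a plain fixed-point iteration to solve the implicit equation (the choice of solver is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(3)

def theta_step(x, grad_f, eta, theta, n_fixed_point=20):
    # x_{k+1} = x_k - eta*[(1-theta)*∇f(x_k) + theta*∇f(x_{k+1})] + sqrt(2*eta)*xi
    xi = np.sqrt(2 * eta) * rng.standard_normal()
    base = x - eta * (1 - theta) * grad_f(x) + xi
    y = x
    for _ in range(n_fixed_point):
        # fixed-point iteration for the implicit part; contracts when eta*theta*L < 1
        y = base - eta * theta * grad_f(y)
    return y

def theta_langevin(grad_f, n_iter, eta=0.1, theta=1.0, x0=0.0):
    x, out = x0, []
    for _ in range(n_iter):
        x = theta_step(x, grad_f, eta, theta)
        out.append(x)
    return np.array(out)

samples = theta_langevin(lambda x: x, 20000)   # f(x) = x^2/2
```

With θ = 1 and a quadratic potential, each step reduces to x_{k+1} = (x_k + ξ)/(1 + η), which remains stable for step sizes where the explicit scheme would diverge; this unconditional stability is a key motivation for implicit integrators.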

Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie [article]

Rémi Laumont, Valentin de Bortoli, Andrés Almansa, Julie Delon, Alain Durmus, Marcelo Pereyra
2022 arXiv   pre-print
We introduce two algorithms: 1) PnP-ULA (Unadjusted Langevin Algorithm) for Monte Carlo sampling and MMSE inference; and 2) PnP-SGD (Stochastic Gradient Descent) for MAP inference.  ...  These methods derive Minimum Mean Square Error (MMSE) or Maximum A Posteriori (MAP) estimators for inverse problems in imaging by combining an explicit likelihood function with a prior that is implicitly  ...  These ACF plots measure how fast samples become uncorrelated. A fast decay of the ACF is associated with good Markov chain mixing, which in turn implies accurate Monte Carlo estimates.  ... 
arXiv:2103.04715v6 fatcat:4xryxmhvd5gvxa6xujnnp3d5w4

Projected Stochastic Gradient Langevin Algorithms for Constrained Sampling and Non-Convex Learning [article]

Andrew Lamperski
2020 arXiv   pre-print
Other work has examined projected Langevin algorithms for sampling from log-concave distributions restricted to convex compact sets.  ...  Langevin algorithms are gradient descent methods with additive noise. They have been used for decades in Markov chain Monte Carlo (MCMC) sampling, optimization, and learning.  ...  The author acknowledges funding from NASA STTR 19-1-T4.03-3451 and NSF CMMI 1727096  ... 
arXiv:2012.12137v1 fatcat:qnardluiozbubonhmkz5sc4wgq
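The projected Langevin scheme discussed above simply follows each noisy gradient step with a Euclidean projection back onto the constraint set. A minimal 1-D sketch, assuming a Gaussian potential restricted to the interval [−1, 1] (the set and step size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

def projected_sgld(grad_f, proj, n_iter, eta=0.05, x0=0.0):
    # noisy gradient step, then projection onto the convex constraint set
    x, out = x0, []
    for _ in range(n_iter):
        x = proj(x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal())
        out.append(x)
    return np.array(out)

# target ∝ exp(-x^2/2) restricted to [-1, 1]; projection is coordinate clipping
samples = projected_sgld(lambda x: x,
                         lambda v: np.clip(v, -1.0, 1.0),
                         20000)
```

The projection keeps every iterate feasible by construction, at the cost of an extra discretisation bias concentrated near the boundary.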

Primal Dual Interpretation of the Proximal Stochastic Gradient Langevin Algorithm [article]

Adil Salim, Peter Richtárik
2021 arXiv   pre-print
We consider the task of sampling with respect to a log-concave probability distribution.  ...  In the second part of this paper, we use the duality gap arising from the first part to study the complexity of the Proximal Stochastic Gradient Langevin Algorithm (PSGLA), which can be seen as a generalization  ...  Leb, our goal is to construct samples x_1, . . . , x_k from the posterior distribution µ(x|D_1, . . . , D_n) ∝ π(x) ∏_{i=1}^n L(D_i, x) (32), in order, e.g., to estimate the posterior mean via Monte Carlo  ... 
arXiv:2006.09270v2 fatcat:36lmqqukfzdq3anb4wsjy4k66u
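A PSGLA-style update can be sketched in 1-D: a (stochastic) gradient step on the smooth part, Gaussian noise, then the proximal operator of the non-smooth part. The composite target ∝ exp(−x²/2 − |x|) and the step size are illustrative assumptions, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(5)

def prox_abs(v, t):
    # soft-thresholding: prox of t*|x|
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def psgla(n_iter, eta=0.05, x0=0.0):
    # target ∝ exp(-f(x) - g(x)) with smooth f(x) = x^2/2, non-smooth g(x) = |x|;
    # each step: gradient step on f, add noise, then apply prox_{eta*g}
    x, out = x0, []
    for _ in range(n_iter):
        y = x - eta * x + np.sqrt(2 * eta) * rng.standard_normal()
        x = prox_abs(y, eta)
        out.append(x)
    return np.array(out)

samples = psgla(30000)
```

Applying the prox after the noise (rather than smoothing g inside the drift, as MYULA does) is exactly the forward-backward structure that the primal-dual analysis in the entry above interprets.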

Global Convergence of Langevin Dynamics Based Algorithms for Nonconvex Optimization [article]

Pan Xu and Jinghui Chen and Difan Zou and Quanquan Gu
2020 arXiv   pre-print
We present a unified framework to analyze the global convergence of Langevin dynamics based algorithms for nonconvex finite-sum optimization with n component functions.  ...  Our theoretical analyses shed some light on using Langevin dynamics based algorithms for nonconvex optimization with provable guarantees.  ...  As for sampling from distributions with compact support, Bubeck et al. [8] analyzed sampling from log-concave distributions via projected Langevin Monte Carlo, and Brosse et al. [7] proposed a proximal Langevin  ... 
arXiv:1707.06618v3 fatcat:ron6imxljbgwrelspvctnkyuey

A Proximal Algorithm for Sampling from Non-smooth Potentials [article]

Jiaming Liang, Yongxin Chen
2022 arXiv   pre-print
In this work, we examine sampling problems with non-smooth potentials. We propose a novel Markov chain Monte Carlo algorithm for sampling from non-smooth potentials.  ...  This framework requires the so-called restricted Gaussian oracle, which can be viewed as a sampling counterpart of the proximal mapping in convex optimization.  ...  Several widely used MCMC methods include Langevin Monte Carlo (LMC) [10, 17, 33, 36] , Metropolis-adjusted Langevin algorithm (MALA) [4, 35, 36] , and Hamiltonian Monte Carlo (HMC) [32] .  ... 
arXiv:2110.04597v2 fatcat:d2ewksvcfbdixdwkre6mlmdzde
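The restricted Gaussian oracle (RGO) mentioned above samples from a Gaussian tilted by the non-smooth potential, and alternating it with a forward Gaussian step yields a proximal sampling scheme whose x-marginal targets the non-smooth density exactly. A minimal 1-D sketch for g(x) = λ|x|, where the RGO can be implemented by simple rejection sampling (this construction is an illustrative assumption, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(6)

def rgo_abs(y, eta, lam=1.0):
    # RGO for g(x) = lam*|x|: sample ∝ exp(-g(x) - (x - y)^2 / (2*eta)).
    # Rejection sampling: propose x ~ N(y, eta), accept w.p. exp(-g(x)) <= 1.
    while True:
        x = y + np.sqrt(eta) * rng.standard_normal()
        if rng.uniform() < np.exp(-lam * abs(x)):
            return x

def proximal_sampler(n_iter, eta=0.5, lam=1.0, x0=0.0):
    # alternating scheme: y | x ~ N(x, eta), then x | y via the RGO;
    # the x-marginal of the joint is ∝ exp(-lam*|x|), a Laplace distribution
    x, out = x0, []
    for _ in range(n_iter):
        y = x + np.sqrt(eta) * rng.standard_normal()
        x = rgo_abs(y, eta, lam)
        out.append(x)
    return np.array(out)

samples = proximal_sampler(20000)
```

Note the sampler never needs a gradient of the potential, which is why this framework handles non-smooth g directly; the RGO plays the role that the proximal mapping plays in convex optimization.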
Showing results 1 – 15 of 89.