443 Hits in 5.5 sec

Understanding and Accelerating Particle-Based Variational Inference [article]

Chang Liu, Jingwei Zhuo, Pengyu Cheng, Ruiyi Zhang, Jun Zhu, Lawrence Carin
2019 arXiv   pre-print
We propose an acceleration framework and a principled bandwidth-selection method for general ParVIs; these are based on the developed theory and leverage the geometry of the Wasserstein space.  ...  Particle-based variational inference methods (ParVIs) have gained attention in the Bayesian inference literature, for their capacity to yield flexible and accurate approximations.  ...  Our acceleration remains first-order, and we consider the Wasserstein space P 2 (X ) so that with our theory the acceleration is applicable for all ParVIs.  ... 
arXiv:1807.01750v4 fatcat:zm73eyeyo5fvxic33qiq6a4i7e
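ParVI methods such as Stein variational gradient descent (SVGD) deterministically transport a set of particles along a kernelized gradient of the KL divergence. A minimal numpy sketch for a 1D standard-normal target (the kernel bandwidth `h`, step size `eps`, and particle count are illustrative choices, not values from the paper):

```python
import numpy as np

def svgd_step(x, score, h=0.5, eps=0.1):
    """One SVGD update: particles follow a kernelized Wasserstein
    gradient of the KL divergence toward the target distribution."""
    diff = x[:, None] - x[None, :]          # (n, n) pairwise x_i - x_j
    k = np.exp(-diff**2 / (2 * h**2))       # RBF kernel matrix
    grad_k = -diff / h**2 * k               # d k(x_i, x_j) / d x_i
    # phi_i = mean_j [ k(x_j, x_i) * score(x_j) + d/dx_j k(x_j, x_i) ]
    phi = (k.T @ score(x) + grad_k.sum(axis=0)) / len(x)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=200)   # particles start far off
score = lambda x: -x                           # target: standard normal N(0, 1)
for _ in range(500):
    x = svgd_step(x, score)
```

The first term in `phi` pulls particles toward high-density regions; the kernel-gradient term is the repulsive force that keeps them from collapsing onto the mode.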

De-randomizing MCMC dynamics with the diffusion Stein operator [article]

Zheyang Shen, Markus Heinonen, Samuel Kaski
2021 arXiv   pre-print
via steepest descent in the Wasserstein space.  ...  Approximate Bayesian inference estimates descriptors of an intractable target distribution - in essence, an optimization problem within a family of distributions.  ...  an analog of Nesterov acceleration for MCMC methods.  ... 
arXiv:2110.03768v1 fatcat:intcd2zyhffabcaf65wsmtf6mq

Accelerated Information Gradient flow [article]

Yifei Wang, Wuchen Li
2020 arXiv   pre-print
We present a framework for Nesterov's accelerated gradient flows in probability space.  ...  Numerical experiments, including Bayesian logistic regression and Bayesian neural network, show the strength of the proposed methods compared with state-of-the-art algorithms.  ...  In recent years, various first-order sampling methods based on generalized Wasserstein gradient directions have been proposed.  ... 
arXiv:1909.02102v2 fatcat:s53ixlts6fevvd37n35ryfggau
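The usual momentum analog on the sampling side augments each particle with a velocity, as in underdamped Langevin dynamics. The sketch below is a generic momentum-augmented Langevin sampler, not the paper's accelerated information gradient flow; the friction `gamma`, step size, and Gaussian target are illustrative assumptions:

```python
import numpy as np

def underdamped_langevin(score, x0, gamma=2.0, step=0.01, n_steps=4000, rng=None):
    """Momentum-augmented (underdamped) Langevin sampler: position and
    velocity variables, friction gamma, noise injected into the velocity."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_steps):
        v = v + step * (score(x) - gamma * v) \
              + np.sqrt(2 * gamma * step) * rng.normal(size=x.shape)
        x = x + step * v                      # semi-implicit Euler update
    return x

# Target: standard normal N(0, 1), so the score is -x.
samples = underdamped_langevin(lambda x: -x, x0=np.full(5000, 4.0))
```

The velocity variable lets particles coast through flat regions, the same mechanism by which Nesterov momentum accelerates convergence in optimization.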

Approximate Bayesian Computation with the Sliced-Wasserstein Distance [article]

Kimia Nadjahi, Valentin De Bortoli, Alain Durmus, Roland Badeau, Umut Şimşekli
2020 arXiv   pre-print
Approximate Bayesian Computation (ABC) is a popular method for approximate inference in generative models with intractable but easy-to-sample likelihood.  ...  We propose a new ABC technique, called Sliced-Wasserstein ABC and based on the Sliced-Wasserstein distance, which has better computational and statistical properties.  ...  inference methods.  ... 
arXiv:1910.12815v2 fatcat:p7u6viu3lzgyjdho2rtk5lwtmm
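The sliced-Wasserstein distance underlying SW-ABC reduces the multivariate problem to one-dimensional Wasserstein distances along random projections, each of which has a closed form via sorting. A minimal Monte-Carlo estimator, assuming equal sample sizes (`n_proj` is an illustrative choice):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    """Monte-Carlo estimate of the sliced Wasserstein-2 distance:
    project both samples onto random directions and average the
    closed-form 1D Wasserstein distance given by sorted matching."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)          # uniform direction on the sphere
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)        # 1D W2^2 via order statistics
    return np.sqrt(total / n_proj)

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
Y = rng.normal(size=(500, 3)) + 2.0             # shifted copy of the target
```

Each projection costs only a sort, which is what gives the sliced distance its computational edge over the full multivariate Wasserstein distance.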

Bayesian nonparametric model based clustering with intractable distributions: an ABC approach [article]

Mario Beraha, Riccardo Corradin
2021 arXiv   pre-print
Bayesian nonparametric mixture models offer a rich framework for model based clustering.  ...  In this case, most of the commonly used Markov chain Monte Carlo (MCMC) methods are not suitable.  ...  Inference for SDE models via approximate Bayesian computation.  ... 
arXiv:2112.10393v1 fatcat:ode6vv3eo5e6zgovd3ats2uz2e

Deep generative models for accelerated Bayesian posterior inference of microseismic events [article]

Alessio Spurio Mancini, Davide Piras, Michael P. Hobson, Benjamin Joachimi
2020 arXiv   pre-print
The surrogate models can accelerate Bayesian source inversion by emulating the forward modelling of the seismograms, thus replacing the expensive solution of the elastic wave equation at each likelihood  ...  We evaluate the performance of our methods on a testing set and compare them against state-of-the-art algorithms applied to the same data, showing that all of our models provide more accurate predictions  ...  Some of the computations have been performed on the Wilkes High Performance GPU computer cluster at the University of Cambridge; we are grateful to Stuart Rankin and Greg Willatt for their technical support  ... 
arXiv:2009.06758v1 fatcat:w3qwlgjlk5cl7dzd7cbrrmvrye

Accelerating proximal Markov chain Monte Carlo by using an explicit stabilised method [article]

Luis Vargas, Marcelo Pereyra, Konstantinos C. Zygalakis
2020 arXiv   pre-print
that combines several gradient evaluations to significantly accelerate its convergence speed, similarly to accelerated gradient optimisation methods.  ...  Similarly to previous proximal Monte Carlo approaches, the proposed method is derived from an approximation of the Langevin diffusion.  ...  In this paper we focus on the computational aspects of performing Bayesian inferences in imaging problems.  ... 
arXiv:1908.08845v3 fatcat:pydwwrulmzbejc7pwkhv2hglae

Algorithmic Parameter Estimation and Uncertainty Quantification for Hodgkin-Huxley Neuron Models [article]

Y. Curtis Wang, Nirvik Sinha, Johann Rudi, James Velasco, Gideon Idumah, Randall K. Powers, Charles J. Heckman, Matthieu Chardon
2021 bioRxiv   pre-print
We propose a novel method for parameter inference and uncertainty quantification in a Bayesian framework using the Markov chain Monte Carlo (MCMC) approach.  ...  Furthermore, using the adaptive parallel tempering strategy for MCMC, we tackle the highly nonlinear, noisy, and multimodal loss function, which depends on the HH neuron model.  ...  Acknowledgments We gratefully acknowledge the computing resources provided on Bebop, a high-performance com-  ... 
doi:10.1101/2021.11.18.469189 fatcat:bybufrygoncdbmsbthlockqiwq

ODE^2VAE: Deep generative second order ODEs with Bayesian neural networks [article]

Çağatay Yıldız, Markus Heinonen, Harri Lähdesmäki
2019 arXiv   pre-print
In order to account for uncertainty, we propose probabilistic latent ODE dynamics parameterized by deep Bayesian neural networks.  ...  Our model explicitly decomposes the latent space into momentum and position components and solves a second order ODE system, which is in contrast to recurrent neural network (RNN) based time series models  ...  Bayesian second-order ODEs First-order ODEs are incapable of modelling high-order dynamics 1 , such as acceleration or the motion of a pendulum.  ... 
arXiv:1905.10994v2 fatcat:nilzdi5neja6fawngf3gv5f7oa

Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks [article]

Jiaojiao Fan, Amirhossein Taghvaei, Yongxin Chen
2021 arXiv   pre-print
Our proposed algorithm is based on the Kantorovich dual formulation of the Wasserstein-2 distance as well as a recent neural network architecture, input convex neural network, that is known to parametrize  ...  We demonstrate the efficacy of our algorithm by comparing it with the state-of-art methods in multiple experiments.  ...  Remark 1 Obtaining convergence rate for first-order optimization algorithms solving (8) is challenging even in the ideal setting that the optimization is carried out in the function space and space of  ... 
arXiv:2007.04462v3 fatcat:3veauu3kfzabxawrvyak7b7qca

Information Newton's flow: second-order optimization method in probability space [article]

Yifei Wang, Wuchen Li
2020 arXiv   pre-print
For the numerical implementation, we design sampling-efficient variational methods in affine models and reproducing kernel Hilbert space (RKHS) to approximate Wasserstein Newton's directions.  ...  Several numerical examples from Bayesian sampling problems are shown to demonstrate the effectiveness of the proposed method.  ...  In short, we notice that the Langevin dynamics can be viewed as first-order methods for Bayesian sampling problems.  ... 
arXiv:2001.04341v4 fatcat:m6sxdl43rbh6vabq3dxwsxbjce

Understanding MCMC Dynamics as Flows on the Wasserstein Space [article]

Chang Liu, Jingwei Zhuo, Jun Zhu
2019 arXiv   pre-print
It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL divergence on the Wasserstein space, which helps convergence analysis and inspires recent particle-based variational inference  ...  We develop two ParVI methods for a particular MCMC dynamics and demonstrate the benefits in experiments.  ...  Preliminaries We first introduce the recipe for general MCMC dynamics (Ma et al., 2015) , and prior knowledge on flows on a smooth manifold M and its Wasserstein space P(M).  ... 
arXiv:1902.00282v3 fatcat:txe2bou5hbccvg5vlornpjvg2m
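The gradient-flow view mentioned here has a concrete algorithmic face: discretizing the Langevin diffusion gives the unadjusted Langevin algorithm (ULA), in which each particle drifts along the score and receives matched Gaussian noise. A minimal sketch for a Gaussian target (step size and iteration count are illustrative):

```python
import numpy as np

def ula(score, x0, step=0.01, n_steps=2000, rng=None):
    """Unadjusted Langevin algorithm: a time-discretization of the
    gradient flow of KL(q || p) on the Wasserstein space. Each particle
    follows the score (drift) plus Gaussian noise of matched scale."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2 * step) * rng.normal(size=x.shape)
    return x

# Target: N(2, 1), so the score is grad log p(x) = -(x - 2).
samples = ula(lambda x: -(x - 2.0), x0=np.zeros(5000))
```

Running many particles in parallel, as above, makes the Wasserstein picture literal: the empirical distribution of the particle cloud descends the KL divergence toward the target.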

Accelerating Bayesian microseismic event location with deep learning

Alessio Spurio Mancini, Davide Piras, Ana Margarida Godinho Ferreira, Michael Paul Hobson, Benjamin Joachimi
2021 Solid Earth  
In this paper we focus on accelerating this process by training deep-learning models to learn the mapping between source location and seismic traces for a given 3D heterogeneous velocity model and a fixed  ...  We present a series of new open-source deep-learning algorithms to accelerate Bayesian full-waveform point source inversion of microseismic events.  ...  Some of the computations have been performed on the Wilkes High Performance GPU computer cluster at the University of Cambridge; we are grateful to Stuart Rankin and Greg Willatt for their technical support  ... 
doi:10.5194/se-12-1683-2021 fatcat:wehjwnowdncxrpb7gckncu4ukq

Function Space Particle Optimization for Bayesian Neural Networks [article]

Ziyu Wang, Tongzheng Ren, Jun Zhu, Bo Zhang
2019 arXiv   pre-print
While Bayesian neural networks (BNNs) have drawn increasing attention, their posterior inference remains challenging, due to the high-dimensional and over-parameterized nature.  ...  To address this issue, several highly flexible and scalable variational inference procedures based on the idea of particle optimization have been proposed.  ...  L172037), Tsinghua Tiangong Institute for Intelligent Computing, the NVIDIA NVAIL Program, a project from Siemens, and a project from NEC.  ... 
arXiv:1902.09754v2 fatcat:amofwut2o5aaxn4kv5ihsk2fuu

Straight-Through Estimator as Projected Wasserstein Gradient Flow [article]

Pengyu Cheng, Chang Liu, Chunyuan Li, Dinghan Shen, Ricardo Henao, Lawrence Carin
2019 arXiv   pre-print
The Straight-Through (ST) estimator is a widely used technique for back-propagating gradients through discrete random variables. However, this effective method lacks theoretical justification.  ...  In this paper, we show that ST can be interpreted as the simulation of the projected Wasserstein gradient flow (pWGF).  ...  Instead of directly updating µ in the discrete distribution family, µ is first updated toμ on a larger Wasserstein distribution space where gradients are easier to compute.  ... 
arXiv:1910.02176v1 fatcat:5mc7ug5eavhwjee54liybv5r6i
Showing results 1 — 15 out of 443 results