Recursive Monte Carlo and Variational Inference with Auxiliary Variables
[article]
2022
arXiv
pre-print
This paper presents recursive auxiliary-variable inference (RAVI), a new framework for exploiting flexible proposals, for example based on involved simulations or stochastic optimization, within Monte ...
A key challenge in applying Monte Carlo and variational inference (VI) is the design of proposals and variational families that are flexible enough to closely approximate the posterior, but simple enough ...
In this paper, we present a new framework, called Recursive Auxiliary-Variable Inference (RAVI), for incorporating more complex proposals, without tractable densities, into standard Monte Carlo and VI ...
arXiv:2203.02836v1
fatcat:l4iwitv55nfsrot6frsasyqv4y
Variational Latent-State GPT for Semi-supervised Task-Oriented Dialog Systems
[article]
2022
arXiv
pre-print
In this work, we establish Recursive Monte Carlo Approximation (RMCA) to the variational objective with non-Markovian inference model and prove its unbiasedness. ...
Variational training of VLS-GPT is both statistically and computationally more challenging than previous variational learning works for sequential latent variable models, which use turn-level first-order ...
arXiv:2109.04314v2
fatcat:j6toboql2bezjl67zibl5bcaey
Variational Bayesian inference of hidden stochastic processes with unknown parameters
[article]
2019
arXiv
pre-print
The posterior distribution of hidden states is approximated by a set of weighted particles generated by the sequential Monte Carlo (SMC) algorithm, involving sampling importance resampling ...
This paper adopts a machine learning approach to devise variational Bayesian inference for such scenarios. ...
Variational Bayesian inference of latent states is studied in [1, 12] , and modified in [8, 35] by assuming the joint density of latent states and of an auxiliary random variable. ...
arXiv:1911.00757v1
fatcat:oowed2t3xrco5aks55ndibcm6i
Deep Rao-Blackwellised Particle Filters for Time Series Forecasting
2020
Neural Information Processing Systems
Furthermore, we use an auxiliary variable approach with a decoder-type neural network that allows for more complex non-linear emission models and multivariate observations. ...
This work addresses efficient inference and learning in switching Gaussian linear dynamical systems using a Rao-Blackwellised particle filter and a corresponding Monte Carlo objective. ...
The second problem is addressed in [11] using an auxiliary variable between GLS states and emissions and stochastic variational inference to obtain a tractable objective function for learning. ...
dblp:conf/nips/KurleRBGG20
fatcat:htivtyyh4bdjxgvf4msuw5h5bi
Advancing Semi-Supervised Task Oriented Dialog Systems by JSA Learning of Discrete Latent Variable Models
[article]
2022
arXiv
pre-print
For semi-supervised learning of latent state TOD models, variational learning is often used, but suffers from the high variance of gradients propagated through discrete latent variables and ...
Recently, an alternative algorithm, called joint stochastic approximation (JSA), has emerged for learning discrete latent variable models with impressive performances. ...
However, for variational learning of discrete latent variable models, the Monte-Carlo gradient estimator for the inference model parameter is known to have high variance. ...
arXiv:2207.12235v1
fatcat:awekrj2mvjachjm2f47cln5thy
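The "high-variance Monte-Carlo gradient estimator" referred to in these abstracts is the score-function (REINFORCE) estimator. A minimal sketch for a single Bernoulli latent variable, with a toy objective `f` and all names chosen purely for illustration (none of this code comes from the JSA papers):

```python
import random

def score_function_grad(theta, f, n_samples=50000, seed=0):
    """Monte Carlo (REINFORCE) estimate of d/dtheta E_{z~Bernoulli(theta)}[f(z)],
    via the score-function identity E[f(z) * d/dtheta log p(z; theta)]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        z = 1 if rng.random() < theta else 0
        # d/dtheta log Bernoulli(z; theta) = z/theta - (1 - z)/(1 - theta)
        score = z / theta - (1 - z) / (1 - theta)
        total += f(z) * score
    return total / n_samples

# Toy objective: f(1) = 4, f(0) = 1, so the exact gradient is f(1) - f(0) = 3.
f = lambda z: 3.0 * z + 1.0
est = score_function_grad(0.3, f)
```

Even for this one-bit latent variable, individual terms range from f(1)/theta ≈ 13.3 down to −f(0)/(1 − theta) ≈ −1.4 around a true gradient of 3; this spread is the variance problem that JSA and baseline-based variance reduction target.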
Joint Stochastic Approximation learning of Helmholtz Machines
[article]
2018
arXiv
pre-print
Though with progress, model learning and performing posterior inference still remains a common challenge for using deep generative models, especially for handling discrete hidden variables. ...
This paper is mainly concerned with algorithms for learning Helmholtz machines, which are characterized by pairing the generative model with an auxiliary inference model. ...
We would like to thank Zhiqiang Tan for helpful discussions and the developers of Theano (Bergstra et al., 2010; Bastien et al., 2012) for their powerful software. ...
arXiv:1603.06170v2
fatcat:xftb6zzp4ng7bm3mo4cleps44m
Efficient method of moments estimation of a stochastic volatility model: A Monte Carlo study
1999
Journal of Econometrics
We perform an extensive Monte Carlo study of efficient method of moments (EMM) estimation of a stochastic volatility model. ...
Inference is sensitive to the choice of auxiliary model in small samples, but robust in larger samples. Specification tests and 't-tests' show little size distortion. ...
Acknowledgements We are grateful to Hyung-Kwon Chung for assistance and David Tom for programming the EGARCH model in C++. ...
doi:10.1016/s0304-4076(98)00049-9
fatcat:jj7bctfp65eoxei7oarorvnn2a
Efficient Gradient-Based Inference through Transformations between Bayes Nets and Neural Nets
[article]
2015
arXiv
pre-print
preferred and show how inference can be made robust. ...
Hierarchical Bayesian networks and neural networks with stochastic hidden units are commonly perceived as two separate types of models. ...
Acknowledgments The authors thank the reviewers for their excellent feedback and Joris Mooij, Ted Meeds and Taco Cohen for invaluable discussions and input. ...
arXiv:1402.0480v5
fatcat:fnsr4ig5snfsncxalf6igwgoae
Elements of Sequential Monte Carlo
2019
Foundations and Trends® in Machine Learning
Sequential Monte Carlo (SMC) is a class of methods that are tailored to solve statistical inference problems recursively. ...
However, as we shall see, we can make use of auxiliary variables to design target distributions that can help with inference. ...
doi:10.1561/2200000074
fatcat:imua2dv5p5e3bes7dqeqi6lxpq
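The recursive, auxiliary-variable flavor of SMC can be grounded with a minimal bootstrap particle filter for a toy linear-Gaussian state-space model. The model (x_t = 0.9·x_{t−1} + noise, y_t = x_t + noise) and every name below are illustrative assumptions, not code from the monograph:

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500, seed=0):
    """Minimal bootstrap particle filter for the toy model
        x_t = 0.9 * x_{t-1} + N(0, 1),   y_t = x_t + N(0, 1).
    Returns the filtered posterior means E[x_t | y_{1:t}] for each t."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # Propagate each particle through the transition model (the proposal).
        particles = [0.9 * x + rng.gauss(0.0, 1.0) for x in particles]
        # Weight by the observation likelihood N(y; x, 1), then normalize.
        weights = [math.exp(-0.5 * (y - x) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to combat weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means

ys = [0.5, 1.0, 1.5]
filtered = bootstrap_particle_filter(ys, n_particles=2000)
```

The propagate / weight / resample loop is the recursion the monograph formalizes; replacing the transition proposal with one that looks at y_t leads to the auxiliary and adapted SMC variants.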
Ladder Variational Autoencoders
[article]
2016
arXiv
pre-print
We propose a new inference model, the Ladder Variational Autoencoder, that recursively corrects the generative distribution by a data dependent approximate likelihood in a process resembling the recently ...
We show that this model provides state of the art predictive log-likelihood and tighter log-likelihood lower bound compared to the purely bottom-up inference in layered Variational Autoencoders and other ...
Acknowledgments This research was supported by the Novo Nordisk Foundation, Danish Innovation Foundation and the NVIDIA Corporation with the donation of TITAN X and Tesla K40 GPUs. ...
arXiv:1602.02282v3
fatcat:53z5qnfysbfcpmo6stklz7vwfy
Nonparametric belief propagation
2010
Communications of the ACM
NBP combines ideas from Monte Carlo [3] and particle filtering [6, 11] approaches for representing complex uncertainty in time series, with the popular belief propagation (BP) algorithm [37] for approximate ...
formulations are only useful when combined with efficient algorithms for inference and learning. ...
The intersection of variational and Monte Carlo methods for approximate inference remains an extremely active research area. ...
doi:10.1145/1831407.1831431
fatcat:lbsxkfdvwbgttc4qaybyxuywlm
Variational Inference with Continuously-Indexed Normalizing Flows
[article]
2021
arXiv
pre-print
CIFs do not possess a closed-form marginal density, and so, unlike standard flows, cannot be plugged in directly to a variational inference (VI) scheme in order to produce a more expressive family of approximate ...
However, we show here how CIFs can be used as part of an auxiliary VI scheme to formulate and train expressive posterior approximations in a natural way. ...
Arnaud Doucet is supported by the EPSRC CoSInES (COmputational Statistical INference for Engineering and Security) grant EP/R034710/1 ...
arXiv:2007.05426v2
fatcat:pik3fevfwvaglgbn6lbl5u4onu
Energy-Inspired Models: Learning with Sampler-Induced Distributions
[article]
2020
arXiv
pre-print
Moreover, EIMs allow us to generalize a recent connection between multi-sample variational lower bounds and auxiliary variable variational inference. ...
We show how recent variational bounds can be unified with EIMs as the variational family. ...
Acknowledgments We thank Ben Poole, Abhishek Kumar, and Diederick Kingma for helpful comments. We thank Matthias Bauer for answering implementation questions about LARS. ...
arXiv:1910.14265v2
fatcat:z72wzvq6areh7kw5q4rm56cpsy
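The "multi-sample variational lower bounds" this abstract generalizes are the IWAE-style bounds; in standard notation (chosen here for illustration), with K i.i.d. samples from a proposal q:

```latex
\log p(x) \;\ge\; \mathcal{L}_K \;=\;
\mathbb{E}_{z_1,\dots,z_K \stackrel{\text{iid}}{\sim} q(\cdot \mid x)}
\left[ \log \frac{1}{K} \sum_{k=1}^{K} \frac{p(x, z_k)}{q(z_k \mid x)} \right]
```

K = 1 recovers the ordinary ELBO and the bound tightens as K grows; the EIM view, per the abstract, reinterprets such bounds as exact variational inference with a sampler-induced (auxiliary-variable) family.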
Hierarchical Variational Models
[article]
2016
arXiv
pre-print
HVMs augment a variational approximation with a prior on its parameters, which allows it to capture complex structure for both discrete and continuous latent variables. ...
Black box variational inference allows researchers to easily prototype and evaluate an array of models. Recent advances allow such algorithms to scale to high dimensions. ...
NSF IIS-0745520, IIS-1247664, IIS-1009542, ONR N00014-11-1-0651, DARPA FA8750-14-2-0009, N66001-15-C-4032, Facebook, Adobe, Amazon, NVIDIA, the Porter Ogden Jacobus Fellowship, the Seibel Foundation, and ...
arXiv:1511.02386v2
fatcat:cyu73a35fbcdlbo57pynmxkrxe
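The "prior on its parameters" construction corresponds to the standard auxiliary-variable ELBO: writing the hierarchical family as q(z) = ∫ q(z|λ) q(λ) dλ, whose marginal density is intractable, one introduces an auxiliary model r(λ|z) (the notation here is generic, not copied from the paper):

```latex
\log p(x) \;\ge\;
\mathbb{E}_{q(\lambda)\,q(z \mid \lambda)}
\bigl[\log p(x, z) - \log q(z \mid \lambda)\bigr]
\;+\;
\mathbb{E}_{q(\lambda)\,q(z \mid \lambda)}
\bigl[\log r(\lambda \mid z) - \log q(\lambda)\bigr]
```

The gap relative to the ELBO of the marginal q(z) is the expected KL divergence from q(λ|z) to r(λ|z), so a richer auxiliary model r tightens the bound.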
A multilevel approach to control variates
2009
Journal of Computational Finance
To illustrate the technique, we price Asian put options in the Black-Scholes-Merton framework and show the control variate we prescribe is competitive with other commonly used control variates and dominates ...
Control variates are a popular technique for reducing the variance of Monte Carlo estimates. Recent literature has enlarged the set of potentially useful control variates. ...
Introduction Monte Carlo integration is often used to approximate the mean µ Y of a random variable Y from which samples can be drawn in a Monte Carlo simulation. ...
doi:10.21314/jcf.2009.201
fatcat:iruu57eexfb6rjfv3ar4iubm2e
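The variance-reduction idea in this entry can be sketched on a classic toy problem: estimating E[e^U] for U ~ Uniform(0, 1) (true value e − 1 ≈ 1.718) using U itself, whose mean 1/2 is known, as the control variate. This is a textbook illustration with invented names, not the multilevel construction from the paper:

```python
import math
import random

def mc_with_control_variate(n=10000, seed=0):
    """Estimate E[e^U], U ~ Uniform(0, 1), using Y = U as a control
    variate with known mean E[Y] = 0.5."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        u = rng.random()
        xs.append(math.exp(u))  # the quantity of interest
        ys.append(u)            # the control variate
    mx = sum(xs) / n
    my = sum(ys) / n
    # Estimate the optimal coefficient b* = Cov(X, Y) / Var(Y) from the samples.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    var_y = sum((y - my) ** 2 for y in ys) / (n - 1)
    b = cov / var_y
    # Adjusted estimator: plain mean minus b * (sample mean of Y - E[Y]).
    return mx - b * (my - 0.5)

est = mc_with_control_variate()
```

Because e^U and U are highly correlated, the adjusted estimator's variance is a small fraction of the plain Monte Carlo estimator's; the paper's multilevel approach is one way of enlarging the set of such correlated control variates.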
Showing results 1 — 15 out of 3,637 results