
Improving Explorability in Variational Inference with Annealed Variational Objectives [article]

Chin-Wei Huang, Shawn Tan, Alexandre Lacoste, Aaron Courville
2018 arXiv   pre-print
We demonstrate the drawbacks of biasing the true posterior to be unimodal, and introduce Annealed Variational Objectives (AVO) into the training of hierarchical variational methods.  ...  Despite the advances in the representational capacity of approximate distributions for variational inference, the optimization process can still limit the density that is ultimately learned.  ...  Annealed variational objectives Inspired by AIS and alpha-annealing, we propose to integrate energy tempering into the optimization objective of the variational distribution.  ... 
arXiv:1809.01818v3 fatcat:t2azok2az5a7dlmvdaic6jdrim
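
The snippet above names the core construction: energy tempering, where each intermediate target interpolates between an easy initial density and the true (unnormalized) posterior. A minimal numpy sketch of such a geometric interpolation, assuming log f_β = (1 − β)·log f_0 + β·log f_T with an illustrative linear β schedule (the densities below are toy stand-ins, not the paper's hierarchical training procedure):

```python
import numpy as np

def log_annealed_target(z, beta, log_init, log_post):
    """log f_beta(z) = (1 - beta) * log f_0(z) + beta * log f_T(z)."""
    return (1.0 - beta) * log_init(z) + beta * log_post(z)

# Illustrative densities: anneal from a broad Gaussian toward a bimodal target.
log_init = lambda z: -0.5 * (z / 3.0) ** 2
log_post = lambda z: np.logaddexp(-0.5 * (z - 2.0) ** 2, -0.5 * (z + 2.0) ** 2)

betas = np.linspace(0.0, 1.0, 5)  # beta_0 = 0 (easy) ... beta_T = 1 (true target)
print([float(log_annealed_target(1.5, b, log_init, log_post)) for b in betas])
```

In AVO proper, each transition of a hierarchical variational distribution is trained against one such intermediate target; the sketch only shows the annealed density itself.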

Stochastic Annealing for Variational Inference [article]

San Gultekin, Aonan Zhang, John Paisley
2015 arXiv   pre-print
Variational inference is a deterministic approach to approximate posterior inference in Bayesian models in which a typically non-convex objective function is locally optimized over the parameters of the  ...  We empirically evaluate a stochastic annealing strategy for Bayesian posterior optimization with variational inference.  ...  These approaches perform a deterministic inflating of the variational entropy term, which shrinks with each iteration to allow for exploration of the variational objective in early iterations.  ... 
arXiv:1505.06723v1 fatcat:njqharlvmjgztgjr7zrit3mrju
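
The last sentence of the snippet describes the deterministic-annealing baseline: the variational entropy term is inflated by a temperature that decays to 1. A toy sketch for a one-dimensional Gaussian q, whose entropy is available in closed form (the family, schedule, and Monte Carlo estimator are illustrative assumptions):

```python
import numpy as np

def annealed_elbo(mu, sigma, log_joint, T, n_samples=256, seed=0):
    """L_T(q) = E_q[log p(x, z)] + T * H(q); T -> 1 recovers the ordinary ELBO."""
    rng = np.random.default_rng(seed)
    z = mu + sigma * rng.standard_normal(n_samples)
    expected_log_joint = log_joint(z).mean()
    entropy = 0.5 * np.log(2.0 * np.pi * np.e * sigma ** 2)  # entropy of N(mu, sigma^2)
    return expected_log_joint + T * entropy

# Start hot to encourage exploration, then cool to T = 1.
for T in np.linspace(5.0, 1.0, 5):
    print(f"T={T:.1f}  annealed ELBO ~ {annealed_elbo(0.0, 1.0, lambda z: -0.5 * z**2, T):.3f}")
```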

Proximity Variational Inference [article]

Jaan Altosaar, Rajesh Ranganath, David M. Blei
2017 arXiv   pre-print
In this paper, we develop proximity variational inference (PVI).  ...  We study PVI on a Bernoulli factor model and sigmoid belief network with both real and synthetic data and compare to deterministic annealing (Katahira et al., 2008).  ...  Acknowledgments The experiments presented in this article were performed on computational resources supported by the Princeton Institute for Computational Science and Engineering (PICSciE), the Office  ... 
arXiv:1705.08931v1 fatcat:3by4iph6p5agdjqi6ofuotommy
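
PVI constrains each update to stay near slowly-changing statistics of the previous variational parameters. A toy gradient-ascent sketch with a squared-distance proximity term (the distance, the weight k, and the quadratic toy ELBO are illustrative assumptions, not the paper's choices):

```python
import numpy as np

def pvi_step(lam, lam_prev, elbo_grad, k=1.0, lr=0.01):
    # Ascend ELBO(lam) - (k / 2) * ||lam - lam_prev||^2.
    return lam + lr * (elbo_grad(lam) - k * (lam - lam_prev))

elbo_grad = lambda l: -(l - np.array([1.0, -2.0]))  # toy quadratic ELBO, optimum [1, -2]
lam = lam_prev = np.zeros(2)
for _ in range(500):
    lam, lam_prev = pvi_step(lam, lam_prev, elbo_grad), lam
print(lam)  # approaches [1, -2], with the proximity term damping each move
```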

Cyclical Variational Bayes Monte Carlo for Efficient Multi-Modal Posterior Distributions Evaluation [article]

Felipe Igea, Alice Cicirello
2022 arXiv   pre-print
In this paper, the Variational Bayesian Monte Carlo (VBMC) method is investigated with the purpose of dealing with statistical model updating problems in engineering involving expensive-to-run models.  ...  This method combines active-sampling Bayesian quadrature with Gaussian-process-based variational inference to yield a non-parametric estimate of the posterior distribution of the identified parameters  ...  the variational objective.  ... 
arXiv:2202.11645v1 fatcat:vmawn6l2k5e7dbowzje77h2kkm
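
The VBMC machinery itself (a Gaussian-process surrogate plus Bayesian quadrature) is too involved to sketch here, but the "cyclical" ingredient in the title suggests a schedule that repeatedly ramps, letting the approximation re-explore modes. A purely illustrative guess at such a schedule:

```python
def cyclical_weight(step, total_steps, n_cycles=4):
    """Ramp from 0 to 1 in each cycle, then hold at 1 until the cycle ends."""
    period = total_steps / n_cycles
    phase = (step % period) / period  # position within the current cycle
    return min(1.0, 2.0 * phase)

print([round(cyclical_weight(s, 100), 2) for s in range(0, 100, 10)])
```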

Variational Tempering [article]

Stephan Mandt, James McInerney, Farhan Abrol, Rajesh Ranganath, David Blei
2016 arXiv   pre-print
Variational inference (VI) combined with data subsampling enables approximate posterior inference over large data sets, but suffers from poor local optima.  ...  In contrast to related work in the Markov chain Monte Carlo literature, this algorithm results in adaptive annealing schedules.  ...  To show that VT and LVT find better local optima in the original variational objective, we predict with the non-tempered model. Deterministic annealing provides a significant improvement over standard  ... 
arXiv:1411.1810v4 fatcat:nwzwfasy7bejpp5jg53ytsw6pa
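
In tempered variational objectives the likelihood enters the ELBO raised to 1/T, so high temperatures flatten the landscape early in training; VT's contribution is to infer T rather than fix a schedule. A sketch of the tempered ELBO estimator only, with a hand-set temperature standing in for the inferred one (toy densities, additive constants dropped):

```python
import numpy as np

def tempered_elbo(z, log_lik, log_prior, log_q, T):
    """E_q[(1/T) * log p(x|z) + log p(z) - log q(z)]."""
    return np.mean(log_lik(z) / T + log_prior(z) - log_q(z))

rng = np.random.default_rng(0)
z = rng.standard_normal(1000)              # samples from q = N(0, 1)
log_lik = lambda z: -2.0 * (z - 1.0) ** 2  # toy likelihood term
log_prior = lambda z: -0.5 * z ** 2
log_q = lambda z: -0.5 * z ** 2            # densities up to additive constants
for T in (10.0, 2.0, 1.0):                 # cooling recovers the untempered ELBO
    print(T, round(float(tempered_elbo(z, log_lik, log_prior, log_q, T)), 3))
```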

Improving Inference for Neural Image Compression [article]

Yibo Yang, Robert Bamler, Stephan Mandt
2021 arXiv   pre-print
We propose remedies for each of these three limitations based on ideas related to iterative inference, stochastic annealing for discrete optimization, and bits-back coding, resulting in the first application  ...  Drawing on the variational inference perspective on compression, we identify three approximation gaps which limit performance in the conventional approach: an amortization gap, a discretization gap, and  ...  Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Defense Advanced Research Projects Agency  ... 
arXiv:2006.04240v4 fatcat:2c45ftyttfacfdwfmu3pshjqfi
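
Of the three remedies named above, stochastic annealing for discrete optimization is the most self-contained: a continuous latent is rounded stochastically, with a temperature that sharpens toward hard rounding. An illustrative sketch of that idea (the sigmoid parameterization and the schedule are assumptions, not the paper's exact estimator):

```python
import numpy as np

def stochastic_round(y, tau, rng):
    """Round stochastically; tau -> 0 recovers deterministic rounding."""
    lo = np.floor(y)
    frac = y - lo
    p_up = 1.0 / (1.0 + np.exp(-(frac - 0.5) / tau))  # favor the nearer integer
    return lo + (rng.random(y.shape) < p_up)

rng = np.random.default_rng(0)
y = np.array([0.2, 0.5, 0.8])
for tau in (1.0, 0.3, 0.05):  # annealing: randomness shrinks as tau decreases
    print(tau, stochastic_round(y, tau, rng))
```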

Annealed Stein Variational Gradient Descent [article]

Francesco D'Angelo, Vincent Fortuin
2021 arXiv   pre-print
In particular, Stein variational gradient descent has gained attention in the approximate inference literature for its flexibility and accuracy.  ...  We propose an annealing schedule to solve these issues and show, through various experiments, how this simple solution leads to significant improvements in mode coverage, without invalidating any theoretical  ...  Indeed, annealing approaches have been shown to be beneficial in both sampling and optimization problems for highly non-convex objectives.  ... 
arXiv:2101.09815v3 fatcat:tckewlv22bfh7bkleykwoo7q7i
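
In annealed SVGD the driving force scales the score by a schedule γ(t) that ramps from 0 to 1, so kernel repulsion dominates early (particles spread out) and the target takes over late. A small numpy sketch for one-dimensional particles (kernel bandwidth, step size, and the double-well target are illustrative choices):

```python
import numpy as np

def svgd_step(x, grad_log_p, gamma, eps=0.05, h=1.0):
    diff = x[:, None] - x[None, :]      # pairwise differences, shape (n, n)
    K = np.exp(-diff ** 2 / (2.0 * h))  # RBF kernel k(x_j, x_i)
    grad_K = -diff / h * K              # d k(x_j, x_i) / d x_j
    phi = (K @ (gamma * grad_log_p(x)) + grad_K.sum(axis=0)) / len(x)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.1, size=50)        # all particles start near one point
glp = lambda x: -4.0 * x * (x ** 2 - 1)  # score of a double-well density
for t in range(400):
    x = svgd_step(x, glp, gamma=min(1.0, t / 200))
print(x.mean(), (x > 0).mean())          # particles should end up in both wells
```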

Variational Autoencoders for Collaborative Filtering

Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, Tony Jebara
2018 Proceedings of the 2018 World Wide Web Conference on World Wide Web - WWW '18  
We introduce a generative model with multinomial likelihood and use Bayesian inference for parameter estimation.  ...  Finally, we identify the pros and cons of employing a principled Bayesian inference approach and characterize settings where it provides the most significant improvements.  ...  Amortized inference and the variational autoencoder: With variational inference the number of parameters to optimize, {μ_u, σ²_u}, grows with the number of users and items in the dataset.  ... 
doi:10.1145/3178876.3186150 dblp:conf/www/LiangKHJ18 fatcat:baidkwo2kvaldh3mr4meqlbxaa
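
The model pairs a multinomial likelihood over a user's click vector with a β-weighted KL term; the companion arXiv record below notes that β can be tuned by annealing. A sketch of that loss (the toy inputs are illustrative; in the real model, logits, mu, and logvar come from neural networks):

```python
import numpy as np

def multinomial_vae_loss(x, logits, mu, logvar, beta):
    m = logits.max(-1, keepdims=True)                      # stable log-softmax
    log_softmax = logits - m - np.log(np.exp(logits - m).sum(-1, keepdims=True))
    log_lik = (x * log_softmax).sum(-1)                    # multinomial term
    kl = 0.5 * (np.exp(logvar) + mu ** 2 - 1.0 - logvar).sum(-1)  # KL to N(0, I)
    return float(-(log_lik - beta * kl).mean())

x = np.array([[1.0, 0.0, 1.0, 0.0]])         # one user's click vector
logits = np.array([[0.5, -1.0, 0.7, -0.8]])  # decoder output (stand-in)
mu, logvar = np.zeros((1, 2)), np.zeros((1, 2))
print(multinomial_vae_loss(x, logits, mu, logvar, beta=0.2))
```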

Variational Autoencoders for Collaborative Filtering [article]

Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, Tony Jebara
2018 arXiv   pre-print
Finally, we identify the pros and cons of employing a principled Bayesian inference approach and characterize settings where it provides the most significant improvements.  ...  Remarkably, there is an efficient way to tune the parameter using annealing.  ...  Amortized inference and the variational autoencoder: With variational inference the number of parameters to optimize, {μ_u, σ²_u}, grows with the number of users and items in the dataset.  ... 
arXiv:1802.05814v1 fatcat:qtdx2jcdfvdbjmfdtprcjxwasi

A General Method for Amortizing Variational Filtering [article]

Joseph Marino, Milan Cvitkovic, Yisong Yue
2018 arXiv   pre-print
The algorithm is derived from the variational objective in the filtering setting and consists of an optimization procedure at each time step.  ...  By performing each inference optimization procedure with an iterative amortized inference model, we obtain a computationally efficient implementation of the algorithm, which we call amortized variational  ...  Each data set contains between 100 and 1,000 songs, with each song  ...  Acknowledgments We would like to thank Matteo Ruggero Ronchi for helpful discussions.  ... 
arXiv:1811.05090v1 fatcat:q57icsvogvatlpsh4saju6t55y
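
The algorithm description above has a simple skeleton: at every time step, run a short inner inference optimization, implemented with an iterative amortized inference model. A toy sketch of that loop (update_model is a hypothetical stand-in for the learned inference model):

```python
def variational_filter(observations, init_params, update_model, n_inf_steps=3):
    params, trajectory = init_params, []
    for x_t in observations:
        for _ in range(n_inf_steps):       # inner inference optimization
            params = update_model(params, x_t)
        trajectory.append(params)
    return trajectory

# Toy update model that just moves the estimate toward the observation.
update_model = lambda p, x: p + 0.5 * (x - p)
print(variational_filter([0.0, 1.0, 2.0], 0.0, update_model))
```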

Conditional Variational Autoencoder for Neural Machine Translation [article]

Artidoro Pagnoni, Kevin Liu, Shangyan Li
2018 arXiv   pre-print
We extend this model with a co-attention mechanism motivated by Parikh et al. in the inference network.  ...  We show that our conditional variational model improves upon both discriminative attention-based translation and the variational baseline presented in Zhang et al.  ...  1. KL cost annealing and 2. masking parts of the source and target tokens with '<unk>' symbols in order to strengthen the inference network by weakening the decoder ("word dropout").  ... 
arXiv:1812.04405v1 fatcat:ct27stge5vhi3h7outuxhwx75u
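
Both tricks named in the snippet are easy to sketch: a KL-cost weight that ramps up over training, and word dropout that replaces tokens with '<unk>' to weaken the decoder. The dropout rate and warmup length below are illustrative assumptions:

```python
import numpy as np

def kl_weight(step, warmup=10000):
    return min(1.0, step / warmup)  # linear KL-cost annealing

def word_dropout(tokens, rate, rng, unk="<unk>"):
    return [unk if rng.random() < rate else t for t in tokens]

rng = np.random.default_rng(0)
print(kl_weight(2500))
print(word_dropout("we extend this model".split(), 0.3, rng))
```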

A global-local approach for detecting hotspots in multiple-response regression

Hélène Ruffieux, Anthony C. Davison, Jörg Hager, Jamie Inshaw, Benjamin P. Fairfax, Sylvia Richardson, Leonardo Bottolo
2020 Annals of Applied Statistics  
Inference is carried out using a fast variational algorithm coupled with a novel simulated annealing procedure that allows efficient exploration of multimodal distributions.  ...  We tackle modelling and inference for variable selection in regression problems with many predictors and many responses.  ...  We are grateful to the editor and the two anonymous referees for their valuable comments that improved the presentation of the paper.  ... 
doi:10.1214/20-aoas1332 pmid:34992707 pmcid:PMC7612176 fatcat:g4attnp63zefncht3iammcbe4e

Understanding Posterior Collapse in Generative Latent Variable Models

James Lucas, George Tucker, Roger B. Grosse, Mohammad Norouzi
2019 International Conference on Learning Representations  
We show that training a linear VAE with variational inference recovers a uniquely identifiable global maximum corresponding to the principal component directions.  ...  Posterior collapse in Variational Autoencoders (VAEs) arises when the variational distribution closely matches the uninformative prior for a subset of latent variables.  ...  In Figure 7 we show the training ELBO for the standard ELBO objective and training with KL-annealing. In each case, σ² is learned online.  ... 
dblp:conf/iclr/LucasTGN19 fatcat:fqbj5ick3nhn3ohokfsdb5avmq
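
By the definition in the snippet, a latent dimension has collapsed when its posterior KL to the prior is near zero across the data. A small diagnostic sketch in that spirit (the threshold is an illustrative choice):

```python
import numpy as np

def collapsed_dims(mu, logvar, eps=0.01):
    # Per-dimension KL( N(mu, sigma^2) || N(0, 1) ), averaged over the batch.
    kl = 0.5 * (np.exp(logvar) + mu ** 2 - 1.0 - logvar).mean(axis=0)
    return np.where(kl < eps)[0]

mu = np.array([[0.9, 0.0], [1.1, 0.0]])       # dimension 1 carries no signal
logvar = np.array([[-1.0, 0.0], [-1.0, 0.0]])
print(collapsed_dims(mu, logvar))             # -> [1]
```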

On the challenges of learning with inference networks on sparse, high-dimensional data [article]

Rahul G. Krishnan, Dawen Liang, Matthew Hoffman
2017 arXiv   pre-print
We propose methods to tackle it via iterative optimization inspired by stochastic variational inference (Hoffman et al., 2013) and improvements in the sparse data representation used for inference.  ...  We study the extent of underfitting, highlighting that its severity increases with the sparsity of the data.  ...  Salimans et al. explore warm-starting MCMC with the output of an inference network.  ... 
arXiv:1710.06085v1 fatcat:yker3l2jobbmnncibrxywzhvqe
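
The proposed remedy reads as: initialize the variational parameters from the inference network, then refine them with a few per-datapoint gradient steps on the ELBO, in the style of stochastic variational inference. A toy sketch (learning rate, step count, and the one-dimensional toy gradient are assumptions):

```python
def refine(x, encoder, elbo_grad, n_steps=20, lr=0.05):
    params = encoder(x)                          # amortized initialization
    for _ in range(n_steps):
        params = params + lr * elbo_grad(params, x)  # per-datapoint refinement
    return params

encoder = lambda x: 0.0                  # deliberately poor initialization
elbo_grad = lambda p, x: -(p - x)        # toy ELBO gradient with optimum at x
print(refine(2.0, encoder, elbo_grad))   # moves from 0 toward 2
```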

Technically, the selection of simulated annealing parameters (for instance, the initial temperature, the temperature-decreasing function, and the function used to generate new points) only affects the shape of the plot and how fast the model converges. From several trials, ...

Thành Bùi Quang
2018 VNU Journal of Science Earth and Environmental Sciences  
It is normally used in combination with optimization algorithms for tuning its parameters to generate optimal objective values.  ...  This study proposed a novel method using simulated annealing to improve ANFIS performance.  ...  Adaptive Fuzzy Inference System (ANFIS): This technique was first introduced in the early 1990s and has been widely used in a variety of research topics.  ... 
doi:10.25073/2588-1094/vnuees.4304 fatcat:5qyqi3bbdze4bij576s3fafcpe
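
The entry discusses exactly the knobs of a standard simulated-annealing loop: initial temperature, temperature-decreasing function, and the function that generates new points. A generic sketch of that loop on a toy objective (it does not tune ANFIS parameters; all settings are illustrative):

```python
import numpy as np

def simulated_annealing(f, x0, T0=1.0, cool=0.95, n_iters=500, seed=0):
    rng = np.random.default_rng(seed)
    x, fx, T = x0, f(x0), T0
    for _ in range(n_iters):
        x_new = x + rng.normal(0.0, 0.5)  # generate a new candidate point
        fx_new = f(x_new)
        # Always accept improvements; accept worse moves with prob e^(-dE/T).
        if fx_new < fx or rng.random() < np.exp(-(fx_new - fx) / T):
            x, fx = x_new, fx_new
        T *= cool                         # temperature-decreasing function
    return x, fx

print(simulated_annealing(lambda x: (x - 3.0) ** 2, x0=0.0))
```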