
Validated Variational Inference via Practical Posterior Error Bounds [article]

Jonathan H. Huggins, Mikołaj Kasprzak, Trevor Campbell, Tamara Broderick
2020 arXiv   pre-print
In this paper, we provide rigorous bounds on the error of posterior mean and uncertainty estimates that arise from full-distribution approximations, as in variational inference.  ...  We show that our analysis naturally leads to a new and improved workflow for validated variational inference.  ...  Acknowledgements The authors thank Sushrutha Reddy for pointing out some improvements to our Wasserstein bounds on the standard deviation and variance, and also Daniel Simpson, Lester Mackey, Arthur Gretton  ... 
arXiv:1910.04102v4 fatcat:wwq4xlhfdzeajhhyijyxdbli5e
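
For orientation, the "error of posterior mean and uncertainty estimates" is controlled in this line of work through Wasserstein distances. The following standard relations (stated from first principles; the paper's own constants and conditions may differ) illustrate the pattern, for a one-dimensional marginal with means and standard deviations (m_p, σ_p) and (m_q, σ_q):

$$|m_p - m_q| \le W_1(p, q), \qquad |\sigma_p - \sigma_q| \le W_2(p, q).$$

The first holds because the identity map is 1-Lipschitz; the second follows from any coupling via Cauchy-Schwarz. The workflow then reduces to bounding $W_1$ or $W_2$ by computable quantities.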

Variational Inference via χ-Upper Bound Minimization [article]

Adji B. Dieng, Dustin Tran, Rajesh Ranganath, John Paisley, David M. Blei
2017 arXiv   pre-print
Variational inference (VI) is widely used as an efficient alternative to Markov chain Monte Carlo.  ...  In this paper we propose CHIVI, a black-box variational inference algorithm that minimizes D_χ(p || q), the χ-divergence from p to q.  ...  Variational inference approximates the posterior using optimization.  ... 
arXiv:1611.00328v4 fatcat:u2ibwbhmdbccplzh4rjwytddbu
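
A minimal Monte Carlo sketch of the χ upper bound (CUBO) that CHIVI minimizes. The callables log_joint, log_q, and sample_q are assumed user-supplied and return NumPy arrays; this illustrates the bound itself, not the paper's optimization loop.

```python
import numpy as np

def cubo(log_joint, log_q, sample_q, n=2.0, num_samples=1000):
    """Estimate CUBO_n = (1/n) * log E_q[(p(x,z)/q(z))^n].

    For n >= 1 this upper-bounds log p(x) (n = 2 is the chi^2 case
    CHIVI minimizes); by contrast the ELBO lower-bounds it.
    """
    z = sample_q(num_samples)            # draws from the variational q
    log_w = log_joint(z) - log_q(z)      # log importance weights
    scaled = n * log_w
    m = scaled.max()                     # log-sum-exp for numerical stability
    return (m + np.log(np.mean(np.exp(scaled - m)))) / n
```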

Gaussian Mean Field Regularizes by Limiting Learned Information

Julius Kunze, Louis Kirsch, Hippolyt Ritter, David Barber
2019 Entropy  
Variational inference with a factorized Gaussian posterior estimate is a widely used approach for learning parameters and hidden variables.  ...  We quantify a maximum capacity when the posterior variance is either fixed or learned and connect it to generalization error, even when the KL-divergence in the objective is scaled by a constant.  ...  via Gaussian mean field inference.  ...
doi:10.3390/e21080758 pmid:33267472 pmcid:PMC7515287 fatcat:oj7kzldvxbe3zimh3m32yqdpry
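
The "maximum capacity" result concerns the KL term of the mean-field objective. As a point of reference, a sketch of that KL for a factorized Gaussian posterior against an isotropic Gaussian prior (the standard closed form, not code from the paper):

```python
import numpy as np

def mean_field_kl(mu, sigma2, prior_sigma2=1.0):
    # KL( N(mu, diag(sigma2)) || N(0, prior_sigma2 * I) ), summed over
    # dimensions. In the paper's information-theoretic reading, this term
    # caps how many nats the posterior can carry about the data, which is
    # what connects mean-field inference to regularization.
    r = sigma2 / prior_sigma2
    return 0.5 * np.sum(r + mu**2 / prior_sigma2 - 1.0 - np.log(r))
```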

Gaussian Mean Field Regularizes by Limiting Learned Information [article]

Julius Kunze, Louis Kirsch, Hippolyt Ritter, David Barber
2019 arXiv   pre-print
Variational inference with a factorized Gaussian posterior estimate is a widely used approach for learning parameters and hidden variables.  ...  We quantify a maximum capacity when the posterior variance is either fixed or learned and connect it to generalization error, even when the KL-divergence in the objective is rescaled.  ...  We validated its practicality for both supervised and unsupervised learning.  ... 
arXiv:1902.04340v1 fatcat:nn3phevt25gktiognm46ckzkt4

Variational Inference with Hölder Bounds [article]

Junya Chen, Danni Lu, Zidi Xiu, Ke Bai, Lawrence Carin, Chenyang Tao
2021 arXiv   pre-print
The recent introduction of thermodynamic integration techniques has provided a new framework for understanding and improving variational inference (VI).  ...  In particular, we elucidate how the TVO naturally connects the three key variational schemes, namely importance-weighted VI, Rényi-VI, and MCMC-VI, which together subsume most VI objectives employed in practice  ...  Variational inference leverages a posterior approximation to derive a lower bound on the log-evidence of the observed data, and it can be efficiently optimized.  ...
arXiv:2111.02947v2 fatcat:cms2wta3z5fxja3mpubwgnioeq
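
A minimal sketch of the thermodynamic variational objective (TVO) machinery this abstract builds on: the left Riemann sum of the thermodynamic integral log p(x) = ∫₀¹ E_{π_β}[log w] dβ is a lower bound, with π_β ∝ q(z) w(z)^β estimated here by self-normalized importance sampling from q. The callables are assumed; the paper's Hölder-bound refinements are not reproduced.

```python
import numpy as np

def tvo_lower_bound(log_joint, log_q, sample_q, betas, num_samples=1000):
    # betas: left endpoints of the partition of [0, 1], e.g. np.linspace(0, 0.9, 10).
    z = sample_q(num_samples)
    log_w = log_joint(z) - log_q(z)          # log w = log p(x,z) - log q(z)
    betas = np.asarray(betas, dtype=float)
    widths = np.diff(np.append(betas, 1.0))  # width of each Riemann cell
    bound = 0.0
    for beta, width in zip(betas, widths):
        lw = beta * log_w                    # unnormalized log-weights under pi_beta
        snis = np.exp(lw - lw.max())
        snis /= snis.sum()                   # self-normalized importance weights
        bound += width * np.sum(snis * log_w)
    return bound                             # beta = 0 alone recovers the ELBO
```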

Variational Bayesian Monte Carlo [article]

Luigi Acerbi
2018 arXiv   pre-print
Our method produces both a nonparametric approximation of the posterior distribution and an approximate lower bound of the model evidence, useful for model selection.  ...  We introduce here a novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC).  ...  The goal of variational inference is to find the variational parameters φ for which the variational posterior q_φ "best" approximates the true posterior.  ...
arXiv:1810.05558v2 fatcat:7f2ihogpjrcczi5rjigif4undy
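
A deliberately crude caricature of the VBMC idea, assuming scikit-learn is available: fit a GP surrogate to a handful of expensive log-joint evaluations, then score a Gaussian q against the surrogate rather than the true model. Real VBMC uses Bayesian quadrature (the expectation below is analytic there), active sampling, and a mixture-of-Gaussians posterior.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def surrogate_elbo(gp, mu, sigma, num_samples=2000, rng=None):
    # ELBO of q = N(mu, sigma^2) against the GP surrogate's posterior mean:
    # E_q[surrogate log-joint] + entropy of q, estimated by plain Monte Carlo.
    rng = rng or np.random.default_rng(0)
    z = mu + sigma * rng.standard_normal((num_samples, 1))
    entropy = 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)
    return gp.predict(z).mean() + entropy

# Usage sketch: gp = GaussianProcessRegressor().fit(Z_evals, log_joint_vals),
# then optimize (mu, sigma) against surrogate_elbo instead of the true model.
```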

Perturbative Black Box Variational Inference [article]

Robert Bamler, Cheng Zhang, Manfred Opper, Stephan Mandt
2018 arXiv   pre-print
We show in experiments on Gaussian Processes and Variational Autoencoders that the new bounds are more mass-covering, and that the resulting posterior covariances are closer to the true posterior and lead  ...  Drawing on variational perturbation theory of statistical physics, we use these insights to construct a family of new variational bounds.  ...  As a practical consequence, this implies that the exact maximum of the PBBVI lower bound is the true posterior if the variational family is sufficiently flexible to contain it.  ...
arXiv:1709.07433v2 fatcat:2yndlzi3abau3m62glbie5tqca
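
The "family of new variational bounds" rests on the fact that every odd-order Taylor truncation of exp is a global lower bound (e^v ≥ Σ_{k≤K} v^k/k! for odd K). A sketch of the resulting evidence bound; V0 is a user-chosen reference constant here, whereas the paper treats its choice more carefully.

```python
import numpy as np
from math import factorial

def perturbative_evidence_bound(log_joint, log_q, sample_q, V0, K=3,
                                num_samples=1000):
    # With v = log p(x,z) - log q(z) - V0 and odd K,
    #   e^{V0} * E_q[ sum_{k<=K} v^k / k! ] <= E_q[p(x,z)/q(z)] = p(x),
    # so this returns a Monte Carlo lower bound on the evidence p(x)
    # (not on log p(x)). Larger odd K tightens the bound around V0.
    z = sample_q(num_samples)
    v = log_joint(z) - log_q(z) - V0
    poly = sum(v**k / factorial(k) for k in range(K + 1))
    return np.exp(V0) * poly.mean()
```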

Inference Suboptimality in Variational Autoencoders [article]

Chris Cremer, Xuechen Li, David Duvenaud
2018 arXiv   pre-print
Amortized inference allows latent-variable models trained via variational learning to scale to large datasets.  ...  The quality of approximate inference is determined by two factors: a) the capacity of the variational distribution to match the true posterior and b) the ability of the recognition network to produce good  ...  Turner & Sahani (2011) studied the biases in parameter learning induced by the variational approximation when learning via variational Expectation-Maximization.  ... 
arXiv:1801.03558v3 fatcat:eazithkr5fg6vhhtmkb67l23ay
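
The "two factors" named here correspond to the paper's decomposition of the inference gap. Writing $\mathcal{L}[q]$ for the ELBO and $q^{*}$ for the best member of the variational family,

$$\log p(x) - \mathcal{L}[q_{\text{amortized}}] = \underbrace{\log p(x) - \mathcal{L}[q^{*}]}_{\text{approximation gap}} + \underbrace{\mathcal{L}[q^{*}] - \mathcal{L}[q_{\text{amortized}}]}_{\text{amortization gap}},$$

so the first term reflects the capacity of the variational family and the second the recognition network's ability to reach its best member.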

Gaussian Processes with Differential Privacy [article]

Antti Honkela, Laila Melkas
2021 arXiv   pre-print
We achieve this by using sparse GP methodology and publishing a private variational approximation on known inducing points.  ...  We propose a method for hyperparameter learning using a private selection protocol applied to validation set log-likelihood.  ...  The curves show the fraction of test points within the α high-posterior-density region. The error bars indicate two standard errors of the mean after 40 repeats.  ...
arXiv:2106.00474v2 fatcat:olxaoaug4bajnnlxnyshbgobnu
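
For orientation, the basic building block behind "publishing a private variational approximation" is a calibrated noise-adding release; the sketch below is the standard Gaussian mechanism, not the paper's specific mechanism, whose sensitivity analysis for the inducing-point approximation is more involved.

```python
import numpy as np

def gaussian_mechanism(value, l2_sensitivity, epsilon, delta, rng=None):
    # Releases value + N(0, sigma^2) with sigma calibrated so the release is
    # (epsilon, delta)-differentially private (classical bound, epsilon < 1).
    # `value` could be, e.g., the variational mean at the inducing points.
    rng = rng or np.random.default_rng()
    sigma = l2_sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return value + rng.normal(0.0, sigma, size=np.shape(value))
```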

Privacy-Preserving Representation Learning on Graphs: A Mutual Information Perspective [article]

Binghui Wang, Jiayi Guo, Ang Li, Yiran Chen, Hai Li
2021 arXiv   pre-print
Then, we derive tractable variational bounds for the mutual information terms, where each bound can be parameterized via a neural network.  ...  We formally formulate our goal via mutual information objectives. However, it is intractable to compute mutual information in practice.  ...  Specifically, each variational bound involves a variational posterior distribution, and it can be parameterized via a neural network.  ...
arXiv:2107.01475v1 fatcat:qavrgjvxzbfzfg56x5ngxv3ioe
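
The "tractable variational bounds for the mutual information terms" are, generically, of the following standard forms (the paper derives versions tailored to its graph objectives; these are the textbook ones):

$$I(X;Y) \;\ge\; \mathbb{E}_{p(x,y)}\big[\log q_\theta(y \mid x)\big] + H(Y), \qquad I(X;Y) \;\le\; \mathbb{E}_{p(x)}\big[\mathrm{KL}\big(p(y \mid x)\,\|\,r_\phi(y)\big)\big],$$

where the variational distributions $q_\theta$ and $r_\phi$ are the neural-network-parameterized components the abstract refers to.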

Efficient Gaussian Process Classification Using Pólya-Gamma Data Augmentation

Florian Wenzel, Théo Galy-Fajou, Christian Donner, Marius Kloft, Manfred Opper
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence and the Thirty-First Innovative Applications of Artificial Intelligence Conference (AAAI-19/IAAI-19)
We propose a scalable stochastic variational approach to GP classification building on Pólya-Gamma data augmentation and inducing points.  ...  This allows for exact inference via Gibbs sampling or approximate variational inference schemes (Scott and Sun 2013) .  ...  In practice, we employ stochastic variational inference, i.e. we only use minibatches of the data to obtain a noisy version of the natural gradient.  ... 
doi:10.1609/aaai.v33i01.33015417 fatcat:rxtde7jfbvbe7lkiccmb3eid5m
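
The augmentation rests on the Pólya-Gamma integral identity of Polson, Scott, and Windle (2013):

$$\frac{(e^{\psi})^{a}}{(1+e^{\psi})^{b}} \;=\; 2^{-b}\, e^{\kappa\psi}\; \mathbb{E}_{\omega \sim \mathrm{PG}(b,\,0)}\!\left[e^{-\omega \psi^{2}/2}\right], \qquad \kappa = a - \tfrac{b}{2}.$$

With a = b = 1 this covers the logistic GP-classification likelihood, so conditionally on ω the model is Gaussian in ψ, which is what makes both the Gibbs sampler and the conjugate variational updates tractable.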

Projected BNNs: Avoiding weight-space pathologies by learning latent representations of neural network weights [article]

Melanie F. Pradier, Weiwei Pan, Jiayu Yao, Soumya Ghosh, Finale Doshi-Velez
2019 arXiv   pre-print
This paper introduces a novel variational inference framework for Bayesian neural networks that (1) encodes complex distributions in high-dimensional parameter space with representations in a low-dimensional latent space, and (2) performs inference efficiently on the low-dimensional representations.  ...  To facilitate the optimization task, we first optimize the variational distribution in latent space q_{λ_z}(z) (assuming φ fixed) via black-box variational inference (BBVI) [Ranganath et al., 2014] with  ...
arXiv:1811.07006v3 fatcat:jgizz6apcrbdldapkdmmrcuhii
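
A minimal sketch of the two ingredients in the abstract: weights generated from a low-dimensional latent code, plus a Gaussian variational posterior over that code optimized with BBVI. The affine map below is an illustrative stand-in for the paper's learned decoder.

```python
import numpy as np

def sample_projected_weights(mu_z, log_sigma_z, A, b, rng=None):
    # Reparameterized draw z ~ q(z) = N(mu_z, diag(exp(log_sigma_z)^2)),
    # then map to the high-dimensional weight space via w = A z + b.
    # Inference happens entirely over the low-dimensional z.
    rng = rng or np.random.default_rng()
    z = mu_z + np.exp(log_sigma_z) * rng.standard_normal(mu_z.shape)
    return A @ z + b
```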

Boosting Variational Inference: an Optimization Perspective [article]

Francesco Locatello, Rajiv Khanna, Joydeep Ghosh, Gunnar Rätsch
2018 arXiv   pre-print
Variational inference is a popular technique to approximate a possibly intractable Bayesian posterior with a more tractable one.  ...  Recently, boosting variational inference has been proposed as a new paradigm to approximate the posterior by a mixture of densities by greedily adding components to the mixture.  ...  Therefore, we can finally give the Theorem that measures the total information loss of boosting variational inference via Frank-Wolfe. Theorem 8.  ... 
arXiv:1708.01733v2 fatcat:35kinigowza6pbzuzvgrhwxwxu
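
The greedy mixture construction analyzed here follows the Frank-Wolfe pattern; a minimal bookkeeping sketch of one boosting step (the hard part, choosing the new component s_t by approximately minimizing the linearized objective, is elided):

```python
def boosting_step(components, weights, new_component, t):
    # q_t = (1 - gamma) * q_{t-1} + gamma * s_t with the classical
    # Frank-Wolfe step size gamma = 2 / (t + 2); weights stay normalized.
    gamma = 2.0 / (t + 2.0)
    weights = [(1.0 - gamma) * w for w in weights] + [gamma]
    return components + [new_component], weights
```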

Training Variational Autoencoders with Buffered Stochastic Variational Inference [article]

Rui Shu, Hung H. Bui, Jay Whang, Stefano Ermon
2019 arXiv   pre-print
The recognition network in deep latent variable models such as variational autoencoders (VAEs) relies on amortized inference for efficient posterior approximation that can scale up to large datasets.  ...  upon the variational parameters returned by the amortized inference model.  ...  Since the gap of this bound is exactly the Kullback-Leibler divergence D(q(z) || p_θ(z | x)), q(z) is thus the variational approximation of the posterior.  ...
arXiv:1902.10294v1 fatcat:leryskyeavbutjhlubci23zota
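
A sketch of the per-example refinement this line of work builds on: start from the encoder's variational parameters and take a few ELBO ascent steps. BSVI additionally reuses ("buffers") the intermediate samples in an importance-weighted bound, which this sketch omits; elbo_grad is an assumed user-supplied stochastic gradient of the per-example ELBO.

```python
import numpy as np

def refine_variational_params(elbo_grad, lam_init, num_steps=10, lr=1e-2):
    # lam_init: variational parameters produced by the amortized encoder.
    # Each step closes part of the amortization gap for this example.
    lam = np.array(lam_init, dtype=float)
    for _ in range(num_steps):
        lam += lr * elbo_grad(lam)       # stochastic gradient ascent on the ELBO
    return lam
```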

Bayesian Calibration, Validation and Uncertainty Quantification for Predictive Modelling of Tumour Growth: A Tutorial

Joe Collis, Anthony J. Connor, Marcin Paczkowski, Pavitra Kannan, Joe Pitt-Francis, Helen M. Byrne, Matthew E. Hubbard
2017 Bulletin of Mathematical Biology  
In the course of the example we calibrate the model against experimental data that is subject to measurement errors, and then validate the resulting uncertain model predictions.  ...  Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.  ...  replication obtained via the validation model posterior predictive distribution.  ... 
doi:10.1007/s11538-017-0258-5 pmid:28290010 fatcat:l3cxsmhseffmblxtjmdabvltmu
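
In the spirit of the tutorial's validation step, a small sketch of a posterior-predictive coverage check; the names are illustrative, and replications would come from the calibrated model's posterior predictive distribution.

```python
import numpy as np

def predictive_coverage(replications, observed, alpha=0.95):
    # replications: array of shape (num_draws, num_points) of posterior
    # predictive simulations; returns the fraction of observed points that
    # fall inside the central alpha-interval of their replications.
    lo = np.quantile(replications, (1.0 - alpha) / 2.0, axis=0)
    hi = np.quantile(replications, 1.0 - (1.0 - alpha) / 2.0, axis=0)
    return np.mean((observed >= lo) & (observed <= hi))
```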