30,524 Hits in 3.9 sec

Tractable Inference for Complex Stochastic Processes [article]

Xavier Boyen, Daphne Koller
2013 arXiv   pre-print
Unfortunately, the state spaces of complex processes are very large, making an explicit representation of a belief state intractable.  ...  In the case of a stochastic system, these tasks typically involve the use of a belief state, a probability distribution over the state of the process at a given point in time.  ...  Acknowledgements We gratefully acknowledge Eric Bauer, Lise Getoor, and Uri Lerner for work on the software used in the experiments, Raya Fratkina for help with the network files, and Tim Huang for providing  ... 
arXiv:1301.7362v1 fatcat:r7coczhoavg7hhdlgz65gwag6i
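
To make the notion of a belief state concrete, here is a minimal sketch (not the Boyen-Koller algorithm itself): exact filtering propagates a probability distribution over the hidden state through the transition model and conditions it on each observation. The two-state process, transition matrix, and observation matrix below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

T = np.array([[0.9, 0.1],    # transition model P(s' | s), rows indexed by s
              [0.2, 0.8]])
O = np.array([[0.7, 0.3],    # observation model P(o | s), rows indexed by s
              [0.1, 0.9]])

def belief_update(belief, obs):
    """One exact filtering step: predict through T, weight by P(obs | s'), renormalise."""
    predicted = T.T @ belief            # predicted[s'] = sum_s P(s' | s) * belief[s]
    posterior = O[:, obs] * predicted   # condition on the current observation
    return posterior / posterior.sum()

belief = np.array([0.5, 0.5])           # uniform initial belief state
for obs in [0, 0, 1, 1]:
    belief = belief_update(belief, obs)
print(belief)                           # belief state after four observations
```

The intractability discussed in the paper arises because, for a factored process with many state variables, this explicit belief vector grows exponentially in size.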

Cascading Denoising Auto-Encoder as a Deep Directed Generative Model [article]

Dong-Hyun Lee
2017 arXiv   pre-print
These are unavoidable when a recent successful directed model like the VAE (Kingma & Welling, 2014) is trained on a complex dataset like real images.  ...  On the one hand, can deep directed models be successfully trained without intractable posterior inference and difficult optimization of very deep neural networks in inference and generative models?  ...  We consider a directed model with a stochastic identity mapping (simple corruption process) as an inference model and a DAE as a generative model.  ... 
arXiv:1511.07118v2 fatcat:ipyhvemybjb2rkbwppj3sk2q2q
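
As a rough illustration of the ingredients named in the snippet, the sketch below trains a single denoising auto-encoder in NumPy on toy binary data; the corruption step plays the role of the simple stochastic (approximate identity) mapping. It is not the cascading construction from the paper, and the layer sizes, corruption rate, and learning rate are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

d, h, lr = 20, 10, 0.1
W1, b1 = 0.1 * rng.standard_normal((h, d)), np.zeros((h, 1))
W2, b2 = 0.1 * rng.standard_normal((d, h)), np.zeros((d, 1))
X = (rng.random((d, 500)) < 0.3).astype(float)       # toy binary data

for step in range(2000):
    x = X[:, [rng.integers(500)]]
    x_tilde = x * (rng.random(x.shape) > 0.3)         # corruption: randomly zero out inputs
    hid = sigmoid(W1 @ x_tilde + b1)
    rec = sigmoid(W2 @ hid + b2)                      # reconstruction of the clean x
    # squared-error loss on the clean input; gradients computed by hand
    d2 = (rec - x) * rec * (1 - rec)
    d1 = (W2.T @ d2) * hid * (1 - hid)
    W2 -= lr * d2 @ hid.T; b2 -= lr * d2
    W1 -= lr * d1 @ x_tilde.T; b1 -= lr * d1
```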

The Variational Gaussian Process [article]

Dustin Tran, Rajesh Ranganath, David M. Blei
2016 arXiv   pre-print
We develop the variational Gaussian process (VGP), a Bayesian nonparametric variational family, which adapts its shape to match complex posterior distributions.  ...  Variational inference is a powerful tool for approximate inference, and it has been recently applied for representation learning with deep generative models.  ...  ACKNOWLEDGEMENTS We thank David Duvenaud, Alp Kucukelbir, Ryan Giordano, and the anonymous reviewers for their helpful comments.  ... 
arXiv:1511.06499v4 fatcat:rxtue3ahoveytj5im3ubye5vyu
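
For contrast with the richer variational family the paper proposes, a plain diagonal-Gaussian variational approximation fitted with reparameterised stochastic gradients of the ELBO looks roughly like the sketch below. The one-dimensional toy target and all constants are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(z):                  # toy unnormalised target: N(3, 0.5^2)
    return -(z - 3.0) / 0.25

mu, log_sigma, lr = 0.0, 0.0, 0.01
for step in range(5000):
    eps = rng.standard_normal()
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps            # reparameterisation: z = mu + sigma * eps
    g = grad_log_p(z)
    mu += lr * g                                   # d ELBO / d mu
    log_sigma += lr * (g * sigma * eps + 1.0)      # entropy term contributes +1 per step
print(mu, np.exp(log_sigma))        # should approach roughly 3 and 0.5
```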

Automated Variational Inference in Probabilistic Programming [article]

David Wingate, Theophane Weber
2013 arXiv   pre-print
We present a new algorithm for approximate inference in probabilistic programs, based on a stochastic gradient for variational programs.  ...  This method is efficient without restrictions on the probabilistic program; it is particularly practical for distributions which are not analytically tractable, including highly structured distributions  ...  Stochastic approximations for variational inference are also used by Carbonetto [30].  ... 
arXiv:1301.1299v1 fatcat:jchutrnmkfeqbob27lk5oz3hy4
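
A score-function ("black-box") ELBO gradient of the kind such systems build on can be sketched as follows: it needs only samples from q and evaluations of log p, so it places no differentiability restrictions on the program. The single Bernoulli latent, target values, and step size are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

log_p = {0: np.log(0.2), 1: np.log(0.8)}      # toy target over a binary latent

phi, lr = 0.0, 0.05                           # q(z) = Bernoulli(sigmoid(phi))
for step in range(5000):
    theta = sigmoid(phi)
    z = int(rng.random() < theta)
    log_q = np.log(theta if z == 1 else 1 - theta)
    score = z - theta                         # d/dphi log q(z) for a Bernoulli logit
    phi += lr * (log_p[z] - log_q) * score    # one-sample ELBO gradient estimate
print(sigmoid(phi))                           # should move toward roughly 0.8
```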

Reformulating Inference Problems Through Selective Conditioning [article]

Paul Dagum, Eric J. Horvitz
2013 arXiv   pre-print
We employ the selective conditioning approach to target specific nodes in a belief network for decomposition, based on the contribution the nodes make to the tractability of stochastic simulation.  ...  We review previous work on BNRAS algorithms, randomized approximation algorithms for probabilistic inference.  ...  Although we cannot avoid worst-case intractability, we can apply methods to refine an initial problem instance by removing unnecessary complexity.  ... 
arXiv:1303.5397v1 fatcat:2irqr3cj7vbdhfd5eh4sjuavny
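
As background, the kind of stochastic simulation whose tractability is at stake can be sketched with plain likelihood weighting on a tiny three-node chain; the selective-conditioning refinement itself is not shown, and the network and its probabilities are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
p_a = 0.3                                   # P(A=1)
p_b = {0: 0.2, 1: 0.9}                      # P(B=1 | A)
p_c = {0: 0.1, 1: 0.7}                      # P(C=1 | B)

def estimate_p_a_given_c1(n_samples=100_000):
    """Estimate P(A=1 | C=1) by sampling A and B and weighting by P(C=1 | B)."""
    num = den = 0.0
    for _ in range(n_samples):
        a = int(rng.random() < p_a)
        b = int(rng.random() < p_b[a])
        w = p_c[b]                          # evidence C=1 is not sampled, only weighted
        num += w * a
        den += w
    return num / den

print(estimate_p_a_given_c1())
```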

Reformulating Inference Problems Through Selective Conditioning [chapter]

Paul Dagum, Eric Horvitz
1992 Uncertainty in Artificial Intelligence  
We employ the selective conditioning approach to target specific nodes in a belief network for decomposition, based on the contribution the nodes make to the tractability of stochastic simulation.  ...  We review previous work on BNRAS algorithms, randomized approximation algorithms for probabilistic inference.  ...  Although we cannot avoid worst-case intractability, we can apply methods to refine an initial problem instance by removing unnecessary complexity.  ... 
doi:10.1016/b978-1-4832-8287-9.50011-6 fatcat:m6bd4hcq3rcgjfr2n4ukurfrx4

Introduction to the Issue on Stochastic Simulation and Optimization in Signal Processing

Steve Mclaughlin, Marcelo Pereyra, Alfred O. Hero, Jean-Yves Tourneret, Jean-Christophe Pesquet
2016 IEEE Journal on Selected Topics in Signal Processing  
He has been serving as an associate editor for the IEEE TRANSACTIONS ON SIGNAL PROCESSING (2008-2011, 2015) and for the EURASIP Journal on Signal Processing (2013-present).  ...  Processing Society (2001-2007, 2010-present).  ...  ; for example, they use stochastic models to represent the data observation process and the prior knowledge available and they obtain solutions by performing statistical inference (e.g., using maximum  ... 
doi:10.1109/jstsp.2016.2524963 fatcat:ogalgccjjndihjf2ck6w432hbe

Page 427 of Genetics Vol. 170, Issue 1 [page]

2005 Genetics  
The second form is based on walks over complete graphs and offers numerically tractable solutions for an increasing number of taxa.  ...  I demonstrate the flexibility of these stochastic models to test competing ideas about HGT by examining the complexity hypothesis and find support for increased HGT of operational genes compared to informational  ... 

Inference, Prediction, and Control of Networked Epidemics [article]

Nicholas J. Watkins, Cameron Nowzari, George J. Pappas
2017 arXiv   pre-print
We then leverage the attained conditional independence property to construct tractable mechanisms for the inference and prediction of the process state, avoiding the need to use mean field approximations  ...  of the stochastic epidemic process in order to control the exact dynamics of the epidemic outbreak.  ...  using standard inference algorithms for graphical models (see, e.g., [17]) to estimate the joint distributions of the process would not be tractable.  ... 
arXiv:1703.07409v1 fatcat:ck6rwwt6efbx7ezk2gipgkh4t4
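
For orientation, the exact stochastic process referred to (a networked SIS epidemic) can be simulated directly on a small graph as below. This is only the forward dynamics, not the paper's inference, prediction, or control machinery, and the contact network and rates are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
adjacency = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 1],
                      [1, 1, 0, 1],
                      [0, 1, 1, 0]])
beta, delta = 0.3, 0.2                       # per-contact infection / recovery probabilities
state = np.array([1, 0, 0, 0])               # node 0 starts infected

for t in range(50):
    infected_neighbours = adjacency @ state
    p_infect = 1 - (1 - beta) ** infected_neighbours      # prob. a susceptible node is infected
    state = np.where(state == 1,
                     (rng.random(4) > delta).astype(int),      # infected: recover w.p. delta
                     (rng.random(4) < p_infect).astype(int))   # susceptible: infection attempt
print(state)                                  # one realisation of the exact stochastic dynamics
```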

Page 4839 of Mathematical Reviews Vol. , Issue 95h [page]

1995 Mathematical Reviews  
We suggest a general method for establishing such results when a stochastic process in  ...  (S-RIT-OP; Stockholm); Johansson, Bjorn (S-STOC-MS; Stockholm) On theorems of de Finetti type for continuous time stochastic processes.  ... 

Computational Resource Demands of a Predictive Bayesian Brain

Johan Kwisthout, Iris van Rooij
2019 Computational Brain & Behavior  
We discuss the implications of these sobering results for the predictive processing account and propose a way to move forward.  ...  Given that many forms of cognition seem to be well characterized as a form of Bayesian inference, this conjecture has great import for cognitive science.  ...  for very constructive feedback.  ... 
doi:10.1007/s42113-019-00032-3 fatcat:vvadybxrdrc6hedkn6hn7gzvoe

Hierarchical Variational Models [article]

Rajesh Ranganath, Dustin Tran, David M. Blei
2016 arXiv   pre-print
HVMs augment a variational approximation with a prior on its parameters, which allows it to capture complex structure for both discrete and continuous latent variables.  ...  Black box variational inference allows researchers to easily prototype and evaluate an array of models. Recent advances allow such algorithms to scale to high dimensions.  ...  Local expectation gradients for doubly stochastic variational inference. In Neural Information Processing Systems. Tran, D., Blei, D. M., and Airoldi, E. M. (2015). Copula variational inference.  ... 
arXiv:1511.02386v2 fatcat:cyu73a35fbcdlbo57pynmxkrxe

Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation [article]

Thang D. Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Richard E. Turner
2015 arXiv   pre-print
The new method leverages a recently proposed method for scaling Expectation Propagation, called stochastic Expectation Propagation.  ...  Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers.  ...  YL thanks the Schlumberger Foundation for her Faculty for the Future PhD fellowship. RET thanks EPSRC grants EP/G050821/1 and EP/L000776/1.  ... 
arXiv:1511.03405v1 fatcat:hm2fjijqdfh4loqnciri3p6dnu
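
To illustrate what a deep GP prior is (not the stochastic-EP training method), one can draw a function from a two-layer DGP by composing samples from ordinary GP priors, as in the sketch below; the RBF kernel, lengthscales, and input grid are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, lengthscale=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

def sample_gp(x, lengthscale=1.0):
    """Draw one function sample from a zero-mean GP with an RBF kernel at inputs x."""
    K = rbf(x, x, lengthscale) + 1e-6 * np.eye(len(x))   # jitter for numerical stability
    return np.linalg.cholesky(K) @ rng.standard_normal(len(x))

x = np.linspace(-3, 3, 200)
h = sample_gp(x)          # layer 1: hidden function h = f1(x)
y = sample_gp(h)          # layer 2: y = f2(h), a draw from the two-layer DGP prior
```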

Predictive Simulation for Design and Optimization of Complex Physical Systems

Robert Moser, Karen Willcox, Omar Ghattas, Youssef Marzouk, J. Tinsley Oden
2017 Figshare  
Optimization, Design & Decision Making: Applications of methods for stochastic optimization have not generally carried over to complex physical systems.  ...  In addition, decisions in the design of complex systems are often staged, with some decisions made early in the process, and others made later when more information is available.  ... 
doi:10.6084/m9.figshare.5318215.v1 fatcat:fugcgoh27ngtnpbh5a2gpe4vgq

Sparse Spatio-temporal Gaussian Processes with General Likelihoods [chapter]

Jouni Hartikainen, Jaakko Riihimäki, Simo Särkkä
2011 Lecture Notes in Computer Science  
We use expectation propagation to perform approximate inference on non-Gaussian data, and show how to incorporate sparse approximations to further reduce the computational complexity.  ...  In this paper, we consider learning of spatio-temporal processes by formulating a Gaussian process model as a solution to an evolution type stochastic partial differential equation.  ...  The authors would like to thank Finnish Doctoral Programme in Computational Sciences (FICS), Centre of Excellence in Computational Complex Systems Research (COSY) and Academy of Finland for financial support  ... 
doi:10.1007/978-3-642-21735-7_24 fatcat:ldapp247ifcf7eycvl2f2uhaem
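
The GP-to-stochastic-differential-equation link this line of work builds on can be sketched in the simplest temporal case: a GP with an exponential (Matern-1/2, Ornstein-Uhlenbeck) covariance is filtered in linear time by a Kalman recursion instead of cubic-cost kernel algebra. The sketch below covers only 1-D time with Gaussian noise (no spatial part, no expectation propagation), and all hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
ell, sig2, noise = 1.0, 1.0, 0.1 ** 2         # lengthscale, signal variance, obs noise variance
t = np.sort(rng.uniform(0, 10, 100))
y = np.sin(t) + 0.1 * rng.standard_normal(100)

m, P = 0.0, sig2                              # stationary prior: f(t0) ~ N(0, sig2)
means = []
for k in range(len(t)):
    if k > 0:
        a = np.exp(-(t[k] - t[k - 1]) / ell)  # discretised OU transition over the time gap
        m, P = a * m, a ** 2 * P + sig2 * (1 - a ** 2)
    s = P + noise                             # Kalman update with observation y[k]
    gain = P / s
    m, P = m + gain * (y[k] - m), (1 - gain) * P
    means.append(m)
print(means[-1])                              # filtered posterior mean at the last time point
```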
Showing results 1-15 out of 30,524 results