
Bayesian Inference for Layer Representation with Mixed Markov Random Field [chapter]

Ru-Xin Gao, Tian-Fu Wu, Song-Chun Zhu, Nong Sang
Lecture Notes in Computer Science  
This paper presents a Bayesian inference algorithm for image layer representation [26], the 2.1D sketch [6], with a mixed Markov random field. The 2.1D sketch is a very important problem in low- to middle-level vision ... This makes the problem a mixed random field. ... Discussion: This paper presents a Bayesian inference algorithm for image layer representation with a mixed Markov random field. ...
doi:10.1007/978-3-540-74198-5_17 fatcat:r55avs26sff5fn63k5dzp6i3fi

Probabilistic Deep Learning with Probabilistic Neural Networks and Deep Probabilistic Models [article]

Daniel T. Chang
2021 arXiv   pre-print
and deep mixed effects models (for deep probabilistic models). ... TensorFlow Probability is a library for probabilistic modeling and inference which can be used for both approaches to probabilistic deep learning. We include its code examples for illustration. ... (A minimal sketch follows this entry.)
arXiv:2106.00120v3 fatcat:gbeonxch4vav7jaqu3nvti7thi
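The snippet above notes that TensorFlow Probability supports both probabilistic neural networks and deep probabilistic models. As a hedged illustration only (not code from the paper), a minimal probabilistic-regression sketch using tfp.layers.DenseFlipout and a Normal output head could look like the following; the layer sizes, the fixed output scale of 1.0, and the commented-out training call are illustrative assumptions, and it presumes a TensorFlow 2.x installation whose Keras version is compatible with TFP layers.

    # Minimal sketch, assuming TensorFlow 2.x with a compatible tensorflow_probability.
    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    # Probabilistic neural network: Flipout layers place approximate posteriors over
    # the weights; the final layer outputs a Normal distribution per input example.
    model = tf.keras.Sequential([
        tfp.layers.DenseFlipout(16, activation="relu"),
        tfp.layers.DenseFlipout(1),
        tfp.layers.DistributionLambda(lambda t: tfd.Normal(loc=t, scale=1.0)),
    ])

    # Train by minimizing the negative log-likelihood of the predicted distribution.
    negloglik = lambda y, rv_y: -rv_y.log_prob(y)
    model.compile(optimizer="adam", loss=negloglik)
    # model.fit(x_train, y_train, epochs=200)  # x_train / y_train are placeholders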

A Hierarchical Markov Random Field Model for Bayesian Blind Image Separation

Feng Su, Ali Mohammad-Djafari
2008 2008 Congress on Image and Signal Processing  
In this paper we propose a hierarchical Markov random field (HMRF) model and a Bayesian estimation framework for separating noisy linear mixtures of images composed of homogeneous patches. ... A latent Potts-Markov labeling field is introduced for each source image to enforce piecewise homogeneity of pixel values (see the sketch after this entry). ... For context-dependent variables of this form, Markov random field (MRF) theory provides a consistent estimation and inference model. ...
doi:10.1109/cisp.2008.6 fatcat:vmosopkqbbf53j6ixxktbxmc2i
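The latent Potts-Markov labeling field mentioned in the snippet penalizes label disagreements between neighbouring pixels. Below is a minimal sketch of that prior's energy on a 4-connected grid; the function name potts_energy, the single coupling parameter beta, and the toy label image are assumptions for illustration, not the authors' implementation.

    # Minimal sketch, assuming a 4-neighbour grid and one coupling parameter beta.
    import numpy as np

    def potts_energy(labels, beta=1.0):
        """Potts prior energy: each horizontally or vertically adjacent pair of
        pixels with different labels contributes +beta (lower energy = smoother)."""
        horiz = labels[:, :-1] != labels[:, 1:]
        vert = labels[:-1, :] != labels[1:, :]
        return beta * (horiz.sum() + vert.sum())

    z = np.array([[0, 0, 1],
                  [0, 1, 1],
                  [2, 2, 2]])
    print(potts_energy(z))  # counts disagreeing neighbour pairs, scaled by beta

In a Bayesian formulation, exp(-potts_energy(z)) would act, up to normalization, as the prior enforcing piecewise homogeneity of the labels.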

A Survey on Bayesian Nonparametric Learning

Junyu Xuan, Jie Lu, Guangquan Zhang
2019 ACM Computing Surveys  
Some renowned examples are Bayesian network, Gaussian mixture models, hidden Markov model [134] , Markov random field, conditional random field, and latent Dirichlet allocation [11] .  ...  On the back of Bayesian learning's great success, Bayesian nonparametric learning (BNL) has emerged as a force for further advances in this field due to its greater modelling flexibility and representation  ...  Partially-Observable Reinforcement Learning (PORL) [46] Conditional Random Field (CRF) [95, 135] Markov Random Field (MRF) [25] Author Topic Model (ATM) [190] Principal Component Analysis (PCA)  ... 
doi:10.1145/3291044 fatcat:aytdnsnrfvfnti5i64ne4icenu

SEMIPARAMETRIC REGRESSION AND GRAPHICAL MODELS

M. P. Wand
2009 Australian & New Zealand journal of statistics (Print)  
Semiparametric regression models that use spline basis functions with penalization have graphical model representations. ... Directed acyclic graphs, also known as Bayesian networks, play a prominent role. Graphical model-based Bayesian 'inference engines', such as BUGS and VIBES, facilitate fitting and inference. ... (A penalized-spline sketch follows this entry.)
doi:10.1111/j.1467-842x.2009.00538.x fatcat:5zgb25sk4jb43bt4ujiim7aflu
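The abstract connects penalized spline regression to graphical models and to Bayesian inference engines such as BUGS and VIBES. The sketch below shows only the penalized-spline fitting idea itself, using a truncated-line basis and a fixed ridge-type penalty; the knot count, the smoothing parameter lam, and the simulated data are assumptions, and no MCMC or variational engine is involved.

    # Minimal sketch, assuming a truncated-line spline basis and a fixed penalty lam.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 200))
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

    knots = np.linspace(0.05, 0.95, 20)
    X = np.column_stack([np.ones_like(x), x])            # unpenalized intercept + slope
    Z = np.maximum(x[:, None] - knots[None, :], 0.0)     # spline basis (x - kappa_k)_+
    C = np.hstack([X, Z])

    lam = 1.0                                            # smoothing parameter (assumed)
    D = np.diag([0.0, 0.0] + [lam] * knots.size)         # penalize only spline coefficients
    coef = np.linalg.solve(C.T @ C + D, C.T @ y)         # ridge-type penalized fit
    fit = C @ coef

In the mixed-model / graphical-model view the paper develops, the penalized spline coefficients play the role of random effects with a Gaussian prior, and the smoothing parameter roughly corresponds to a ratio of variance components.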

Incorporating visual knowledge representation in stereo reconstruction

A. Barbu, Song-Chun Zhu
2005 Tenth IEEE International Conference on Computer Vision (ICCV'05) Volume 1  
..., and a structureless part represented by a Markov random field on pixels. ... Our experiments show that this representation can infer the depth map with sharp boundaries and junctions for textureless images, curve objects, and free-form shapes. ... The two layers of Markov random field models, one on the sketch layer and one on the pixel layer, work collaboratively in optimizing a Bayesian posterior probability. ...
doi:10.1109/iccv.2005.120 dblp:conf/iccv/BarbuZ05 fatcat:45w3ldz3frc7rh3uhtpd33lfoe

MCMC for Hierarchical Semi-Markov Conditional Random Fields [article]

Truyen Tran, Dinh Phung, Svetha Venkatesh, Hung H. Bui
2014 arXiv   pre-print
Current exact inference schemes either cost cubic time in sequence length or exponential time in model depth. These costs are prohibitive for large-scale problems with arbitrary length and depth. ... Deep architectures such as hierarchical semi-Markov models are an important class of models for nested sequential data. ... Gibbs Sampling (RBGS) for approximate inference in Hierarchical Semi-Markov Conditional Random Fields. ... (A flat linear-chain inference sketch follows this entry.)
arXiv:1408.1162v1 fatcat:coojs5sjubceben7bx674idp2y
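For context on the inference costs the abstract cites, the sketch below implements the forward recursion for a flat linear-chain model, which costs O(T * K^2) for T positions and K states; the hierarchical semi-Markov case treated in the paper is far more expensive for exact schemes, which is what motivates their sampling approach. The function name and the random score matrices are illustrative assumptions, not the authors' code.

    # Minimal sketch, assuming a flat linear-chain model with K states.
    import numpy as np
    from scipy.special import logsumexp

    def forward_log_partition(log_emit, log_trans):
        """log_emit: (T, K) per-position scores; log_trans: (K, K) transition scores.
        Returns the log partition function via the O(T * K^2) forward recursion."""
        T, K = log_emit.shape
        alpha = log_emit[0].copy()
        for t in range(1, T):
            alpha = logsumexp(alpha[:, None] + log_trans, axis=0) + log_emit[t]
        return logsumexp(alpha)

    rng = np.random.default_rng(0)
    print(forward_log_partition(rng.normal(size=(12, 4)), rng.normal(size=(4, 4))))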

Bayesian neural networks and dimensionality reduction [article]

Deborshee Sen and Theodore Papamarkou and David Dunson
2020 arXiv   pre-print
We attempt to solve these problems by deploying Markov chain Monte Carlo (MCMC) sampling algorithms for Bayesian inference in ANN models with latent variables. ... A class of model-based approaches for such problems includes latent variables in an unknown non-linear regression function; this includes Gaussian process latent variable models and variational auto-encoders ...
arXiv:2008.08044v2 fatcat:gdrqqqrw2nbwhclaq2adwe2npi

Brain-Inspired Hardware Solutions for Inference in Bayesian Networks

Leila Bagheriye, Johan Kwisthout
2021 Frontiers in Neuroscience  
A departure from conventional computing systems to make use of the high parallelism of Bayesian inference has attracted recent attention, particularly in the hardware implementation of Bayesian networks  ...  The implementation of inference (i.e., computing posterior probabilities) in Bayesian networks using a conventional computing paradigm turns out to be inefficient in terms of energy, time, and space, due  ...  The commonly described graphical models are Hidden Markov Models (HMMs), Markov Random Fields (MRFs), and Bayesian networks.  ... 
doi:10.3389/fnins.2021.728086 pmid:34924925 pmcid:PMC8677599 fatcat:tihogzl6tfbpjdybwpggllwd5u

Machine learning based hyperspectral image analysis: A survey [article]

Utsav B. Gewali, Sildomar T. Monteiro, Eli Saber
2019 arXiv   pre-print
We also discuss the open challenges in the field of hyperspectral image analysis and explore possible future directions. ... Machine learning algorithms, due to their outstanding predictive power, have become a key tool for modern hyperspectral image analysis. ... Markov random fields) [116]. ...
arXiv:1802.08701v2 fatcat:bfi6qkpx2bf6bowhyloj2duugu

Graphical Models in a Nutshell [chapter]

2007 Introduction to Statistical Relational Learning  
Graphical models have enjoyed a surge of interest in the last two decades, due both to the flexibility and power of the representation and to the increased ability to effectively learn and perform inference ... The framework is quite general in that many of the commonly proposed statistical models (Kalman filters, hidden Markov models, Ising models) can be described as graphical models. ... The two most common types of graphical models are Bayesian networks (also called belief networks or causal networks) and Markov networks (also called Markov random fields (MRFs)). ... (A small worked example follows this entry.)
doi:10.7551/mitpress/7432.003.0004 fatcat:wbhjah7qczdftaiod5jg4ok2xe
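Since the chapter snippet contrasts Bayesian networks and Markov networks, a tiny worked example of the Bayesian-network side may help: the joint distribution factorizes into local conditionals, so a posterior can be obtained by enumeration. The three-variable chain A -> B -> C and all probabilities below are hypothetical numbers chosen only for illustration.

    # Minimal sketch: P(A, B, C) = P(A) * P(B | A) * P(C | B) for binary variables.
    p_a = {1: 0.3, 0: 0.7}
    p_b_given_a = {(1, 1): 0.8, (0, 1): 0.2, (1, 0): 0.1, (0, 0): 0.9}  # key: (b, a)
    p_c_given_b = {(1, 1): 0.9, (0, 1): 0.1, (1, 0): 0.4, (0, 0): 0.6}  # key: (c, b)

    def joint(a, b, c):
        return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]

    # Posterior P(A = 1 | C = 1) by enumerating and summing out B.
    num = sum(joint(1, b, 1) for b in (0, 1))
    den = sum(joint(a, b, 1) for a in (0, 1) for b in (0, 1))
    print(num / den)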

Relational Models [article]

Volker Tresp, Maximilian Nickel
2016 arXiv   pre-print
..., Bayesian networks, Markov networks, or latent variable models. ... The second layer calculates the latent representations a_i and a_j. The following layer forms componentwise products (see the sketch after this entry). ... In a Bayesian logic program, for each clause there is one conditional probability distribution, and for each random variable there is one combination rule. ...
arXiv:1609.03145v1 fatcat:ytomwo4l5nfsrnpm2ens5iww6m
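The snippet describes a layered architecture in which entity representations a_i and a_j are combined by componentwise products. A hedged sketch of that scoring idea follows; the embedding dimension, the per-relation weight vector w_r, and the final sigmoid are assumptions in the spirit of DistMult-style factorization models, not the exact network from the paper.

    # Minimal sketch, assuming learned entity embeddings and one relation weight vector.
    import numpy as np

    rng = np.random.default_rng(1)
    d = 8
    a_i = rng.normal(size=d)   # latent representation of entity i (second layer)
    a_j = rng.normal(size=d)   # latent representation of entity j (second layer)
    w_r = rng.normal(size=d)   # weights for one relation type (assumed)

    score = float(w_r @ (a_i * a_j))      # componentwise products, then a weighted sum
    prob = 1.0 / (1.0 + np.exp(-score))   # probability that the (i, relation, j) triple holds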

Modeling language and cognition with deep unsupervised learning: a tutorial overview

Marco Zorzi, Alberto Testolin, Ivilin P. Stoianov
2013 Frontiers in Psychology  
Deep unsupervised learning in stochastic recurrent neural networks with many layers of hidden units is a recent breakthrough in neural computation research. ... A variant known as the Restricted Boltzmann Machine (RBM) is obtained by removing within-layer lateral connections to form a bipartite graph, which allows efficient inference and learning (see the sketch after this entry). ... Structured Bayesian models of cognition (for reviews see Griffiths et al., 2010) assume that human learning and inference approximately follow the principles of Bayesian probabilistic inference and they ...
doi:10.3389/fpsyg.2013.00515 pmid:23970869 pmcid:PMC3747356 fatcat:hnszejz7yfeufgsdfbuxktplbe
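The snippet's point about the RBM's bipartite graph can be made concrete: with no within-layer connections, the hidden units are conditionally independent given the visible layer, so p(h | v) factorizes into per-unit sigmoids. The layer sizes and randomly initialized parameters below are assumptions for illustration.

    # Minimal sketch of the factorized RBM conditional p(h | v), assuming binary units.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(2)
    n_visible, n_hidden = 6, 4
    W = rng.normal(scale=0.1, size=(n_visible, n_hidden))   # visible-to-hidden weights
    b_h = np.zeros(n_hidden)                                 # hidden biases

    v = rng.integers(0, 2, size=n_visible)     # a binary visible vector
    p_h_given_v = sigmoid(v @ W + b_h)         # one independent Bernoulli per hidden unit
    h_sample = (rng.random(n_hidden) < p_h_given_v).astype(int)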

Bayesian inference using intermediate distribution based on coarse multiscale model for time fractional diffusion equation [article]

Lijian Jiang, Na Ou
2017 arXiv   pre-print
For Bayesian inference, we use GMsFEM and a least-squares stochastic collocation method to obtain a reduced coarse model based on the intermediate distribution. ... A few numerical examples for time fractional diffusion equations are carried out to demonstrate the performance of the proposed method with applications to Bayesian inversion. ... Identification of a mixed random field. ...
arXiv:1706.10224v1 fatcat:n2bbees7vze4zb53k4d37wrzdi

A Bayesian nonparametric approach for uncovering rat hippocampal population codes during spatial navigation

Scott W. Linderman, Matthew J. Johnson, Matthew A. Wilson, Zhe Chen
2016 Journal of Neuroscience Methods  
For the HDP-HMM, MCMC-based inference with Hamiltonian Monte Carlo (HMC) hyperparameter sampling is flexible and efficient, and outperforms VB and MCMC approaches with hyperparameters set by empirical ... Specifically, we apply a hierarchical Dirichlet process-hidden Markov model (HDP-HMM) using two Bayesian inference methods, one based on Markov chain Monte Carlo (MCMC) and the other based on variational ... In MCMC inference, the estimate is asymptotically unbiased; however, if the Markov chain mixes slowly, the estimate's variance can be inaccurate. ... (A generic MCMC sketch follows this entry.)
doi:10.1016/j.jneumeth.2016.01.022 pmid:26854398 pmcid:PMC4801699 fatcat:aqgqsrkgzjg2rlc2ygwtl7h4eu
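The entry compares MCMC and variational inference for the HDP-HMM. The paper's samplers (Gibbs moves with HMC hyperparameter updates) are substantially more involved; the sketch below only illustrates the generic MCMC idea with a random-walk Metropolis sampler on an assumed toy posterior, so the target density, proposal scale, and iteration counts are all illustrative assumptions.

    # Minimal sketch of random-walk Metropolis sampling from a toy 1-D posterior.
    import numpy as np

    def log_post(theta):
        return -0.5 * (theta - 2.0) ** 2 / 0.5   # unnormalized Gaussian log-density (assumed target)

    rng = np.random.default_rng(3)
    theta, samples = 0.0, []
    for _ in range(5000):
        prop = theta + rng.normal(scale=0.5)                          # symmetric proposal
        if np.log(rng.random()) < log_post(prop) - log_post(theta):   # accept / reject
            theta = prop
        samples.append(theta)

    print(np.mean(samples[1000:]))  # posterior-mean estimate after discarding burn-in

A much smaller proposal scale would make this chain mix slowly, which is exactly the failure mode the abstract notes for MCMC-based estimates.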
Showing results 1 — 15 out of 5,633 results