Fast Parallel SAME Gibbs Sampling on General Discrete Bayesian Networks
[article]
2015
arXiv
pre-print
A fundamental task in machine learning and related fields is to perform inference on Bayesian networks. ...
Gibbs sampling is one of the most accurate approaches and provides unbiased samples from the posterior but it has historically been too expensive for large models. ...
We extend that result by implementing a general-purpose Gibbs sampler that can be applied to arbitrary discrete graphical models. ... SAME is a variant of MCMC where one ...
arXiv:1511.06416v1
fatcat:hy6ljcuecvcjpmd62qamk7w5pa
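For orientation on the plain (non-SAME) baseline this entry builds on, a minimal single-site Gibbs sweep over a toy discrete chain A -> B -> C with C observed might look like the sketch below; all CPTs and names are hypothetical, and the paper's SAME state replication and GPU parallelism are not shown.

```python
# Minimal single-site Gibbs sampler on a toy binary chain A -> B -> C
# with C observed. Hypothetical CPTs; SAME replication not shown.
import numpy as np

rng = np.random.default_rng(0)
p_a = np.array([0.6, 0.4])                   # P(A)
p_b_a = np.array([[0.7, 0.3], [0.2, 0.8]])   # P(B | A)
p_c_b = np.array([[0.9, 0.1], [0.4, 0.6]])   # P(C | B)

def gibbs(n_sweeps, c_obs=1):
    a, b = 0, 0                              # arbitrary initial state
    trace = []
    for _ in range(n_sweeps):
        # Resample A from P(A | B=b) ∝ P(A) P(b | A)
        pa = p_a * p_b_a[:, b]
        a = rng.choice(2, p=pa / pa.sum())
        # Resample B from P(B | A=a, C=c_obs) ∝ P(B | a) P(c_obs | B)
        pb = p_b_a[a] * p_c_b[:, c_obs]
        b = rng.choice(2, p=pb / pb.sum())
        trace.append((a, b))
    return np.array(trace)

print(gibbs(20_000)[1000:].mean(axis=0))     # posterior marginals of A, B
```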
Building fast Bayesian computing machines out of intentionally stochastic, digital parts
[article]
2014
arXiv
pre-print
We find that by connecting stochastic digital components according to simple mathematical rules, one can build massively parallel, low precision circuits that solve Bayesian inference problems and are ...
Here we show how to build fast Bayesian computing machines using intentionally stochastic, digital parts, narrowing this efficiency gap by multiple orders of magnitude. ...
Acknowledgements The authors would like to acknowledge Tomaso Poggio, Thomas Knight, Gerald Sussman, Rakesh Kumar and Joshua Tenenbaum for numerous helpful discussions and comments on early drafts, and ...
arXiv:1402.4914v1
fatcat:mnjmxywzyrgo5avrttcvsxosri
Fast Sampling for Bayesian Max-Margin Models
[article]
2016
arXiv
pre-print
Bayesian max-margin models have shown superiority in various practical applications, such as text categorization, collaborative prediction, social network link prediction and crowdsourcing, and they conjoin ...
However, Monte Carlo sampling for these models still remains challenging, especially for applications that involve large-scale datasets. ...
Fast sampling for Gibbs iSVM: We develop the fast sampling method for Gibbs iSVM by incorporating the stochastic subgradient MCMC method within the loop of a Gibbs sampler. ...
arXiv:1504.07107v5
fatcat:kdbt2e5zm5gf7jwh57or2hmzxy
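As a rough sketch of what a stochastic-subgradient MCMC inner step can look like inside a Gibbs loop, here is a stochastic-(sub)gradient Langevin update on a toy hinge-loss posterior; this is an assumed form for illustration, not the paper's actual sampler.

```python
# Stochastic-(sub)gradient Langevin step: the kind of inner update that
# can replace an exact conditional draw inside a Gibbs loop. Assumed
# form on a toy hinge-loss posterior; not the paper's sampler.
import numpy as np

rng = np.random.default_rng(1)

def sg_langevin_step(theta, stoch_subgrad_logpost, step):
    """theta <- theta + (step/2) * g(theta) + N(0, step I), where g is
    a stochastic subgradient of the log posterior on a minibatch."""
    noise = rng.normal(scale=np.sqrt(step), size=theta.shape)
    return theta + 0.5 * step * stoch_subgrad_logpost(theta) + noise

# Toy hinge-loss "posterior" with a Gaussian prior (hypothetical data).
X = rng.normal(size=(100, 3)); y = rng.choice([-1.0, 1.0], size=100)
def g(theta, batch=10):
    idx = rng.integers(len(X), size=batch)
    margin = y[idx] * (X[idx] @ theta)
    viol = (margin < 1)[:, None]             # hinge subgradient support
    sub = (viol * (y[idx][:, None] * X[idx])).sum(axis=0)
    return -theta + (len(X) / batch) * sub   # prior grad + scaled subgrad

theta = np.zeros(3)
for _ in range(500):
    theta = sg_langevin_step(theta, g, step=1e-3)
print(theta)
```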
Bayesian Inference of Gene Regulatory Network
[chapter]
2020
Bayesian Inference on Complicated Data
In this chapter, we introduce GRN modeling using a hierarchical Bayesian network and then use Gibbs sampling to identify network variables. ...
complexity ensures fast convergence to reliable results. ...
Acknowledgements Funding for open access charge: Virginia Tech's Open Access Subvention Fund (VT OASF). ...
doi:10.5772/intechopen.88799
fatcat:c64rtict35aj5o376amu3l3wyy
Big Learning with Bayesian Methods
[article]
2017
arXiv
pre-print
Bayesian methods represent one important class of statistical methods for machine learning, with substantial recent developments on adaptive, flexible and scalable Bayesian learning. ...
, regularized Bayesian inference for improving the flexibility via posterior regularization, and scalable algorithms and systems based on stochastic subsampling and distributed computing for dealing with ...
The early asynchronous Gibbs sampler [68] is highly parallel by sampling all variables simultaneously on separate processors. ...
arXiv:1411.6370v2
fatcat:zmxse4kkqjgffkricevyumaoiu
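To illustrate the idea behind the asynchronous sampler cited as [68], the sketch below resamples every site of a toy Ising grid simultaneously from the previous state; such synchronous/asynchronous chains are highly parallel but need not target the exact posterior. Names and parameters are hypothetical.

```python
# Synchronous ("Hogwild"-style) Gibbs sweep on a toy Ising grid: every
# site is resampled at once from the previous state, the idea behind
# the asynchronous sampler cited as [68]. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(2)

def parallel_gibbs_sweep(spins, beta=0.5):
    # Sum of the four grid neighbours, computed for all sites at once.
    nbrs = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
            np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nbrs))  # P(s=+1 | nbrs)
    return np.where(rng.random(spins.shape) < p_up, 1, -1)

spins = rng.choice([-1, 1], size=(32, 32))
for _ in range(100):
    spins = parallel_gibbs_sweep(spins)
print(spins.mean())
```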
Autonomous robot controller using bitwise Gibbs sampling
2016
2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)
The VHDL specification of the circuit implementation of this controller is based on stochastic computation to perform Bayesian inference at a low energy cost. ...
This controller uses a bitwise version of the Gibbs sampling algorithm to select commands so the robot can adapt its course of action and avoid perceived obstacles in the environment. ...
This first generation of Bayesian machines was based on the exhaustive inference paradigm: scanning all possible values of the discrete search space. ...
doi:10.1109/icci-cc.2016.7862096
dblp:conf/IEEEicci/CanillasLFVM16
fatcat:lyy7vwr27zhc5f3k4ue4vhn2nq
A bi-partite generative model framework for analyzing and simulating large scale multiple discrete-continuous travel behaviour data
[article]
2019
arXiv
pre-print
Without loss of generality, we consider a restricted Boltzmann machine (RBM) based algorithm with multiple discrete-continuous layers, formulated as a variational Bayesian inference optimization problem ...
We show parameter stability from model analysis and simulation tests on an open dataset with multiple discrete-continuous dimensions from a data size of 293,330 observations. ...
or more Gibbs sampling steps drawn to approximate the equilibrium energy: $\langle x s \rangle_{p(x,s)} \approx \tilde{x}\tilde{s}$ (20), where $\tilde{x}\tilde{s}$ are generated input samples multiplied by the generated latent variable samples from the Gibbs sample ...
arXiv:1901.06415v2
fatcat:zz7gdfe565avrjc3faejhk3zne
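A minimal block-Gibbs step for a binary RBM, the kind of update the snippet's equation (20) is approximating, could be sketched as follows; shapes and parameters are hypothetical.

```python
# One block-Gibbs step of a binary RBM: sample latents s given inputs
# x, then a reconstruction, whose products estimate the equilibrium
# statistics <xs> of Eq. (20). Hypothetical shapes; sketch only.
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def rbm_gibbs_step(x, W, b, c):
    p_s = sigmoid(x @ W + c)                           # P(s=1 | x)
    s = (rng.random(p_s.shape) < p_s).astype(float)
    p_x = sigmoid(s @ W.T + b)                         # P(x=1 | s)
    x_neg = (rng.random(p_x.shape) < p_x).astype(float)
    return x_neg, s

W = rng.normal(scale=0.1, size=(6, 4))                 # 6 visible, 4 latent
b, c = np.zeros(6), np.zeros(4)
x = rng.integers(0, 2, size=(10, 6)).astype(float)
x_neg, s = rbm_gibbs_step(x, W, b, c)
print((x_neg.T @ s / len(x_neg)).shape)                # Monte Carlo <xs>
```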
Using Deep Neural Network Approximate Bayesian Network
[article]
2018
arXiv
pre-print
Experiment results on several public Bayesian Network datasets show that a Deep Neural Network is capable of learning the joint probability distribution of a Bayesian Network by learning from a few observation ...
Another contribution of our work is that we have shown discriminative model like Deep Neural Network can approximate generative model like Bayesian Network. ...
C proposed a more efficient parallel Gibbs sampling algorithm running on GPU using state replication (State Augmented Marginal Estimation) [23]. ...
arXiv:1801.00282v2
fatcat:ppf6csgjwjhx3pajl6ak23zzw4
Operations for Learning with Graphical Models
[article]
1994
arXiv
pre-print
This includes versions of linear regression, techniques for feed-forward networks, and learning Gaussian and discrete Bayesian networks from data. ...
Two standard algorithm schemas for learning are reviewed in a graphical framework: Gibbs sampling and the expectation maximization algorithm. ...
These ideas were presented in formative stages at Snowbird 1993 (Neural Networks for Computing), April 1993, and to the Bayesian Analysis in Expert Systems (BAIES) group in Pavia, Italy, June 1993. ...
arXiv:cs/9412102v1
fatcat:7ysut5wo6nfwnaco4rycbedpva
Vida: How to Use Bayesian Inference to De-anonymize Persistent Communications
[chapter]
2009
Lecture Notes in Computer Science
We present the Vida family of abstractions of anonymous communication systems, model them probabilistically and apply Bayesian inference to extract patterns of communications and user profiles. ...
The first is a very generic Vida Black-box model that can be used to analyse information about all users in a system simultaneously, while the second is a simpler Vida Red-Blue model, that is very efficient ...
The authors would like to thank the participants of the second UK anonymity meet-up in 2008, and in particular Andrei Serjantov, Ben Laurie, and Tom Chothia for their valuable comments on this research ...
doi:10.1007/978-3-642-03168-7_4
fatcat:svehdc4beba4labv5l75a65cce
Probabilistic Models for Text Mining
[chapter]
2012
Mining Text Data
The chapter focuses more on the fundamental probabilistic techniques, and also covers their various applications to different text mining problems. ...
In [49], parallel algorithms for LDA are based on the Gibbs sampling algorithm. Two versions of the algorithm, AD-LDA and HD-LDA, are proposed. ...
There are also some parallel learning algorithms for fast LDA computation. [47] proposes a parallel version of the variational EM algorithm for LDA. ...
doi:10.1007/978-1-4614-3223-4_8
fatcat:2ryk6bv3ovgv7mtuyfak3pxiyi
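For reference, the collapsed-Gibbs sweep that AD-LDA parallelizes is compact; the sketch below is a hypothetical implementation of one sweep, with AD-LDA's document sharding and count re-synchronization described only in the comment.

```python
# One sweep of collapsed Gibbs for LDA. In AD-LDA each processor runs
# this on its own document shard against a stale copy of (n_kw, n_k),
# and word-topic counts are merged afterwards. Hypothetical sketch.
import numpy as np

rng = np.random.default_rng(4)

def lda_gibbs_sweep(docs, z, n_dk, n_kw, n_k, alpha=0.1, beta=0.01):
    V = n_kw.shape[1]
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]                      # remove current assignment
            n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            k = rng.choice(len(n_k), p=p / p.sum())
            z[d][i] = k                      # record the new assignment
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

docs = [[0, 1, 2, 1], [2, 3, 3, 0]]          # toy corpus of word ids
K, V = 2, 4
z = [[int(rng.integers(K)) for _ in doc] for doc in docs]
n_dk = np.zeros((len(docs), K)); n_kw = np.zeros((K, V)); n_k = np.zeros(K)
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        n_dk[d, z[d][i]] += 1; n_kw[z[d][i], w] += 1; n_k[z[d][i]] += 1
for _ in range(50):
    lda_gibbs_sweep(docs, z, n_dk, n_kw, n_k)
```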
A Gibbs sampler for conductivity imaging and other inverse problems
2012
Image Reconstruction from Incomplete Data VII
Gibbs samplers have many desirable theoretical properties, but also have the pesky requirement that conditional distributions be available. ...
While Gibbs sampling does not necessarily give a fast method for computational inference, the sampler is now amenable to acceleration techniques that we expect will produce a fast computational route ...
doi:10.1117/12.931111
fatcat:oghuvwtvyjd7pm6spdnudx6kra
Constraint-based causal discovery with mixed data
2018
International Journal of Data Science and Analytics
In experiments on simulated Bayesian networks, we employ the PC algorithm with different conditional independence tests for mixed data and show that the proposed approach outperforms alternatives in terms ...
Such tests can then be directly used by existing constraint-based methods with mixed data, such as the PC and FCI algorithms for learning Bayesian networks and maximal ancestral graphs, respectively. ...
The research leading to these results has received funding from the European Research ...
Compliance with ethical standards Conflict of interest On behalf of all authors, the corresponding author states ...
doi:10.1007/s41060-018-0097-y
pmid:30957008
pmcid:PMC6428307
dblp:journals/ijdsa/TsagrisBLT18
fatcat:2mn5wm77fzhavckbixzc7weyfq
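How such tests plug into constraint-based learning is easiest to see in the PC skeleton phase; below is a minimal hypothetical sketch in which ci_test stands for any conditional-independence test, such as the likelihood-ratio tests for mixed data proposed here (the tests themselves are not implemented).

```python
# Skeleton phase of the PC algorithm: start from the complete graph and
# remove edge (x, y) whenever some conditioning set S of growing size
# renders x and y conditionally independent. Hypothetical sketch; the
# ci_test(x, y, S) -> bool callable is any CI test for the data type.
from itertools import combinations

def pc_skeleton(nodes, ci_test, max_cond=2):
    adj = {v: set(nodes) - {v} for v in nodes}
    for size in range(max_cond + 1):
        for x in nodes:
            for y in list(adj[x]):
                if y not in adj[x]:          # edge already removed
                    continue
                others = adj[x] - {y}
                for S in combinations(sorted(others), size):
                    if ci_test(x, y, set(S)):
                        adj[x].discard(y); adj[y].discard(x)
                        break
    return adj

nodes = ["A", "B", "C"]
indep = lambda x, y, S: False                # dummy test: keeps all edges
print(pc_skeleton(nodes, indep))
```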
Happiness as a Driver of Risk-avoiding Behaviour: Theory and an Empirical Study of Seatbelt Wearing and Automobile Accidents
2014
Economica
However, we can define an equivalence class on the space of Bayesian networks such that Bayesian networks within the same class imply the same conditional independence structure. ...
While the general method of Gibbs sampling is well-established, the requirement of acyclicity in Bayesian networks makes designing a Gibbs sampler difficult in this context. ...
This approach samples total orders rather than Bayesian networks directly. Often this improves the mixing of the sampler. ...
doi:10.1111/ecca.12094
fatcat:65prnkivsnfw7o6ctyzkujheea
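The order-sampling idea in this entry's snippets can be sketched as Metropolis over permutations: any DAG consistent with a total order is automatically acyclic, so the sampler never has to check acyclicity. The score function below is a hypothetical placeholder for a log marginal score of an order.

```python
# Metropolis sampler over total orders (permutations) of the variables
# with adjacent-transposition proposals. 'score' is any log marginal
# score of an order (a hypothetical placeholder is used below).
import numpy as np

rng = np.random.default_rng(5)

def order_mcmc(score, n_vars, n_steps):
    order = rng.permutation(n_vars)
    for _ in range(n_steps):
        i = rng.integers(n_vars - 1)
        prop = order.copy()
        prop[i], prop[i + 1] = prop[i + 1], prop[i]
        # Symmetric proposal, so the acceptance ratio is just the
        # score difference on the log scale.
        if np.log(rng.random()) < score(prop) - score(order):
            order = prop
        yield order.copy()

# Toy score: prefer orders close to the identity permutation.
toy_score = lambda o: -float(np.abs(np.arange(len(o)) - o).sum())
samples = list(order_mcmc(toy_score, 5, 1000))
print(samples[-1])
```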
Local Expectation Gradients for Doubly Stochastic Variational Inference
[article]
2015
arXiv
pre-print
Furthermore, the proposed algorithm has interesting similarities with Gibbs sampling but at the same time, unlike Gibbs sampling, it can be trivially parallelized. ...
We introduce local expectation gradients, a general-purpose stochastic variational inference algorithm for constructing stochastic gradients through sampling from the variational distribution. ...
Furthermore, the local expectation algorithm has striking similarities with Gibbs sampling, with the important difference that, unlike Gibbs sampling, it can be trivially parallelized. ...
arXiv:1503.01494v1
fatcat:butrtmilqnelrb2y25qfwbmgty
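A minimal sketch of the local-expectation idea for a fully factorized Bernoulli variational distribution: each coordinate's gradient is an exact local expectation over that variable's two values, estimated with samples of the remaining coordinates, and the per-coordinate computations are independent, hence trivially parallelizable. All names are hypothetical.

```python
# Local expectation gradients for a factorized Bernoulli q(z): the
# gradient of E_q[f(z)] w.r.t. each mean p_i is E[f | z_i=1] minus
# E[f | z_i=0], estimated by sampling the other coordinates from q.
import numpy as np

rng = np.random.default_rng(6)

def local_expectation_grad(f, probs, n_samples=200):
    d = len(probs)
    grad = np.zeros(d)
    for _ in range(n_samples):
        z = (rng.random(d) < probs).astype(float)
        for i in range(d):       # these d computations are independent
            z1, z0 = z.copy(), z.copy()
            z1[i], z0[i] = 1.0, 0.0
            # d/dp_i of p_i f(z1) + (1 - p_i) f(z0)
            grad[i] += f(z1) - f(z0)
    return grad / n_samples

f = lambda z: float(z.sum() ** 2)            # hypothetical test function
print(local_expectation_grad(f, np.array([0.2, 0.5, 0.8])))
```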
Showing results 1 — 15 out of 1,991 results