A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is application/pdf.
Exact Inference of Hidden Structure from Sample Data in Noisy-OR Networks
[article]
2013
arXiv
pre-print
In the literature on graphical models, there has been increased attention paid to the problems of learning hidden structure (see Heckerman [H96] for a survey) and causal mechanisms from sample data [H96, ...
In this work, we examine some restricted settings in which we can perfectly reconstruct the hidden structure solely on the basis of observed sample data. ...
Mansour was supported in part by a grant from the Israel Science Foundation. ...
arXiv:1301.7391v1
fatcat:gu37s2u4efe6djdc6cggbcfdze
Unsupervised Learning of Noisy-Or Bayesian Networks
[article]
2013
arXiv
pre-print
This paper considers the problem of learning the parameters in Bayesian networks of discrete variables with known structure and hidden variables. ...
In particular, we show how to learn the parameters for a family of bipartite noisy-or Bayesian networks. ...
Research supported in part by a Google Faculty Research Award, CIMIT award 12-1262, grant UL1 TR000038 from NCATS, and by an NSERC Postgraduate Scholarship. ...
arXiv:1309.6834v1
fatcat:siz76cj5kreatflvwvuhjo4gfq
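Both entries above concern bipartite noisy-OR networks. The standard noisy-OR conditional distribution they build on has a simple closed form; a minimal sketch (the function name, parameters, and numbers below are illustrative, not taken from either paper):

```python
import numpy as np

def noisy_or_prob(parents, failure_probs, leak=0.0):
    """P(child = 1 | parents) under the noisy-OR model: each active
    parent i independently fails to trigger the child with probability
    failure_probs[i]; `leak` is the probability the child activates
    with no active parent."""
    active = np.asarray(parents, dtype=bool)
    fail_all = np.prod(np.asarray(failure_probs)[active])  # all active causes fail
    return 1.0 - (1.0 - leak) * fail_all

# Two active causes with failure probabilities 0.2 and 0.5, no leak:
# P(child = 1) = 1 - 0.2 * 0.5 = 0.9
p = noisy_or_prob([1, 0, 1], [0.2, 0.3, 0.5])
```

The factorized form is what makes these networks attractive: the conditional distribution over k parents needs only k failure probabilities (plus a leak term) rather than 2^k table entries.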
Variational Learning for Noisy-OR Component Analysis
[chapter]
2005
Proceedings of the 2005 SIAM International Conference on Data Mining
In this work we investigate a class of latent factor models with hidden noisy-or units that let us decouple high dimensional vectors of observable binary random variables using a 'small' number of hidden ...
Since learning such models from data is intractable, we develop a variational approximation. ...
Learning of Noisy-or Networks with Hidden Units: the problem of learning noisy-or bipartite networks has been addressed only in fully observable settings, that is, when both sources and observations ...
doi:10.1137/1.9781611972757.33
dblp:conf/sdm/SingliarH05
fatcat:j5u6moer3vaofmd7ihrc7ctw3y
Systems biology informed deep learning for inferring parameters and hidden dynamics
2020
PLoS Computational Biology
Having a reliable and robust algorithm for parameter inference and prediction of the hidden dynamics has been one of the core subjects in systems biology, and is the focus of this study. ...
Using few scattered and noisy measurements, we are able to infer the dynamics of unobserved species, external forcing, and the unknown model parameters. ...
Cell survival: dynamics inferred from noisy observations, compared with the exact solution. Predictions are performed at equally spaced time instants in the interval of 0–60 hours. ...
doi:10.1371/journal.pcbi.1007575
pmid:33206658
fatcat:dlcrsdaljnd4fbfkbk5pxklhsa
Systems biology informed deep learning for inferring parameters and hidden dynamics
[article]
2019
bioRxiv
pre-print
Having a reliable and robust algorithm for parameter inference and prediction of the hidden dynamics has been one of the core subjects in systems biology, and is the focus of this study. ...
In this work, we present a novel "systems-informed neural network" to infer the dynamics of experimentally unobserved species as well as the unknown parameters in the system of equations. ...
Acknowledgments
This work was supported by the National Institutes of Health U01HL142518. ...
doi:10.1101/865063
fatcat:vdpivw7parfwnplq4clfe2dgje
Bayesian Networks for Brain-Computer Interfaces: A Survey
[article]
2022
arXiv
pre-print
This survey covers related existing works from a relatively high-level perspective, classifies the models and algorithms involved, and summarizes the application of Bayesian Networks or their variants ...
Deploying Bayesian Networks in Brain-Computer Interfaces has therefore become an increasingly popular approach in BCI research. ...
Inference: inference in Bayesian Networks computes the probability of a query when the graphical structure is already given. Inference methods can be exact or approximate. ...
arXiv:2206.07487v1
fatcat:cxhkagpjwncrroanp3lbgzozaq
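The exact-versus-approximate distinction drawn in the snippet above can be illustrated with exact inference by enumeration on a toy network (all variables and probabilities below are made up for illustration):

```python
# Hypothetical network A -> C <- B: priors P(A), P(B) and a CPT P(C=1 | A, B).
# Exact inference sums the joint distribution over the unobserved variable.
p_a = {0: 0.7, 1: 0.3}
p_b = {0: 0.6, 1: 0.4}
p_c1_given_ab = {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.6, (1, 1): 0.9}

def posterior_a_given_c1():
    """P(A | C = 1), computed by enumerating over the hidden variable B."""
    joint = {a: sum(p_a[a] * p_b[b] * p_c1_given_ab[(a, b)] for b in (0, 1))
             for a in (0, 1)}
    z = joint[0] + joint[1]  # P(C = 1), the normalizing constant
    return {a: joint[a] / z for a in (0, 1)}
```

Enumeration like this is exponential in the number of hidden variables, which is why the surveys and papers listed here turn to approximate methods (variational inference, MCMC) for larger networks.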
Denoising and Untangling Graphs Using Degree Priors
2003
Neural Information Processing Systems
This paper addresses the problem of untangling hidden graphs from a set of noisy detections of undirected edges. ...
Exact inference in the model is intractable; we present an efficient approximate inference algorithm to compute edge appearance posteriors. ...
Introduction and motivation The inference of hidden graphs from noisy edge appearance data is an important problem with obvious practical application. ...
dblp:conf/nips/MorrisF03
fatcat:vz4hvjozp5gspbyjmpzcxzk5tu
Fast Variational Inference for Large-scale Internet Diagnosis
2007
Neural Information Processing Systems
The inference is fast enough to analyze network logs with billions of entries in a matter of hours. ...
We use this fast inference to diagnose a time series of anomalous HTTP requests taken from a real web service. ...
We are currently using variational stochastic gradient descent to analyze logs that contain billions of requests. We are not aware of any other applications of variational inference at this scale. ...
dblp:conf/nips/PlattKM07
fatcat:ad346fr2srgp5etsud7lhijwcy
Graph Neural Networks Intersect Probabilistic Graphical Models: A Survey
[article]
2022
arXiv
pre-print
Graph Neural Networks (GNNs) are inference methods developed in recent years that are attracting growing attention due to their effectiveness and flexibility in solving inference and learning problems ...
Graphs are a powerful data structure to represent relational data and are widely used to describe complex real-world data structures. ...
Graph Structure Learning with PGMs: real-world networks may partially contain noisy structures and edges, or the graph that describes observed data may be missing in some applications. ...
arXiv:2206.06089v1
fatcat:twv6sh5f3ngkrjfiyoger6vuuq
Belief networks, hidden Markov models, and Markov random fields: A unifying view
1997
Pattern Recognition Letters
The use of graphs to represent independence structure in multivariate probability models has been pursued in a relatively independent fashion across a wide variety of research disciplines since the beginning of this century. ...
The research described in this paper was supported in part by the California Institute of Technology and the Air Force Office of Scientific Research under grant no. ...
doi:10.1016/s0167-8655(97)01050-7
fatcat:fpuec5yuwfgijnxqotm6wpnp6u
Plan Recognition Using Statistical–Relational Models
[chapter]
2014
Plan, Activity, and Intent Recognition
While the former cannot handle uncertainty in the data, the latter cannot handle structured representations. ...
Neither of these formalisms is suited for abductive reasoning because of the deductive nature of the underlying logical inference. ...
All views expressed are solely those of the authors and do not necessarily reflect the opinions of ARO, DARPA, NSF or any other government agency. ...
doi:10.1016/b978-0-12-398532-3.00003-8
fatcat:eevob56uqzafxndf55l7yzthou
Challenges in Markov chain Monte Carlo for Bayesian neural networks
[article]
2021
arXiv
pre-print
This paper initially reviews the main challenges in sampling from the parameter posterior of a neural network via MCMC. Such challenges culminate in a lack of convergence to the parameter posterior. ...
Nevertheless, this paper shows that a non-converged Markov chain, generated via MCMC sampling from the parameter space of a neural network, can yield via Bayesian marginalization a valuable posterior predictive ...
ACKNOWLEDGEMENTS Research sponsored by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the US Department of Energy under contract ...
arXiv:1910.06539v6
fatcat:a7yyjtpsxvcd5okxwclt5gm3xe
Variational Training for Large-Scale Noisy-OR Bayesian Networks
2019
Conference on Uncertainty in Artificial Intelligence
We propose a stochastic variational inference algorithm for training large-scale Bayesian networks, where noisy-OR conditional distributions are used to capture higher-order relationships. ...
Using stochastic gradient updates based on our variational bounds, we learn noisy-OR Bayesian networks orders of magnitude faster than was possible with prior Monte Carlo learning algorithms, and provide ...
Acknowledgements: This research was supported in part by NSF CAREER Award No. IIS-1758028. ...
dblp:conf/uai/0001CNYZXS19
fatcat:ibqkwflconhsfjjx3pxi67266m
Approximate inference for medical diagnosis
1999
Pattern Recognition Letters
A drawback is that Bayesian networks become intractable for exact computation if a large medical domain were modeled in detail. ...
This has obstructed the development of a useful system for internal medicine. ...
Acknowledgements: This research is supported by the Technology Foundation STW, applied science division of NWO, and the technology programme of the Ministry of Economic Affairs ...
doi:10.1016/s0167-8655(99)00090-2
fatcat:nagznp23bzfbthzwnx4m4ucme4
Variational Learning in Mixed-State Dynamic Graphical Models
[article]
2013
arXiv
pre-print
The number of computations needed for exact inference is exponential in the sequence length, so we derive an approximate variational inference technique that can also be used to learn the parameters of ...
We present a mixed-state dynamic graphical model in which a hidden Markov model drives a linear dynamic system. ...
Acknowledgments: This work was supported in part by National Science Foundation grant IRI-96-34618 and in part by the Army Research Laboratory Cooperative Agreement ...
arXiv:1301.6731v1
fatcat:ju24ji3gzjbi5hkg7d62jd22bi
Showing results 1 — 15 out of 15,112 results