
The Linearization of Belief Propagation on Pairwise Markov Networks [article]

Wolfgang Gatterbauer
2016 arXiv   pre-print
The present paper generalizes all prior work and derives an approach that approximates loopy BP on any pairwise MRF by reducing it to the problem of solving a linear equation system.  ...  was shown to work well for the problem of node classification.  ...  I would like to thank Christos Faloutsos for very convincingly persuading me of the power of linear algebra and for his continued support.  ... 
arXiv:1502.04956v2 fatcat:jsklf4pltverljmngavwcgohk4
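The linearization described in this abstract replaces iterative message passing with a single linear solve. A minimal sketch of the idea (illustrative numbers and a simplified update of my own, not the paper's exact formulation):

```python
import numpy as np

# Toy 3-node chain graph: adjacency matrix A and centered prior beliefs phi.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
phi = np.array([0.1, 0.0, -0.1])  # explicit per-node evidence
eps = 0.1                         # coupling strength; keep spectral radius of eps*A below 1

# Linearized BP: final beliefs b solve the linear system (I - eps*A) b = phi,
# instead of running loopy message passing to convergence.
b = np.linalg.solve(np.eye(3) - eps * A, phi)
```

Because the resulting system is linear, convergence and sensitivity can be analyzed with standard linear algebra, which is the point of the linearization.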

Mutual Conditional Independence and its Applications to Inference in Markov Networks [article]

Niharika Gauraha
2016 arXiv   pre-print
In this article we introduce the concept of mutual conditional independence relationship among elements of an independent set of a Markov network.  ...  The fundamental concepts underlying Markov networks are conditional independence and the set of rules called Markov properties that translate conditional independence constraints into graphs.  ...  The three Markov properties usually considered for Markov networks are the pairwise, local and global Markov properties.  ... 
arXiv:1603.03733v1 fatcat:qghhwy77fvfvjpatvqjumfbdf4
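The concept can be checked numerically on a tiny example: in a chain network x1 - x2 - x3, the independent set {x1, x3} is conditionally independent given x2. A small enumeration sketch (my own toy potentials, not from the article):

```python
import itertools
import math

# Chain Markov network x1 - x2 - x3 with pairwise potentials:
# P(x) ∝ exp(J * (x1*x2 + x2*x3)),  x_i in {0, 1}.
# {x1, x3} is an independent set: no edge connects them directly.
J = 0.7

def w(x):  # unnormalized weight of a configuration
    return math.exp(J * (x[0] * x[1] + x[1] * x[2]))

states = list(itertools.product((0, 1), repeat=3))
Z = sum(w(s) for s in states)

def p(x):  # joint probability
    return w(x) / Z

# Verify P(x1=1, x3=1 | x2=1) = P(x1=1 | x2=1) * P(x3=1 | x2=1)
p_x2 = sum(p(s) for s in states if s[1] == 1)
joint = p((1, 1, 1)) / p_x2
marg1 = sum(p(s) for s in states if s[1] == 1 and s[0] == 1) / p_x2
marg3 = sum(p(s) for s in states if s[1] == 1 and s[2] == 1) / p_x2
```

Conditioning on x2 separates the chain, so the conditional joint over the independent set factorizes exactly.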

Impact of Noise on Molecular Network Inference

Radhakrishnan Nagarajan, Marco Scutari, Alberto de la Fuente
2013 PLoS ONE  
Subsequently, the impact of noise on two popular constraint-based Bayesian network structure learning algorithms such as Grow-Shrink (GS) and Incremental Association Markov Blanket (IAMB) that implicitly  ...  Analytical expressions elucidating the impact of discrepancies in noise variance on pairwise dependencies and conditional dependencies for special cases of these motifs are presented.  ...  Given this set of Markov blankets, identifying the correct network structure is impossible.  ... 
doi:10.1371/journal.pone.0080735 pmid:24339879 pmcid:PMC3855153 fatcat:3x75e5547rfibgre5aemwrtf2u
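The attenuation of pairwise dependencies by noise is easy to see in a toy analytic case: for Y = X + ε with X ~ N(0, 1) and independent ε ~ N(0, σ²), corr(X, Y) = 1/√(1 + σ²). A small illustration (my own, not the paper's expressions):

```python
import math

def corr_xy(noise_var):
    # Y = X + eps, X ~ N(0, 1), eps ~ N(0, noise_var), independent:
    # cov(X, Y) = 1 and var(Y) = 1 + noise_var, so corr = 1 / sqrt(1 + noise_var)
    return 1.0 / math.sqrt(1.0 + noise_var)

# The pairwise dependence weakens monotonically as the noise variance grows.
corrs = [corr_xy(v) for v in (0.0, 1.0, 4.0)]
```

Constraint-based learners that threshold such dependencies can therefore miss edges as noise variance increases.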

Inferring species interactions from co-occurrence data with Markov networks [article]

David J. Harris
2015 bioRxiv   pre-print
Here, I apply models from statistical physics, called Markov networks or Markov random fields, that can predict the direct and indirect consequences of any possible species interaction matrix.  ...  Using simulated landscapes with known pairwise interaction strengths, I evaluated Markov networks and six existing approaches.  ...  The Markov network consistently performed best of all.  ... 
doi:10.1101/018861 fatcat:wbnalcsqqraevl22up4qc6fnrq

Inferring species interactions from co-occurrence data with Markov networks

David J. Harris
2016 Ecology  
A linear approximation, based on partial covariances, also performed well as long as the number of sampled locations exceeded the number of species in the data. Indirect effects reliably caused a  ...  Using simulated landscapes with known pairwise interaction strengths, I evaluated Markov networks and several existing approaches.  ...  I then used Gibbs  ...  The results presented above are very promising, as they show that Markov networks can recover much of the variation in species' pairwise interaction strengths  ... 
doi:10.1002/ecy.1605 pmid:27912022 fatcat:dzsnh53kdjfhpfdlfnfutgc5z4
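The "linear approximation based on partial covariances" can be illustrated with a Gaussian analogue: off-diagonal entries of the precision (inverse covariance) matrix vanish for pairs with no direct link, even when their raw covariance is large due to indirect effects. A sketch under those simplifying assumptions (simulated Gaussian data, not the paper's binary presence/absence model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Three "species": 0 and 1 linked directly; 2 tracks 1 only,
# so any 0-2 association is purely indirect.
s0 = rng.normal(size=n)
s1 = s0 + rng.normal(size=n)   # direct link 0-1
s2 = s1 + rng.normal(size=n)   # direct link 1-2
X = np.column_stack([s0, s1, s2])

cov = np.cov(X, rowvar=False)
prec = np.linalg.inv(cov)
# cov[0, 2] is clearly nonzero (indirect effect propagates through s1),
# but prec[0, 2] is near zero: no direct 0-2 interaction.
```

This is why the precision-matrix view can separate direct interactions from indirect co-occurrence, the failure mode of purely pairwise correlation methods.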

Mutual conditional independence and its applications to model selection in Markov networks

Niharika Gauraha, Swapan K. Parui
2020 Annals of Mathematics and Artificial Intelligence  
We introduce the concept of mutual conditional independence in an independent set of a Markov network, and we prove its equivalence to the Markov properties under certain regularity conditions.  ...  The fundamental concepts underlying Markov networks are conditional independence and the set of rules called Markov properties that translate conditional independence constraints into graphs.  ...  Acknowledgements The authors are grateful for the constructive inputs given by anonymous reviewers  ... 
doi:10.1007/s10472-020-09690-7 fatcat:nc5vxjfjojherbxzxascl6pma4

Bayesian neural networks and dimensionality reduction [article]

Deborshee Sen and Theodore Papamarkou and David Dunson
2020 arXiv   pre-print
In conducting non-linear dimensionality reduction and feature learning, it is common to suppose that the data lie near a lower-dimensional manifold.  ...  A class of model-based approaches for such problems includes latent variables in an unknown non-linear regression function; this includes Gaussian process latent variable models and variational auto-encoders  ...  This research was sponsored by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the US Department of Energy under contract DE-AC05  ... 
arXiv:2008.08044v2 fatcat:gdrqqqrw2nbwhclaq2adwe2npi

Hearing the Maximum Entropy Potential of neuronal networks [article]

Rodrigo Cofre, Bruno Cessac (INRIA Sophia Antipolis)
2014 arXiv   pre-print
We provide a method to compute this potential explicitly and exactly as a linear combination of spatio-temporal interactions.  ...  We show that there is a canonical potential whose Gibbs distribution, obtained from the Maximum Entropy Principle (MaxEnt), is the equilibrium distribution of this process.  ...  This work was supported by the French ministry of Research and University of Nice (EDSTIC), INRIA, ERC-NERVI number 227747, KEOPS ANR-CONICYT and European Union Project # FP7-269921 (BrainScales), Renvision  ... 
arXiv:1309.5873v2 fatcat:vtcfu6p7ynb3nb32lkjipef5ii

Improved bounds on the epidemic threshold of exact SIS models on complex networks

Navid Azizan Ruhi, Christos Thrampoulidis, Babak Hassibi
2016 2016 IEEE 55th Conference on Decision and Control (CDC)  
It has been shown that the exact marginal probabilities of infection can be upper bounded by an n-dimensional linear time-invariant system, a consequence of which is that the Markov chain is "fast-mixing".  ...  The SIS (susceptible-infected-susceptible) epidemic model on an arbitrary network, without making approximations, is a 2^n-state Markov chain with a unique absorbing state (the all-healthy state).  ...  The authors would like to thank Ahmed Douik, Anatoly Khina and Ehsan Abbasi for insightful discussions on the subject.  ... 
doi:10.1109/cdc.2016.7798804 dblp:conf/cdc/RuhiTH16 fatcat:pa2dnwh7vjgubhhecwecb2rdrq
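The linear upper bound mentioned in this abstract can be sketched directly: the marginal infection probabilities are dominated by the linear system p(t+1) = ((1-δ)I + βA) p(t), so the chain mixes fast whenever β·λmax(A) < δ. A toy instance (my own numbers):

```python
import numpy as np

# Star graph on 4 nodes: adjacency A, infection rate beta, recovery rate delta.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
beta, delta = 0.1, 0.5

# Linear system that upper-bounds the exact marginal infection probabilities.
M = (1 - delta) * np.eye(4) + beta * A
rho = max(abs(np.linalg.eigvals(M)))  # < 1 iff beta * lambda_max(A) < delta

# Iterating the bound from a heavily infected start drives it to zero.
p = np.full(4, 0.9)
for _ in range(200):
    p = M @ p
```

For the star graph λmax(A) = √3 ≈ 1.73, so β·λmax ≈ 0.17 < δ = 0.5 and the bound decays geometrically.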

ARC: Adversarial Robust Cuts for Semi-Supervised and Multi-label Classification

Sima Behpour, Wei Xing, Brian D. Ziebart
2018 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)  
Unfortunately, the hinge loss used to construct these methods often provides a particularly loose bound on the loss function of interest (e.g., the Hamming loss).  ...  We conduct multi-label and semi-supervised binary prediction experiments that demonstrate the benefits of our approach.  ...  Markov networks can be written as log-linear models when their densities are positive.  ... 
doi:10.1109/cvprw.2018.00255 dblp:conf/cvpr/Behpour18 fatcat:4ehxzmk7trddfpgmqtydxmuo7a
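The log-linear form noted at the end of this snippet can be made concrete on a two-node binary network, where the joint distribution is small enough to enumerate exactly. A minimal sketch (toy parameters of my own):

```python
import itertools
import math

# Pairwise Markov network over binary x1, x2 in {0, 1}, written log-linearly:
# P(x) ∝ exp(theta1*x1 + theta2*x2 + theta12*x1*x2)
theta1, theta2, theta12 = 0.5, -0.3, 1.0

def score(x1, x2):
    return theta1 * x1 + theta2 * x2 + theta12 * x1 * x2

# Exact partition function and joint by enumeration (feasible for tiny models).
Z = sum(math.exp(score(a, b)) for a, b in itertools.product((0, 1), repeat=2))
P = {(a, b): math.exp(score(a, b)) / Z
     for a, b in itertools.product((0, 1), repeat=2)}
```

The positive coupling theta12 makes the agreeing state (1, 1) the most probable, which is the "associative" behavior exploited by graph-cut methods.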

Learning associative Markov networks

Ben Taskar, Vassil Chatalbashev, Daphne Koller
2004 Twenty-first international conference on Machine learning - ICML '04  
Our approach exploits a linear programming relaxation for the task of finding the best joint assignment in such networks, which provides an approximate quadratic program (QP) for the problem of learning  ...  We show that for associative Markov networks over binary-valued variables, this approximate QP is guaranteed to return an optimal parameterization for Markov networks of arbitrary topology.  ...  A pairwise Markov network is simply a Markov network where all of the cliques involve either a single node or a pair of nodes.  ... 
doi:10.1145/1015330.1015444 dblp:conf/icml/TaskarCK04 fatcat:urj5w667erfrhnw6jllxn47hee

High-dimensional structure learning of binary pairwise Markov networks: A comparative numerical study [article]

Johan Pensar, Yingying Xu, Santeri Puranen, Maiju Pesonen, Yoshiyuki Kabashima, Jukka Corander
2019 arXiv   pre-print
In this work, we perform an extensive numerical study comparing the different types of methods on data generated by binary pairwise Markov networks.  ...  Learning the undirected graph structure of a Markov network from data is a problem that has received a lot of attention during the last few decades.  ...  In this work, we will consider the problem of learning the undirected graph structure of pairwise Markov networks over binary variables.  ... 
arXiv:1901.04345v1 fatcat:qq3hu74cafgy5dluzbqb25fd4q
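A device common to several of the compared method families (e.g., pseudolikelihood and nodewise logistic regression) is that in a binary pairwise Markov network the conditional distribution of one node given the rest is logistic in its neighbors. A sketch of that conditional (illustrative parameters of my own, not from the study):

```python
import math

def p_node_on(theta_i, theta_ij, x_neighbors):
    # For a binary pairwise Markov network with x in {0, 1}:
    # P(x_i = 1 | rest) = sigmoid(theta_i + sum_j theta_ij * x_j).
    # Nodewise structure learning fits exactly this logistic model per node.
    field = theta_i + sum(t * x for t, x in zip(theta_ij, x_neighbors))
    return 1.0 / (1.0 + math.exp(-field))

p = p_node_on(0.0, [2.0, -2.0], [1, 1])  # opposing neighbor effects cancel
```

Edges whose fitted coefficients theta_ij are (near) zero are pruned, which is how the conditional view recovers the undirected graph.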

Discovering the Markov network structure [article]

Edith Kovács, Tamás Szántai
2013 arXiv   pre-print
Using the decomposability of the information content, an algorithm is given for discovering the Markov network graph structure endowed by the pairwise Markov property of a given probability distribution  ...  Our algorithm for discovering the pairwise Markov network is illustrated on this example, too.  ...  The pairwise Markov network is given in Figure 2 and Table I.  ... 
arXiv:1307.0643v1 fatcat:syu7brpzpvfrdnoh6hxqjockvm

Large-Scale Classification of Structured Objects using a CRF with Deep Class Embedding [article]

Eran Goldman, Jacob Goldberger
2017 arXiv   pre-print
We model sequences of images as linear-chain CRFs, and jointly learn the parameters from both local-visual features and neighboring classes.  ...  The visual features are computed by convolutional layers, and the class embeddings are learned by factorizing the CRF pairwise potential matrix.  ...  To efficiently train the network, we introduce a pairwise softmax architecture which optimizes a local approximation of the likelihood.  ... 
arXiv:1705.07420v2 fatcat:pjcw534dsnbdjmgdrbjoduucsa
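The linear-chain CRF likelihood used here requires the partition function, which the forward algorithm computes in O(T·K²) time. A minimal log-space sketch with toy scores (my own numbers, two classes and three positions, checked against brute-force enumeration):

```python
import itertools
import numpy as np

# Toy linear-chain CRF: T = 3 positions, K = 2 classes.
# unary[t, k]: local (e.g., visual) score; pairwise[j, k]: transition score.
unary = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 0.0]])
pairwise = np.array([[0.5, -0.5],
                     [-0.5, 0.5]])

# Forward recursion in log-space yields log Z for the CRF likelihood.
alpha = unary[0].copy()
for t in range(1, 3):
    alpha = unary[t] + np.logaddexp.reduce(alpha[:, None] + pairwise, axis=0)
log_Z = np.logaddexp.reduce(alpha)

# Brute-force check over all 2^3 label sequences.
brute = np.logaddexp.reduce([
    sum(unary[t, y[t]] for t in range(3))
    + sum(pairwise[y[t], y[t + 1]] for t in range(2))
    for y in itertools.product(range(2), repeat=3)
])
```

Factorizing the pairwise matrix through learned class embeddings, as the paper proposes, changes how `pairwise` is parameterized but not this inference recursion.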

Learning Deep Structured Models [article]

Liang-Chieh Chen and Alexander G. Schwing and Alan L. Yuille and Raquel Urtasun
2015 arXiv   pre-print
We demonstrate the effectiveness of our algorithm in the tasks of predicting words from noisy images, as well as multi-class classification of Flickr photographs.  ...  Markov random fields (MRFs) are a great mathematical tool to encode such relationships.  ...  ACKNOWLEDGMENTS We thank NVIDIA Corporation for the donation of GPUs used in this research. This work was partially funded by ONR-N00014-14-1-0232.  ... 
arXiv:1407.2538v3 fatcat:2fzubi36mrcdrnba7lqljfbnwe