
Markov Random Field Energy Minimization via Iterated Cross Entropy with Partition Strategy

Jue Wu, Albert C. S. Chung
2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07  
This paper introduces a novel energy minimization method, namely iterated cross entropy with partition strategy (ICEPS), into the Markov random field theory.  ...  We speed up the original cross entropy algorithm by partitioning the MRF site set and ensure its effectiveness by iterating the algorithm.  ...  CONCLUSION: In this paper, we propose a new solver, namely iterated cross entropy with partition strategy (ICEPS), for Markov random field modeling, which is an important tool not only in image segmentation  ... 
doi:10.1109/icassp.2007.366715 dblp:conf/icassp/WuC07 fatcat:5uah4lnmivavlht5lqfu2yz4tq
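The snippet does not spell out ICEPS itself; as a rough illustration of the underlying idea, here is a minimal generic cross-entropy method sketch for minimizing an Ising-style chain energy. The energy function, sample counts, and smoothing constant are all assumptions for this toy example, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x, h=1.0, J=1.0):
    # Toy Ising-style chain energy: a field term favouring +1 spins plus a
    # coupling term favouring agreement between neighbouring sites.
    return -h * np.sum(x) - J * np.sum(x[:-1] * x[1:])

def cross_entropy_minimize(n=20, samples=200, elite_frac=0.1,
                           iters=40, alpha=0.3):
    # Generic cross-entropy method: sample spin vectors from independent
    # Bernoulli site distributions, keep the lowest-energy "elite" samples,
    # and move the sampling distribution toward them.
    p = np.full(n, 0.5)
    for _ in range(iters):
        X = np.where(rng.random((samples, n)) < p, 1, -1)
        E = np.array([energy(x) for x in X])
        elite = X[np.argsort(E)[: int(samples * elite_frac)]]
        p = (1 - alpha) * p + alpha * (elite > 0).mean(axis=0)
    return np.where(p > 0.5, 1, -1)

x_best = cross_entropy_minimize()
```

The paper's contribution, per the snippet, is to partition the MRF site set and iterate; the sketch above updates all sites jointly instead.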

On Learning Conditional Random Fields for Stereo

Christopher J. Pal, Jerod J. Weinman, Lam C. Tran, Daniel Scharstein
2010 International Journal of Computer Vision  
To estimate parameters in Markov random fields (MRFs) via maximum likelihood one usually needs to perform approximate probabilistic inference.  ...  Conditional random fields (CRFs) are discriminative versions of traditional MRFs.  ...  Figure 8 shows histograms of the marginal entropies H(Q(X_i)) during free energy minimization with two sets of parameters, the initial parameters, Θ_v = 1, and the learned Θ_v.  ... 
doi:10.1007/s11263-010-0385-z fatcat:5vbs4bup2rc3jkp5bxsbniq5ti
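The marginal entropies H(Q(X_i)) mentioned in the snippet are straightforward to compute from per-site belief vectors; a minimal sketch (the toy marginals are made up for illustration):

```python
import numpy as np

def marginal_entropies(Q):
    # Q has shape (num_sites, num_labels); row i is the marginal Q(X_i).
    # Returns H(Q(X_i)) = -sum_x Q(X_i = x) * log Q(X_i = x) for each site.
    Q = np.clip(Q, 1e-12, 1.0)  # guard against log(0)
    return -(Q * np.log(Q)).sum(axis=1)

# Toy marginals: a near-deterministic site and a maximally uncertain one.
Q = np.array([[0.99, 0.01],
              [0.50, 0.50]])
H = marginal_entropies(Q)
```

Histogramming these values over all sites, as the snippet describes, shows how concentrated the beliefs become under different parameter settings.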

Accounting for conformational entropy in predicting binding free energies of protein-protein interactions

Hetunandan Kamisetty, Arvind Ramanathan, Chris Bailey-Kellogg, Christopher James Langmead
2010 Proteins: Structure, Function, and Bioinformatics  
(e.g., via Molecular Dynamics simulations [1, 2, 3, 4]), from which approximate entropies and/or enthalpies are derived.  ...  The conversion of energies into probabilities is done through the partition function, Z, whose calculation involves a summation over all conformations because Z = Σ_{c∈C} exp(−E(c)/(k_B T)).  ...  Figure 8: Example Markov random field. In the next iteration of Belief Propagation, the messages will then be  ...  These can be used to compute the beliefs at iteration 1 and this process repeated until the  ... 
doi:10.1002/prot.22894 pmid:21120864 fatcat:ng7dmnsddjgyfkvah4s2yr4p3a
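The partition-function formula in the snippet can be made concrete for a toy system with an enumerable conformation set; the energies below are invented for illustration, and a real ensemble would be far too large to enumerate:

```python
import math

def partition_function(energies, kT=1.0):
    # Z = sum over all conformations c of exp(-E(c) / (k_B * T)).
    return sum(math.exp(-E / kT) for E in energies)

def boltzmann_probability(E_c, energies, kT=1.0):
    # Converts an energy into a probability using the partition function Z.
    return math.exp(-E_c / kT) / partition_function(energies, kT)

# Toy conformational ensemble: three states with energies in units of k_B*T.
energies = [0.0, 1.0, 2.0]
probs = [boltzmann_probability(E, energies) for E in energies]
```

The intractability of this sum over all conformations is exactly why the paper turns to graphical-model approximations such as belief propagation.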

Deterministic annealing for unsupervised texture segmentation [chapter]

Thomas Hofmann, Jan Puzicha, Joachim M. Buhmann
1997 Lecture Notes in Computer Science  
In this paper a rigorous mathematical framework of deterministic annealing and mean-field approximation is presented for a general class of partitioning, clustering and segmentation problems.  ...  More specifically, this results in a formulation of texture segmentation as a pairwise data clustering problem with a sparse neighborhood structure.  ...  The approximation accuracy can be expressed by the cross entropy I(Q, P^H), which is automatically minimized by minimizing F_T over Q ∈ Q_M, as F_T(Q) = T · I(Q, P^H) − T log Z_T for all Q ∈ Q_M.  ... 
doi:10.1007/3-540-62909-2_82 fatcat:sb6mjuptdjfddeklbal4toc6mq

Random Fields in Physics, Biology and Data Science

Enrique Hernández-Lemus
2021 Frontiers in Physics  
For strictly positive probability densities, a Markov random field is also a Gibbs field, i.e., a random field supplemented with a measure that implies the existence of a regular conditional distribution  ...  Markov random fields have been used in statistical physics, dating back as far as the Ehrenfests.  ...  efficient modeling strategies with such random fields.  ... 
doi:10.3389/fphy.2021.641859 fatcat:2bi74vqkureefmtzwinma2yiwq

Hierarchical, Unsupervised Learning with Growing via Phase Transitions

David Miller, Kenneth Rose
1996 Neural Computation  
Minimize the cross-entropy of equation 2.21 to obtain a new hierarchy.  ...  One outgrowth of our method is a probabilistic, batch generalization of the Perceptron algorithm and its connection with minimizing cross-entropy.  ... 
doi:10.1162/neco.1996.8.2.425 fatcat:u3jg7m32bbcefo2cvqgpihen6i

Enhancing the predictability and retrodictability of stochastic processes [article]

Nathaniel Rupprecht, Dervis Vural
2019 arXiv   pre-print
For the sake of concreteness we focus on inferring the future and past of Markov processes and illustrate our method on two classes of processes: diffusion on random spatial networks, and thermalizing  ...  In contrast, minimizing entropy produces two very different transition matrices, for λ 0, depending on the type of entropy we minimize.  ...  METHODS: Extremization of entropy. We started with a random geometric graph, T(λ = 0), from the ensemble described in the text, where nodes i and j are connected with probability e^{−β d(i,j)}.  ... 
arXiv:1810.06620v3 fatcat:igpt6qvvpvhcnea5rmiuv6ajbe
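The quoted graph construction (nodes i and j connected with probability e^{−β d(i,j)}) is simple to sketch; placing the nodes in the unit square and the parameter values are assumptions for illustration, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_geometric_graph(n=30, beta=2.0):
    # Drop n nodes uniformly in the unit square, then connect nodes i and j
    # with probability exp(-beta * d(i, j)), d being Euclidean distance.
    pos = rng.random((n, 2))
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(pos[i] - pos[j])
            if rng.random() < np.exp(-beta * d):
                adj[i, j] = adj[j, i] = True
    return pos, adj

pos, adj = soft_geometric_graph()
```

Larger β suppresses long-range edges, recovering a more strictly geometric graph; β → 0 approaches an Erdős–Rényi graph.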

A probabilistic framework for semi-supervised clustering

Sugato Basu, Mikhail Bilenko, Raymond J. Mooney
2004 Proceedings of the 2004 ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '04  
We present an algorithm that performs partitional semi-supervised clustering of data by minimizing an objective function derived from the posterior energy of the HMRF model.  ...  We propose a probabilistic model for semi-supervised clustering based on Hidden Markov Random Fields (HMRFs) that provides a principled framework for incorporating supervision into prototype-based clustering  ...  Hidden Markov Random Field: To incorporate pairwise constraints along with an underlying distortion measure between points into a unified probabilistic model, we consider Hidden Markov Random Fields (HMRFs  ... 
doi:10.1145/1014052.1014062 dblp:conf/kdd/BasuBM04 fatcat:nhm75znztffyfbszf4aszvv7ge
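A minimal sketch of such a constrained clustering objective: squared-distance distortion plus penalties for violated pairwise constraints. The penalty form and the uniform weight `w` are assumptions for illustration, not the paper's exact posterior energy:

```python
import numpy as np

def hmrf_objective(X, labels, centers, must_link, cannot_link, w=1.0):
    # Squared-distance distortion of each point to its assigned cluster
    # center, plus a penalty w for every violated must-link or
    # cannot-link constraint.
    distortion = sum(np.linalg.norm(X[i] - centers[labels[i]]) ** 2
                     for i in range(len(X)))
    ml = sum(w for i, j in must_link if labels[i] != labels[j])
    cl = sum(w for i, j in cannot_link if labels[i] == labels[j])
    return distortion + ml + cl

# Toy data: two tight points near one center, one point near the other.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
good = hmrf_objective(X, [0, 0, 1], centers, must_link=[(0, 1)],
                      cannot_link=[(0, 2)])
bad = hmrf_objective(X, [0, 1, 1], centers, must_link=[(0, 1)],
                     cannot_link=[(0, 2)])
```

An assignment that respects the constraints and the geometry (`good`) scores lower than one that splits the must-linked pair (`bad`).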

Entropy-Controlled Quadratic Markov Measure Field Models for Efficient Image Segmentation

Mariano Rivera, Omar Ocegueda, Jose L. Marroquin
2007 IEEE Transactions on Image Processing  
We present a new Markov Random Field (MRF) based model for parametric image segmentation.  ...  Prior information about segmentation smoothness and low entropy of the probability distribution maps is codified in the form of a MRF with quadratic potentials, so that the optimal estimator is obtained  ...  Of these, the Bayesian formulations are one of the most powerful and general, since they allow for the inclusion of spatial coherence constraints that regularize the solution (via Markov Random Field (  ... 
doi:10.1109/tip.2007.909384 pmid:18092602 fatcat:6hl7nzjvsbdhdh7ch7m2455gum

Speech Recognition Using Augmented Conditional Random Fields

Y. Hifny, S. Renals
2009 IEEE Transactions on Audio, Speech, and Language Processing  
Index Terms-Augmented conditional random fields (ACRFs), augmented spaces, discriminative compression, hidden Markov models (HMMs).  ...  In this paper, a new acoustic modeling paradigm based on augmented conditional random fields (ACRFs) is investigated and developed.  ... 
doi:10.1109/tasl.2008.2010286 fatcat:ef3wg5x35fbslnjur723ydwk7e

Information field theory [article]

Torsten Enßlin
2013 arXiv   pre-print
Non-linear image reconstruction and signal analysis deal with complex inverse problems.  ...  To tackle such problems in a systematic way, I present information field theory (IFT) as a means of Bayesian, data based inference on spatially distributed signal fields.  ...  It is interesting to note that this minimal Gibbs free energy is equivalent to a minimal Kullback-Leibler distance of G(s − m, D) to P(s|d), or to Maximum Entropy for G(s − m, D) with P(s|d) as the prior  ... 
arXiv:1301.2556v1 fatcat:dobxw2p5dngwvmesbcbzxghqwy

Variational Bayesian inversion (VBI) of quasi-localized seismic attributes for the spatial distribution of geological facies

Muhammad Atif Nawaz, Andrew Curtis
2018 Geophysical Journal International  
We show in a noisy synthetic example that the new method recovered the coefficients of the spatial filter with reasonable accuracy, and recovered the correct facies distribution.  ...  (e.g., by using Markov chain Monte Carlo, McMC) is the most commonly used approximate inference method but it is computationally expensive and detection of its convergence is often subjective and unreliable  ...  Model: We use a so-called hidden Markov random field (HMRF) as the underlying graph behind our method. This defines a Markov random field (MRF) over latent (or unobserved) variables.  ... 
doi:10.1093/gji/ggy163 fatcat:k3n3tsqq6nalfjfkoxbhe5f5n4

Relative Entropy Gradient Sampler for Unnormalized Distributions [article]

Xingdong Feng, Yuan Gao, Jian Huang, Yuling Jiao, Xu Liu
2021 arXiv   pre-print
To determine the nonlinear transforms at each iteration, we consider the Wasserstein gradient flow of relative entropy.  ...  It is characterized by an ODE system with velocity fields depending on the ratio between the density of the evolving particles and the unnormalized target density.  ...  These datasets are partitioned randomly into two parts, the training sets (80%) and the test sets (20%). We repeat the random partition 10 times.  ... 
arXiv:2110.02787v1 fatcat:5iorjxq2jrd4xjdzkv77uibizm

Inverse Ising inference with correlated samples

Benedikt Obermayer, Erel Levine
2014 New Journal of Physics  
carry over to the frequently used mean-field approach to the inverse Ising problem.  ...  Such correlations could arise due to phylogeny but also via other slow dynamical processes.  ...  Specifically, our unified framework minimizes the cross-entropy S = −(1/M) Σ_m ln P(x^(m) | h, J) (1) of the entire alignment with respect to the unknown parameter sets h and J, where the fields h cause deviations  ... 
doi:10.1088/1367-2630/16/12/123017 fatcat:2ks7gqiajfgvvcipwkardlwftu
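The cross-entropy in equation (1), read as the negative average log-likelihood of the alignment given fields h and couplings J, can be evaluated exactly for very small systems by brute-force enumeration. This is a sketch under an assumed Ising parameterization (symmetric J with zero diagonal, spins in {−1, +1}), not the paper's mean-field estimator:

```python
import itertools
import numpy as np

def ising_cross_entropy(samples, h, J):
    # Negative average log-likelihood of the samples under the Ising model
    # P(x) ∝ exp(h·x + (1/2) x·J·x).  Exact only for small n, since the
    # partition function is summed over all 2^n spin configurations.
    n = len(h)
    def unnorm_logp(x):
        x = np.asarray(x, dtype=float)
        return h @ x + 0.5 * x @ J @ x
    logZ = np.log(sum(np.exp(unnorm_logp(x))
                      for x in itertools.product([-1.0, 1.0], repeat=n)))
    return -np.mean([unnorm_logp(x) - logZ for x in samples])

# With zero fields and couplings the model is uniform over 2^n states,
# so the cross-entropy of any sample set equals n * ln 2.
h = np.zeros(3)
J = np.zeros((3, 3))
S = ising_cross_entropy([[1, 1, -1], [-1, -1, -1]], h, J)
```

The exponential cost of the exact partition function is what motivates the mean-field approach the snippet refers to.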

Simulated Annealing [chapter]

Alexander G. Nikolaev, Sheldon H. Jacobson
2010 International Series in Operations Research and Management Science  
test problems to reach optimality (i.e., the minimal mean final energy).  ...  content is being minimized.  ... 
doi:10.1007/978-1-4419-1665-5_1 fatcat:idntn7ghpnc6div67wrwchizru