
Metropolis-Hastings Algorithms for Estimating Betweenness Centrality in Large Networks [article]

Mostafa Haghir Chehreghani, Talel Abdessalem, Albert Bifet
2017 arXiv   pre-print
In this paper, given a network G and a vertex r ∈ V(G), we first propose a Metropolis-Hastings MCMC algorithm that samples from the space V(G) and estimates the betweenness score of r.  ...  The stationary distribution of our MCMC sampler is the optimal sampling distribution proposed for betweenness centrality estimation.  ...  Although there exist polynomial time and space algorithms for betweenness centrality estimation, these algorithms are expensive in practice.  ... 
arXiv:1704.07351v3 fatcat:hydhecvhdnhttkp3bmofalewxq
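
The abstract above describes a Metropolis-Hastings walk over the vertex set V(G). As a rough illustration only, the sketch below (with a hypothetical `mh_vertex_walk` helper and a plain adjacency-dict graph) shows a generic MH random walk on vertices whose stationary distribution is proportional to an arbitrary score function; it does not reproduce the paper's betweenness-specific target or estimator.

```python
import random

def mh_vertex_walk(adj, score, start, n_steps):
    """Metropolis-Hastings random walk over the vertex set of a graph.

    adj   : dict mapping each vertex to a list of its neighbours
    score : dict giving an unnormalised target weight for each vertex
    Proposals are uniform over the current vertex's neighbours, so the
    acceptance ratio corrects both for the target weights and for the
    proposal asymmetry caused by differing degrees.
    """
    v = start
    visits = []
    for _ in range(n_steps):
        w = random.choice(adj[v])                       # propose a neighbour
        # q(w|v) = 1/deg(v) and q(v|w) = 1/deg(w)
        ratio = (score[w] * len(adj[v])) / (score[v] * len(adj[w]))
        if random.random() < min(1.0, ratio):
            v = w                                       # accept the move
        visits.append(v)
    return visits

# Example: a 4-cycle where vertex 0 should be visited twice as often.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
score = {0: 2.0, 1: 1.0, 2: 1.0, 3: 1.0}
chain = mh_vertex_walk(adj, score, start=0, n_steps=10_000)
```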

Distributed Aggregate Function Estimation by Biphasically Configured Metropolis-Hasting Weight Model

M. Kenyeres, J. Kenyeres, V. Skorpil, R. Burget
2017 Radioengineering  
In this paper, we introduce an optimized weight model for the average consensus algorithm.  ...  It is called the Biphasically configured Metropolis-Hasting weight model and is based on a modification of the Metropolis-Hasting weight model by rephrasing the initial configuration into two parts.  ...  For this research, the infrastructure of the SIX Center was used.  ... 
doi:10.13164/re.2017.0479 fatcat:ggy6scgiabhuzjwwbe7kxfdaam
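
The paper modifies the Metropolis-Hasting(s) weight model for average consensus. The sketch below shows only the standard, single-phase version of that weight model, with weights w_ij = 1/(1 + max(d_i, d_j)); the biphasic configuration proposed in the paper is not reproduced, and the function name and adjacency representation are illustrative.

```python
import numpy as np

def metropolis_hastings_weights(adj):
    """Standard (single-phase) Metropolis-Hastings weight matrix for
    average consensus on an undirected graph.

    adj : dict mapping each node to the list of its neighbours.
    w_ij = 1 / (1 + max(d_i, d_j)) for neighbouring i != j,
    w_ii = 1 - sum_{j != i} w_ij, and zero elsewhere.  The resulting
    matrix is symmetric and doubly stochastic, so the iteration
    x <- W @ x drives every entry of x toward the average of the
    initial node values.
    """
    nodes = sorted(adj)
    index = {v: k for k, v in enumerate(nodes)}
    W = np.zeros((len(nodes), len(nodes)))
    for i in nodes:
        for j in adj[i]:
            W[index[i], index[j]] = 1.0 / (1 + max(len(adj[i]), len(adj[j])))
        W[index[i], index[i]] = 1.0 - W[index[i]].sum()
    return W
```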

Bayesian parameter estimation for a jet-milling model using Metropolis–Hastings and Wang–Landau sampling

Catharine A. Kastner, Andreas Braumann, Peter L.W. Man, Sebastian Mosbach, George P.E. Brownbridge, Jethro Akroyd, Markus Kraft, Chrismono Himawan
2013 Chemical Engineering Science  
Bayesian parameter estimates for a computationally expensive multi-response jet-milling model are computed using the Metropolis-Hastings and Wang-Landau Markov Chain Monte Carlo sampling algorithms.  ...  Comparison of the autocorrelation function between samples generated by the two algorithms shows that the Wang-Landau algorithm exhibits more rapid decay.  ...  It should be noted that it is not appropriate to make a direct comparison between the two algorithms based on these plots, as the kernel density estimation for Metropolis-Hastings is based on more points  ... 
doi:10.1016/j.ces.2012.11.027 fatcat:5hur4nih6jchxicyskmukifa4m
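
The comparison quoted above rests on the sample autocorrelation function of the two chains. As a generic illustration, and not the authors' code, the sketch below computes an empirical autocorrelation curve for a one-dimensional chain; faster decay toward zero indicates better mixing.

```python
import numpy as np

def autocorrelation(chain, max_lag):
    """Empirical autocorrelation of a one-dimensional MCMC chain.

    Returns the autocorrelation at lags 0..max_lag, computed from the
    centred chain; requires len(chain) > max_lag.
    """
    x = np.asarray(chain, dtype=float)
    x = x - x.mean()
    var = x.var()
    acf = [1.0]
    for k in range(1, max_lag + 1):
        acf.append(float(np.mean(x[:-k] * x[k:]) / var))
    return np.array(acf)
```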

Slice sampler algorithm for generalized Pareto distribution

Mohammad Rostami, Mohd Bakri Adam Yahya, Mohamed Hisham Yahya, Noor Akma Ibrahim
2017 Hacettepe Journal of Mathematics and Statistics  
The results were compared with another commonly used Markov chain Monte Carlo (MCMC) technique called Metropolis-Hastings algorithm.  ...  Moreover, the slice sampler algorithm presents a higher level of stationarity in terms of the scale and shape parameters compared with the Metropolis-Hastings algorithm.  ...  Mira and Tierney [32] prove that the slice sampler algorithm performs better than the corresponding independence Metropolis-Hastings algorithm in terms of asymptotic variance in the central limit theorem  ... 
doi:10.15672/hjms.2017.441 fatcat:jot2gu2onnej7ijq36cctkeuki
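
For readers unfamiliar with the sampler being compared here, the sketch below is a generic univariate slice sampler with stepping-out and shrinkage, in the style of Neal (2003); it is not the authors' implementation and is not specialised to the generalized Pareto distribution.

```python
import math
import random

def slice_sample(logpdf, x0, n_samples, w=1.0, m=50):
    """Univariate slice sampler with stepping-out and shrinkage.

    logpdf : log of the (possibly unnormalised) target density
    x0     : starting point, w : initial bracket width, m : step-out limit.
    """
    x, samples = x0, []
    for _ in range(n_samples):
        # Draw the auxiliary slice level under the density at x.
        log_y = logpdf(x) + math.log(random.random())
        # Step out to find an interval [left, right] that covers the slice.
        left = x - w * random.random()
        right = left + w
        j = int(m * random.random())
        k = m - 1 - j
        while j > 0 and logpdf(left) > log_y:
            left -= w
            j -= 1
        while k > 0 and logpdf(right) > log_y:
            right += w
            k -= 1
        # Shrink the interval until a point inside the slice is drawn.
        while True:
            x_new = left + (right - left) * random.random()
            if logpdf(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples
```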

Page 1258 of Mathematical Reviews Vol., Issue 2003B [page]

2003 Mathematical Reviews  
Hastings-Metropolis algorithm with an adaptive proposal.  ...  Summary: "The Hastings-Metropolis algorithm is a general MCMC method for sampling from a density known up to a constant.  ... 

Implementations of the Monte Carlo EM Algorithm

Richard A Levine, George Casella
2001 Journal of Computational And Graphical Statistics  
The most flexible and generally applicable approach to obtaining a Monte Carlo sample in each iteration of an MCEM algorithm is through Markov chain Monte Carlo (MCMC) routines such as the Gibbs and Metropolis-Hastings  ...  In particular, we apply an automated rule for increasing the Monte Carlo sample size when the Monte Carlo error overwhelms the EM estimate at any given iteration.  ...  Each iteration of the sample u^(r)_1, ..., u^(r)_m is generated from a Metropolis-Hastings algorithm with an N(0, σ²) candidate distribution.  ... 
doi:10.1198/106186001317115045 fatcat:ra6cegbw5ndebpb52l4bubzjgy
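
The snippet mentions drawing u^(r)_1, ..., u^(r)_m with a Metropolis-Hastings kernel using an N(0, σ²) candidate. The sketch below shows such a random-walk kernel in isolation, with illustrative names; the surrounding MCEM loop and the automated rule for growing m are only indicated in a comment, not implemented.

```python
import math
import random

def rw_metropolis(logpdf, x0, m, sigma):
    """Random-walk Metropolis-Hastings with a N(0, sigma^2) candidate.

    Draws m dependent samples from the density whose log is logpdf,
    starting at x0; candidates are the current point plus Gaussian noise.
    """
    x, lp = x0, logpdf(x0)
    draws = []
    for _ in range(m):
        y = x + random.gauss(0.0, sigma)          # symmetric candidate
        ly = logpdf(y)
        if math.log(random.random()) < ly - lp:   # MH acceptance step
            x, lp = y, ly
        draws.append(x)
    return draws

# In an MCEM loop one would redraw m samples like this at every EM
# iteration and, in the spirit of the paper, increase m whenever the
# Monte Carlo error swamps the change in the EM estimate.
```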

Adaptive independent Metropolis–Hastings

Lars Holden, Ragnar Hauge, Marit Holden
2009 The Annals of Applied Probability  
It is an extension of the independent Metropolis–Hastings algorithm.  ...  We propose an adaptive independent Metropolis–Hastings algorithm with the ability to learn from all previous proposals in the chain except the current location.  ...  The authors thank Arnoldo Frigessi, Håkon Tjelmeland and Øivind Skare for valuable comments and contributions.  ... 
doi:10.1214/08-aap545 fatcat:65qltvx7dfacdb3sl4edbfexua
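
To make the idea of an independence proposal that learns from past proposals concrete, the sketch below fits a one-dimensional Gaussian proposal to the history of proposed points. It is only a caricature under stated assumptions (Gaussian proposal family, illustrative parameter names), and it omits the adaptation scheme and ergodicity safeguards developed in the paper.

```python
import numpy as np

def adaptive_independence_mh(logpdf, n_samples, mu0=0.0, sigma0=5.0,
                             adapt_after=100, seed=0):
    """Independence Metropolis-Hastings with a Gaussian proposal that is
    periodically refit to the history of proposed points."""
    rng = np.random.default_rng(seed)
    mu, sigma = mu0, sigma0
    x = mu0
    proposals, chain = [], []

    def logq(z, mu, sigma):
        # Gaussian log-density up to an additive constant.
        return -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma)

    for i in range(n_samples):
        y = rng.normal(mu, sigma)                 # proposal independent of x
        proposals.append(y)
        log_alpha = (logpdf(y) - logpdf(x)) + (logq(x, mu, sigma) - logq(y, mu, sigma))
        if np.log(rng.random()) < log_alpha:
            x = y
        chain.append(x)
        if i >= adapt_after:                      # learn from past proposals
            mu = float(np.mean(proposals))
            sigma = float(np.std(proposals)) + 1e-6
    return np.array(chain)
```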

Variance Reduction for Metropolis-Hastings Samplers [article]

Angelos Alexopoulos, Petros Dellaportas, Michalis K. Titsias
2022 arXiv   pre-print
We introduce a general framework that constructs estimators with reduced variance for random walk Metropolis and Metropolis-adjusted Langevin algorithms.  ...  The resulting estimators require negligible computational cost and are derived in a post-process manner utilising all proposal values of the Metropolis algorithms.  ...  by a Metropolis-Hastings kernel invariant to π.  ... 
arXiv:2203.02268v1 fatcat:dce24u7s5zdw3jdmy6zgbr35iq
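
The paper derives post-process estimators that reuse every proposal value. The sketch below shows a simpler, classical "waste-recycling" average for a random-walk Metropolis chain, in which each proposal contributes with weight equal to its acceptance probability; it only illustrates the reuse of proposals and is not the paper's variance-reduction method.

```python
import math
import random

def waste_recycling_rwm(logpdf, f, x0, n_steps, sigma=1.0):
    """Random-walk Metropolis with a 'waste-recycling' estimate of E[f(X)].

    Every proposal y_n contributes to the average with weight equal to its
    acceptance probability alpha_n, and the current state contributes the
    remaining weight:
        (1/N) * sum_n [ alpha_n * f(y_n) + (1 - alpha_n) * f(x_n) ].
    Returns both the plain ergodic average and the recycled average.
    """
    x, lp = x0, logpdf(x0)
    plain = recycled = 0.0
    for _ in range(n_steps):
        y = x + random.gauss(0.0, sigma)
        ly = logpdf(y)
        alpha = 1.0 if ly >= lp else math.exp(ly - lp)
        recycled += alpha * f(y) + (1.0 - alpha) * f(x)
        if random.random() < alpha:
            x, lp = y, ly
        plain += f(x)
    return plain / n_steps, recycled / n_steps
```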

Markov Chain Monte Carlo Methods, a survey with some frequent misunderstandings [article]

Christian P. Robert, Wu Changye (U Paris Dauphine)
2020 arXiv   pre-print
These obviously depend on the connection between p and q. For instance, the Metropolis-Hastings algorithm is an i.i.d. sampler when q(·|X_n) = p(·), a choice that is rarely available.  ...  "What is the difference between Metropolis-Hastings, Gibbs, Importance, and Rejection sampling?"  ... 
arXiv:2001.06249v1 fatcat:sop4cjdnirdttblrf45owobwt4
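
The survey's remark that Metropolis-Hastings becomes an i.i.d. sampler when q(·|X_n) = p(·) follows directly from the independence-sampler acceptance ratio p(y)q(x)/(p(x)q(y)), which equals 1 when q = p. A minimal sketch, with illustrative argument names:

```python
import math
import random

def independence_mh(logp, logq, sample_q, x0, n):
    """Independence Metropolis-Hastings sampler.

    Acceptance uses the ratio p(y)q(x) / (p(x)q(y)); when q coincides
    with p this ratio is identically 1, every proposal is accepted, and
    the chain is simply an i.i.d. sample from p.
    """
    x, out = x0, []
    for _ in range(n):
        y = sample_q()
        log_alpha = (logp(y) - logp(x)) + (logq(x) - logq(y))
        if math.log(random.random()) < log_alpha:
            x = y
        out.append(x)
    return out
```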

Gradient-free MCMC methods for dynamic causal modelling

Biswa Sengupta, Karl J. Friston, Will D. Penny
2015 NeuroImage  
For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient than the random-walk Metropolis sampler or slice sampling; yet adaptive  ...  Slice sampling yields the highest number of independent samples from the target density - albeit at an almost 1000% increase in computational time, in comparison to the most efficient algorithm (i.e., the  ...  For this, we implemented three variants of the Metropolis-Hastings algorithm, along with a slice sampling algorithm.  ... 
doi:10.1016/j.neuroimage.2015.03.008 pmid:25776212 pmcid:PMC4410946 fatcat:3fkzet526fhpnasyfsqtfhlnee

BEST: Bayesian estimation of species trees under the coalescent model

L. Liu
2008 Bioinformatics  
It provides a new option for estimating species phylogenies within the popular Bayesian phylogenetic program MrBayes.  ...  The technique of simulated annealing is adopted along with Metropolis coupling as performed in MrBayes to improve the convergence rate of the Markov Chain Monte Carlo algorithm.  ...  I am indebted to Patricia Brito for writing the manual for BEST1.6 and BEST1.7.  ... 
doi:10.1093/bioinformatics/btn484 pmid:18799483 fatcat:5c64tv4jdreqvi2goweiqkrebi
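
Metropolis coupling, as used in MrBayes, runs several chains at different temperatures and occasionally proposes to swap their states. The sketch below shows one generic swap move for tempered targets p(x)^β, with illustrative names; it is not BEST's sampler and omits the simulated-annealing component mentioned above.

```python
import math
import random

def mc3_swap(log_post, states, betas):
    """One state-swap move between two tempered chains (Metropolis coupling).

    Chain k targets the tempered posterior p(x)^betas[k].  A swap between
    chains i and j is accepted with probability
        min(1, exp((betas[i] - betas[j]) * (log p(x_j) - log p(x_i)))).
    The swap is performed in place on the list of chain states.
    """
    i, j = random.sample(range(len(states)), 2)
    log_alpha = (betas[i] - betas[j]) * (log_post(states[j]) - log_post(states[i]))
    if math.log(random.random()) < log_alpha:
        states[i], states[j] = states[j], states[i]
    return states
```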

A Common Derivation for Markov Chain Monte Carlo Algorithms with Tractable and Intractable Targets [article]

Khoa T. Tran
2018 arXiv   pre-print
A generalised version of the Metropolis-Hastings algorithm is constructed with a random number generator and a self-reverse mapping.  ...  A common derivation for many Markov chain Monte Carlo algorithms is useful in drawing connections and comparisons between these algorithms.  ...  Generalised Metropolis-Hastings Algorithm 1.  ... 
arXiv:1607.01985v5 fatcat:iyijj25s2jfz7pvqlmip5acdve

Distinct value estimation on peer-to-peer networks

Zubin Joseph, Gautam Das, Leonidas Fegaras
2008 Proceedings of the 1st ACM international conference on PErvasive Technologies Related to Assistive Environments - PETRA '08  
In this paper, we present a technique to obtain estimates of the number of distinct values matching a query on the network.  ...  However, the sheer scale of these networks has made it difficult to gather statistics that could be used for building new features.  ...  The authors in [33] also suggest possible modifications to the Metropolis-Hastings algorithm for attaining the required node sampling distributions.  ... 
doi:10.1145/1389586.1389617 dblp:conf/petra/JosephDF08 fatcat:ybzpdbh425dkrkrvyh2m5qwszi
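
The kind of modification alluded to in [33] is commonly a degree-corrected Metropolis-Hastings walk that makes the stationary distribution uniform over peers. The sketch below shows that standard walk, using only each node's local peer list; the names are illustrative, and the paper's distinct-value estimator itself is not reproduced.

```python
import random

def uniform_node_sample_walk(neighbors, start, n_steps):
    """Metropolis-Hastings random walk over a peer-to-peer overlay whose
    stationary distribution is uniform over nodes.

    neighbors(v) returns the peer list of node v, so only local knowledge
    is required.  A move from v to a uniformly chosen peer w is accepted
    with probability min(1, deg(v) / deg(w)), which removes the bias of a
    plain random walk toward high-degree peers.
    """
    v = start
    for _ in range(n_steps):
        peers = neighbors(v)
        w = random.choice(peers)
        if random.random() < min(1.0, len(peers) / len(neighbors(w))):
            v = w
    return v
```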

Optimal Channel Choice for Collaborative Ad-Hoc Dissemination

Liang Hu, Jean-Yves Le Boudec, Milan Vojnović
2010 2010 Proceedings IEEE INFOCOM  
Furthermore, we show that the optimum social welfare can be approximated by a decentralized algorithm based on Metropolis-Hastings sampling and give a variant that also accounts for the battery energy.  ...  We show that maximizing system welfare is equivalent to an assignment problem whose solution can be obtained by a centralized greedy algorithm.  ...  We show the results for the Metropolis-Hastings with f assumed to be either known or locally estimated by individual nodes.  ... 
doi:10.1109/infcom.2010.5462163 dblp:conf/infocom/HuBV10 fatcat:3s255kdg5zfw7ha3uf6bblcx2y

Investigation of model based beamforming and Bayesian inversion signal processing methods for seismic localization of underground sources

Geok Lian Oh, Jonas Brunskog
2014 Journal of the Acoustical Society of America  
the Metropolis-Hastings algorithm.  ...  MCMC Metropolis-Hastings Algorithm: The Metropolis-Hastings algorithm [13] [14] [15] is an MCMC method.  ...  The second soil model defines a horizontally stratified 5-layer soil structure, and the P- and S-wave speeds and soil density values for each layer are described in Table 3. Figure 4-1 Description  ... 
doi:10.1121/1.4884765 pmid:25096105 fatcat:qks5kngssndwxf3bqhi7tqnvji
Showing results 1 — 15 out of 9,362 results