The Pitman-Yor process, a generalization of the Dirichlet process, provides a tractable prior distribution over the space of countably infinite discrete distributions, and has found major applications in Bayesian ... We consider the problem of estimating Shannon's entropy H from discrete data, in cases where the number of possible symbols is unknown or even countably infinite. ... Shlens for retinal data, and Y. W. Teh and A. Cerquetti for helpful comments on the manuscript. ... arXiv:1302.0328v3
Our approach follows that of Nemenman et al., who formulated a Bayesian entropy estimator using a mixture-of-Dirichlet prior over the space of discrete distributions on K bins. ... For the first estimator, we design a novel mixture prior over countable distributions using the Pitman-Yor (PY) process. ... doi:10.1186/1471-2202-14-s1-p316 pmcid:PMC3704886
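The Pitman-Yor priors recurring in these results can be simulated by stick-breaking: atom k receives a fraction V_k ~ Beta(1 − d, α + k·d) of the remaining mass. A minimal sketch, assuming a finite truncation (the function names, parameters, and truncation level below are illustrative, not taken from any of the cited papers):

```python
import math
import random

def py_stick_breaking(d, alpha, trunc=1000, seed=0):
    """Approximate draw from a Pitman-Yor(d, alpha) process via
    stick-breaking, truncated to `trunc` atoms (illustrative sketch)."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for k in range(1, trunc + 1):
        # Break off a Beta(1 - d, alpha + k*d) fraction of the remaining stick.
        v = rng.betavariate(1.0 - d, alpha + k * d)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

def entropy_nats(weights):
    """Shannon entropy H = -sum w log w of the sampled weights, in nats."""
    return -sum(w * math.log(w) for w in weights if w > 0)

w = py_stick_breaking(d=0.5, alpha=1.0)
print(f"truncated mass: {sum(w):.4f}  entropy (nats): {entropy_nats(w):.3f}")
```

With discount d > 0 the weights decay as a power law, which is why the PY prior is a natural model for heavy-tailed, countably infinite alphabets.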
Existence, uniqueness and characterisation theorems are presented for the maxentropic distribution for the case of countable support under some assumptions. ... for models for discrete data defined on the cylinder and lattice. ...
We prove corresponding results for stochastic model selection, for both discrete and continuous observation spaces. ... We introduce the entropy potential of a hypothesis class as its worst-case entropy with respect to the true distribution. ... Acknowledgments Thanks to Marcus Hutter, Thomas Zeugmann, and the anonymous reviewers for their valuable comments. This work was supported by the JSPS 21st Century COE Program C01. ... doi:10.1016/j.tcs.2008.06.038
(English and French summaries) [On entropy estimation for distributions with countable support] C. R. Math. Acad. Sci. Paris 335 (2002), no. 9, 763-766. ... Let X_1, X_2, ... be a sequence of discrete i.i.d. random variables with finite entropy. ...
Annals of Statistics
We consider an extension of ϵ-entropy to a KL-divergence based complexity measure for randomized density estimation methods. ... In particular, we show that this technique can lead to improvements of some classical results concerning the convergence of minimum description length and Bayesian posterior distributions. ... The concept of ϵ-entropy can be regarded as a notion to measure the complexity of an explicit discretization, usually for a deterministic estimator on a discrete net. ... doi:10.1214/009053606000000704
Gull, Bayesian inductive inference and maximum entropy (pp. 53-74); G. Larry Bretthorst, Excerpts from Bayesian spectrum analysis and parameter estimation (pp. 75-145); E. T. ... Fougere, Maximum entropy calculations on a discrete probability space (pp. 205-234); R. Blankenbecler and M. H. Partovi, Quantum density matrix and entropic uncertainty (pp. 235-244); A. J. M. ...
Exchangeable random partition processes are the basis for Bayesian approaches to statistical inference in large alphabet settings. ... This provides an understanding of these estimators in the setting of unknown discrete alphabets from the perspective of universal compression. ... Acknowledgments The authors thank the American Institute of Mathematics and NSF for sponsoring a workshop on probability estimation, as well as A. Orlitsky and K. ... doi:10.3390/e16105339
entropy prior to incorporate limited auditor information. ... for θ. ... The authors are grateful to two reviewers for their valuable comments that improved the manuscript. Conflicts of Interest: The authors declare no conflict of interest. ... doi:10.3390/e20120919 pmid:33266643
Lecture Notes in Computer Science
to non-stationary processes, using an online Bayesian spectral estimation method to compute the Bayesian surprise. ... Information measures for stationary random processes For an infinite stationary discrete-time random process (X_t)_{t∈Z}, the predictive information rate (PIR), as defined in , is a global measure of ... AR estimation and Bayesian surprise Our method for spectral estimation is based on Kitagawa and Gersch's 'spectral smoothness prior': they consider autoregressive Gaussian processes and introduce ... doi:10.1007/978-3-642-40020-9_72
2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
Exchangeable random partition processes are the basis for Bayesian approaches to statistical inference in large alphabet settings. ... This provides an understanding of these estimators in the setting of unknown discrete alphabets from the perspective of universal compression. ... Acknowledgments The authors thank the American Institute of Mathematics and NSF for sponsoring a workshop on probability estimation, as well as A. Orlitsky and K. ... doi:10.1109/allerton.2010.5707041
Here we present a fully general Bayesian nonparametric estimation of the whole class of Tsallis diversity indices under Gnedin-Pitman priors, a large family of random discrete distributions recently deeply ... Bayesian nonparametric estimation of Shannon entropy and Simpson's diversity under uniform and symmetric Dirichlet priors has already been advocated as an alternative to maximum likelihood estimation based ... Acknowledgement The authors wish to thank Leopoldo Catania for his kind assistance in the development of the R code used in the paper, Mauro Bernardi for providing the R function to obtain highest posterior ... arXiv:1404.3441v2
We examine the recently introduced NSB estimator of entropies of severely undersampled discrete variables and devise a procedure for calculating the involved integrals. ... Thus one can estimate entropies with no a priori assumptions about these cardinalities, and a closed form solution for such estimates is given. ... Conclusions We have calculated various asymptotic properties of the NSB estimator for estimation of entropies of discrete random variables. ... doi:10.3390/e13122013
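The NSB integrals are beyond a short sketch, but the undersampling bias that motivates such Bayesian estimators is easy to exhibit with the naive plug-in estimate and the classical Miller-Madow correction. A minimal sketch (the data below are synthetic, for illustration only):

```python
import math
import random
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood ('plug-in') entropy estimate in nats."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def miller_madow(samples):
    """Plug-in estimate plus the first-order (K_observed - 1)/(2n) bias correction."""
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n)

rng = random.Random(1)
true_h = math.log(100)                          # uniform over 100 symbols
data = [rng.randrange(100) for _ in range(50)]  # severely undersampled: n < K
print(f"plug-in: {plugin_entropy(data):.2f}  "
      f"Miller-Madow: {miller_madow(data):.2f}  true: {true_h:.2f}")
```

Both estimates fall short of the true entropy in this regime; NSB-style priors over the unknown cardinality are one principled way to close the remaining gap.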
Lecture Notes in Computer Science
We present a general information exponential inequality that measures the statistical complexity of some deterministic and randomized density estimators. ... Therefore bounds obtained for (1) can also be applied to Bayesian posterior distributions. ... measures, then the estimator leads to the Bayesian posterior distribution with λ = 1 (see ). ... doi:10.1007/978-3-540-27819-1_22
Discretizing the results of MMD mapping using a small dictionary, and estimating the entropy of the resulting, we end up associating each detection with a discrete scalar that has only a limited number ... bearing can then be estimated as: h = −∑_{i=1}^{N} p(i) log(p(i)). (2.6) Similarly the entropy at constant range can then be estimated as: h = −∑_{i=1}^{N} p(i) log(p(i)). (2.7) Finally, the entropy difference, which ... doi:10.5281/zenodo.43744
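The estimates in Eqs. (2.6)-(2.7) are plain Shannon entropies of the quantized detections, and their difference follows directly. A minimal sketch (the detection values below are made up for illustration):

```python
import math
from collections import Counter

def discrete_entropy(labels):
    """Shannon entropy h = -sum_i p(i) log p(i) of a sequence of
    discrete scalars, with p(i) taken as the empirical frequency."""
    n = len(labels)
    probs = [c / n for c in Counter(labels).values()]
    return -sum(p * math.log(p) for p in probs)

# Hypothetical detections quantized with a small dictionary
bearing_cells = [0, 1, 1, 2, 2, 2, 3]   # cells at constant bearing
range_cells = [0, 0, 0, 0, 1, 1, 1]     # cells at constant range
delta_h = discrete_entropy(bearing_cells) - discrete_entropy(range_cells)
print(f"entropy difference: {delta_h:.3f} nats")
```

Because only a limited number of dictionary symbols occur, the empirical frequencies are well estimated from few detections, which is what makes the discretization step useful.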