79,193 Hits in 4.3 sec

Computing Accurate Probabilistic Estimates of One-D Entropy from Equiprobable Random Samples [article]

Hoshin V Gupta, Mohammed Reza Ehsani, Tirthankar Roy, Maria A Sans-Fuentes, Uwe Ehret, Ali Behrangi
2021 arXiv   pre-print
We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimensional entropy from equiprobable random samples, and compare it with the popular Bin-Counting (BC) method  ...  In contrast to BC, which uses equal-width bins with varying probability mass, the QS method uses estimates of the quantiles that divide the support of the data generating probability density function (  ...  The QS and BC algorithms used in this work are freely accessible for non-commercial use at https://github.com/rehsani/Entropy accessed on 20 February 2021.  ... 
arXiv:2102.12675v1 fatcat:kztvvxdqorhctolpzzt3kfl4ke
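The contrast the abstract draws (equal-width bins carrying varying mass vs. equal-mass quantile bins with varying width) can be sketched in a few lines. The following is a minimal illustration of that idea, not the authors' released implementation at the GitHub link above; the bin and quantile counts are arbitrary choices.

```python
import numpy as np

def entropy_bin_counting(samples, n_bins=20):
    """Bin-Counting (BC): equal-width bins carrying varying probability mass."""
    counts, edges = np.histogram(samples, bins=n_bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    nz = p > 0                                   # skip empty bins
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

def entropy_quantile_spacing(samples, n_bins=20):
    """Quantile Spacing (QS): equal-mass bins with varying width."""
    q = np.quantile(samples, np.linspace(0.0, 1.0, n_bins + 1))
    widths = np.diff(q)
    p = 1.0 / n_bins                             # each quantile bin holds mass 1/n_bins
    return np.sum(p * np.log(widths / p))        # = -sum p*log(p/width)

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)                      # true differential entropy ≈ 1.4189 nats
```

With enough samples both estimates should land near the true differential entropy of N(0,1), about 1.4189 nats, with the tail bins driving most of the disagreement between the two.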

A Weighted Generalized Maximum Entropy Estimator with a Data-driven Weight

Ximing Wu
2009 Entropy  
Monte Carlo simulations demonstrate that the proposed W-GME estimator is comparable to and often outperforms the conventional GME estimator, which places equal weights on the entropies of coefficient and  ...  It uses an objective function that is the sum of the entropies for coefficient distributions and disturbance distributions.  ...  Reference [4] proposed using the entropy concept in choosing the unknown distribution of probabilities.  ... 
doi:10.3390/e11040917 fatcat:qw76dlywbnhpll2bxe6mw7qjrq
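The snippet's reference to "using the entropy concept in choosing the unknown distribution of probabilities" is the classic maximum entropy principle. A minimal sketch of it, far simpler than the W-GME estimator itself: among all distributions on given support points satisfying a mean constraint, the maximum-entropy one is exponentially tilted, and the tilt parameter can be found by bisection. The support points and constraint below are illustrative.

```python
import numpy as np

def maxent_probs(support, mean_constraint, tol=1e-10):
    """Maximum-entropy distribution on fixed support points subject to
    E[Z] = m. The solution has the form p_i ∝ exp(lam * z_i); we find lam
    by bisection, since the tilted mean is increasing in lam. The
    constraint must lie strictly inside the support's range."""
    z = np.asarray(support, dtype=float)

    def mean_for(lam):
        a = lam * z
        w = np.exp(a - a.max())          # shift exponents for stability
        p = w / w.sum()
        return p @ z, p

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        m, _ = mean_for(mid)
        if m < mean_constraint:
            lo = mid
        else:
            hi = mid
    return mean_for(0.5 * (lo + hi))[1]

p_uniform = maxent_probs([-1.0, 0.0, 1.0], 0.0)  # symmetric constraint -> uniform
```

A non-symmetric constraint tilts the distribution: on support {0, 1} with mean 0.3, the solution simply puts probability 0.3 on 1.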

Coverage-adjusted entropy estimation

Vincent Q. Vu, Bin Yu, Robert E. Kass
2007 Statistics in Medicine  
These formulations involve the fundamental and generally difficult statistical problem of estimating entropy.  ...  The results show that, with a minor modification, the CAE performs much better than the MLE and is better than the best upper bound estimator, due to Paninski, when the number of possible words m is unknown  ...  Paninski for helpful comments and discussions on an earlier version of this work presented at the SAND3 poster session. V. Q.  ... 
doi:10.1002/sim.2942 pmid:17567838 fatcat:fimz3hyg35bf3ndmx5pi63yz5m
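The coverage-adjusted estimator (CAE) combines a Good-Turing estimate of sample coverage with a Horvitz-Thompson-style correction. The sketch below captures that structure for comparison against the plug-in MLE; it omits the paper's refinements and does not handle the degenerate all-singletons case (where the coverage estimate is zero).

```python
import numpy as np
from collections import Counter

def entropy_mle(samples):
    """Plug-in (MLE) entropy; biased low when many symbols go unseen."""
    n = len(samples)
    p = np.array(list(Counter(samples).values())) / n
    return -np.sum(p * np.log(p))

def entropy_cae(samples):
    """Coverage-adjusted entropy: Good-Turing coverage estimate plus a
    Horvitz-Thompson weight 1/(1-(1-p)^n) per observed cell."""
    n = len(samples)
    counts = np.array(list(Counter(samples).values()))
    C = 1.0 - np.sum(counts == 1) / n    # Good-Turing: 1 - (#singletons)/n
    p = C * counts / n                   # coverage-adjusted cell probabilities
    return -np.sum(p * np.log(p) / (1.0 - (1.0 - p) ** n))

rng = np.random.default_rng(1)
x = rng.zipf(2.0, size=500)              # heavy tail: many symbols stay unseen
```

On a heavy-tailed sample like this, the adjustment raises the estimate relative to the MLE, which is the direction of the MLE's well-known negative bias.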

Optimal Bounds for Estimating Entropy with PMF Queries [chapter]

Cafer Caferov, Barış Kaya, Ryan O'Donnell, A. C. Cem Say
2015 Lecture Notes in Computer Science  
For an extended survey and results on the fundamental task of estimating entropy, see Paninski [Pan03]; this survey includes justification of discretization, as well as discussion of applications to neuroscience  ...  We consider the task of estimating the entropy of p to within ±∆ (with high probability).  ...  Acknowledgments We thank Clément Canonne for his assistance with our questions about the literature.  ... 
doi:10.1007/978-3-662-48054-0_16 fatcat:ea3jmelx6zdrdmwazo2svga5em

Computing Accurate Probabilistic Estimates of One-D Entropy from Equiprobable Random Samples

Hoshin V. Gupta, Mohammad Reza Ehsani, Tirthankar Roy, Maria A. Sans-Fuentes, Uwe Ehret, Ali Behrangi
2021 Entropy  
We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimensional entropy from equiprobable random samples, and compare it with the popular Bin-Counting (BC) and  ...  In contrast to BC, which uses equal-width bins with varying probability mass, the QS method uses estimates of the quantiles that divide the support of the data generating probability density function (  ...  The QS and BC algorithms used in this work are freely accessible for non-commercial use at https://github.com/rehsani/Entropy accessed on 20 February 2021.  ... 
doi:10.3390/e23060740 pmid:34208344 pmcid:PMC8231182 fatcat:tyaulcs5nzfadfymdj54c4sphu

A Composite Generalized Cross-Entropy Formulation in Small Samples Estimation

R. Bernardini Papalia
2008 Econometric Reviews  
Our second objective is to define a composite GCE estimator which also combines (local) information across subpopulations based on small samples and a (national) population with a larger sample size.  ...  In this study we formulate a Generalized Cross Entropy (GCE) methodology for modeling incomplete information and learning from repeated small samples.  ...  except for the bounds on the support spaces.  ... 
doi:10.1080/07474930801960469 fatcat:bnwtjv7hzvdzboek6ilqivbtvq

INSPECTRE: Privately Estimating the Unseen [article]

Jayadev Acharya, Gautam Kamath, Ziteng Sun, Huanyu Zhang
2018 arXiv   pre-print
We prove almost-tight bounds on the sample size required for this problem for several functionals of interest, including support size, support coverage, and entropy.  ...  Our methods are based on a sensitivity analysis of several state-of-the-art methods for estimating these properties with sublinear sample complexities.  ...  Statement of Results Our theoretical results for estimating support coverage, support size, and entropy are given below.  ... 
arXiv:1803.00008v1 fatcat:6iuphc6mfbbmho3rfg46m6fezm
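The sensitivity-analysis route to privacy that the abstract describes can be illustrated with its simplest instance: release the plug-in entropy through the Laplace mechanism, with noise scaled to a bound on how much one sample can change the estimate. The 2·log(n)/n sensitivity below is a coarse stand-in, so the sketch is illustrative only; the paper's estimators are tuned far more carefully.

```python
import numpy as np

def private_entropy(samples, epsilon=1.0, seed=0):
    """Plug-in entropy released via the Laplace mechanism.
    Replacing one of n samples moves the plug-in entropy by O(log(n)/n);
    2*log(n)/n is used here as a coarse sensitivity bound, so the privacy
    claim is illustrative rather than exact."""
    rng = np.random.default_rng(seed)
    n = len(samples)
    _, counts = np.unique(samples, return_counts=True)
    p = counts / n
    h = -np.sum(p * np.log(p))               # non-private plug-in estimate
    sensitivity = 2.0 * np.log(n) / n
    return h + rng.laplace(scale=sensitivity / epsilon)

tally = ["a", "b", "c", "d"] * 1000          # uniform over 4 symbols
```

With n = 4000 the noise scale is tiny, so the released value sits close to the true entropy log 4 ≈ 1.386 nats; privacy only becomes costly at small n.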

Applying Information Theory to Small Groups Assessment: Emotions and Well-being at Work

Antonio León García-Izquierdo, Blanca Moreno, Mariano García-Izquierdo
2010 The Spanish Journal of Psychology  
Testing the theoretical models by using traditional parametric techniques requires a large sample size for the efficient estimation of the coefficients that quantify the relations between variables  ...  Since the available sample that we have is small, the most common size in European enterprises, we used the maximum entropy principle to explore the emotions that are involved in the psychosocial risks  ...  There are always more unknowns than knowns regardless of the sample size.  ... 
doi:10.1017/s1138741600003887 fatcat:hnsgwrywafgb5cpphiy63s2ll4

Adaptive Blind Deconvolution of Linear Channels Using Renyi's Entropy with Parzen Window Estimation

D. Erdogmus, K.E. Hild, J.C. Principe, M. Lazaro, I. Santamaria
2004 IEEE Transactions on Signal Processing  
In this paper, we investigate the suitability of a class of Parzen-window-based entropy estimates, namely Renyi's entropy, as a criterion for blind deconvolution of linear channels.  ...  Comparisons between maximum and minimum entropy approaches, as well as the effect of entropy order, equalizer length, sample size, and measurement noise on performance, will be investigated through Monte  ...  ACKNOWLEDGMENT The authors would like to thank the anonymous reviewers for their critical and constructive comments that helped improve this paper considerably.  ... 
doi:10.1109/tsp.2004.827202 fatcat:d5figwihxjbzte5rciipyt7chm
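The Parzen-window Renyi entropy estimate used as the deconvolution criterion reduces, for the quadratic (order-2) case, to the negative log of the "information potential": the average of Gaussian kernels over all sample pairs. A minimal sketch (the bandwidth value is an arbitrary choice, and the adaptive equalizer update itself is omitted):

```python
import numpy as np

def renyi_quadratic_entropy(samples, sigma=0.25):
    """Renyi's quadratic entropy H2 = -log ∫ p(x)^2 dx, with the integral
    estimated by the Parzen 'information potential': the mean of Gaussian
    kernels over all sample pairs, with bandwidth sigma*sqrt(2) because
    convolving two Gaussian kernels doubles the variance."""
    x = np.asarray(samples, dtype=float)
    diff = x[:, None] - x[None, :]              # all pairwise differences
    s2 = 2.0 * sigma ** 2                       # variance of convolved kernel
    g = np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return -np.log(g.mean())                    # -log(information potential)

rng = np.random.default_rng(2)
x = rng.normal(size=1000)
```

For N(0,1) the true quadratic Renyi entropy is log(2·sqrt(pi)) ≈ 1.2655 nats; the kernel smoothing biases the estimate slightly upward, which is why bandwidth selection matters in the adaptive setting.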

Empirical Estimation of Information Measures: A Literature Guide

Sergio Verdú
2019 Entropy  
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures.  ...  While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics  ...  Conflicts of Interest: The author declares no conflict of interest.  ... 
doi:10.3390/e21080720 pmid:33267434 fatcat:f3ifrqgomfe5xa4nl5vduaqmr4

Use of Generalized Maximum Entropy Estimation for Freight Flows Modelling and an Application

Esra Satici, Haydar Demirhan
2021 Journal of Data Science  
Freight flows between ten provinces of Turkey are analyzed by using the generalized maximum entropy estimator of the log-regression model for freight flow.  ...  One of the models used for modelling "Origin-Destination" freight flows is the log-regression model obtained by applying a log-transformation to the traditional gravity model.  ...  Acknowledgements We would like to thank an anonymous referee for his careful review, valuable comments and suggestions which improved the clarity and quality of the paper.  ... 
doi:10.6339/jds.201201_10(1).0006 fatcat:43y4qp7x2zfntnrhjem2bslj3y

Recoverable Random Numbers in an Internet of Things Operating System

Taeill Yoo, Ju-Sung Kang, Yongjin Yeom
2017 Entropy  
Because the entropy estimation of the LRNG is highly conservative, the process may require more than one minute to start the transfer.  ...  Furthermore, the design principle of the estimation algorithm is not only heuristic but also unclear.  ...  Ju-Sung Kang contributed to: suggestion of the theoretical model and analysis of the data using the probability model.  ... 
doi:10.3390/e19030113 fatcat:qkmthvcuczbh3jfxddaort24ym

Mixture-based estimation of entropy [article]

Stéphane Robin, Luca Scrucca
2022 arXiv   pre-print
When the distribution of the data is unknown, an estimate of the entropy needs to be obtained from the data sample itself.  ...  We propose a semi-parametric estimate, based on a mixture model approximation of the distribution of interest.  ...  • Entropy[MLE]: entropy computed from the closed-form expression with MLEs plugged in for unknown parameters; • EntropyGMM: entropy estimator based on our proposal in equation (8) for GMMs; • UT, VAR,  ... 
arXiv:2010.04058v2 fatcat:kikchv7vmnhtpm22pvylga4iea
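A Gaussian mixture density has no closed-form entropy, which is why mixture-based estimates need a numerical step. The sketch below shows the generic Monte Carlo route, H = E[-log p(X)] with X drawn from the mixture; it is not the semi-parametric estimator of the paper's equation (8), and the fitting step (e.g. EM on data) is omitted, with mixture parameters supplied directly.

```python
import numpy as np

def gmm_entropy_mc(weights, means, sds, n_mc=100_000, seed=0):
    """Monte Carlo entropy of a 1-D Gaussian mixture: H = E[-log p(X)],
    X ~ p. Sample from the mixture, then average the negative log of the
    mixture density at the sampled points."""
    rng = np.random.default_rng(seed)
    w, mu, sd = (np.asarray(a, dtype=float) for a in (weights, means, sds))
    comp = rng.choice(len(w), size=n_mc, p=w)   # 1) pick a component
    x = rng.normal(mu[comp], sd[comp])          # 2) draw within that component
    dens = np.sum(                              # mixture density at each draw
        w * np.exp(-(x[:, None] - mu) ** 2 / (2.0 * sd ** 2))
        / (sd * np.sqrt(2.0 * np.pi)),
        axis=1,
    )
    return -np.mean(np.log(dens))

h = gmm_entropy_mc([0.5, 0.5], [-2.0, 2.0], [1.0, 1.0])   # two separated modes
```

As a sanity check, a single-component "mixture" recovers the Gaussian entropy 0.5·log(2·pi·e) ≈ 1.4189 nats, and a well-separated two-component mixture sits between that value and that value plus log 2.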

INSPECTRE: Privately Estimating the Unseen

Jayadev Acharya, Gautam Kamath, Ziteng Sun, Huanyu Zhang
2020 Journal of Privacy and Confidentiality  
We prove almost-tight bounds on the sample size required for this problem for several functionals of interest, including support size, support coverage, and entropy.  ...  Our methods are based on a sensitivity analysis of several state-of-the-art methods for estimating these properties with sublinear sample complexities  ...  The goal is to estimate the entropy of a distribution to an additive ±α. Statement of Results Our theoretical results for estimating support coverage, support size, and entropy are given below.  ... 
doi:10.29012/jpc.724 fatcat:ri3z5s6bljhv7pz577pwoikqpu

Empirical Maximum Entropy Methods

M. Grendar
2006 AIP Conference Proceedings  
From this vantage point the entropy-based empirical approach to estimation is surveyed.  ...  A method, which we suggest calling the Empirical Maximum Entropy method, is implicitly present in the Maximum Entropy Empirical Likelihood method, as its special, non-parametric case.  ...  ACKNOWLEDGEMENTS Financial support of participation at MaxEnt06 workshop from Jaynes Foundation is gratefully acknowledged. Supported also by VEGA grant 1/3016/06.  ... 
doi:10.1063/1.2423302 fatcat:gepzmdtyi5ek3c2ehzd4kpl34a
Showing results 1 — 15 out of 79,193 results