A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Estimating the Mutual Information between two Discrete, Asymmetric Variables with Limited Samples
[article]
2019
arXiv
pre-print
Determining the strength of non-linear statistical dependencies between two variables is a crucial matter in many research fields. The established measure for quantifying such relations is the mutual information. However, estimating mutual information from limited samples is a challenging task. Since the mutual information is the difference of two entropies, the existing Bayesian estimators of entropy may be used to estimate information. This procedure, however, is still biased in the severely […]
arXiv:1905.02034v1
fatcat:ubu5ozbx4balnmxcu7y6k7jbdq