Critical Values of a Kernel Density-based Mutual Information Estimator
The 2006 IEEE International Joint Conference on Neural Network Proceedings
Recently, mutual information (MI) has become widely recognized as a statistical measure of dependence that is suitable for applications where data are non-Gaussian or where the dependency between variables is non-linear. However, a significant disadvantage of this measure is the inability to define an analytical expression for the distribution of MI estimators computed from a finite dataset. This paper deals specifically with a popular kernel density-based estimator, for which the …

doi:10.1109/ijcnn.2006.1716780
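To make the setting concrete, a minimal sketch of a kernel density-based MI estimator follows. This is a generic resubstitution estimator using Gaussian kernels (via `scipy.stats.gaussian_kde`), not necessarily the exact estimator analyzed in the paper: the joint and marginal densities are estimated by KDE, and MI is approximated as the sample average of log p(x,y) − log p(x) − log p(y). The function name `kde_mutual_information` and all parameter choices (sample size, bandwidth rule left at SciPy's default) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_mutual_information(x, y):
    """Resubstitution estimate of I(X;Y) in nats.

    Densities p(x,y), p(x), p(y) are Gaussian KDEs fit on the sample;
    MI is the sample mean of log p(x,y) - log p(x) - log p(y).
    (Illustrative sketch; not the paper's exact estimator.)
    """
    xy = np.vstack([x, y])
    p_xy = gaussian_kde(xy)   # joint density estimate
    p_x = gaussian_kde(x)     # marginal density estimates
    p_y = gaussian_kde(y)
    return float(np.mean(np.log(p_xy(xy)) - np.log(p_x(x)) - np.log(p_y(y))))

# Illustration on synthetic data: the estimate is larger for a
# dependent pair than for an independent pair, and - as the abstract
# notes - its finite-sample distribution has no known analytical form,
# so critical values must be obtained empirically.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)   # strongly dependent on x
z = rng.normal(size=n)             # independent of x

mi_dep = kde_mutual_information(x, y)
mi_ind = kde_mutual_information(x, z)
```

Because the estimator is computed from a finite sample, `mi_ind` is typically a small positive number rather than exactly zero; deciding whether such a value is significant is precisely where critical values of the estimator's null distribution are needed.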