Critical Values of a Kernel Density-based Mutual Information Estimator

R.J. May, G.C. Dandy, H.R. Maier, T.M.K.G. Fernando
The 2006 IEEE International Joint Conference on Neural Network Proceedings  
Recently, mutual information (MI) has become widely recognized as a statistical measure of dependence that is suitable for applications where data are non-Gaussian, or where the dependency between variables is non-linear. However, a significant disadvantage of this measure is the inability to define an analytical expression for the distribution of MI estimators, which are based upon a finite dataset. This paper deals specifically with a popular kernel density-based estimator, for which the distribution is determined empirically using Monte Carlo simulation. The application of the critical values of MI derived from this distribution to a test for independence is demonstrated within the context of a benchmark input variable selection problem.
doi:10.1109/ijcnn.2006.1716780
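The abstract describes two ingredients: a kernel density-based estimator of MI, and critical values obtained by simulating the estimator's distribution under independence via Monte Carlo. The sketch below illustrates that idea under stated assumptions; it uses a Gaussian kernel density estimator (scipy.stats.gaussian_kde) and a permutation-based null, which may differ from the paper's exact estimator, bandwidth rule, and simulation design.

```python
# Minimal sketch of a kernel density-based MI estimator with Monte Carlo
# critical values. The Gaussian kernel, default bandwidth, and permutation
# null used here are illustrative assumptions, not the paper's exact method.
import numpy as np
from scipy.stats import gaussian_kde


def kde_mutual_information(x, y):
    """Estimate MI(x; y) as the sample average of log f(x,y) / (f(x) f(y)),
    with joint and marginal densities estimated by Gaussian kernels."""
    joint = gaussian_kde(np.vstack([x, y]))
    fx = gaussian_kde(x)
    fy = gaussian_kde(y)
    ratio = joint(np.vstack([x, y])) / (fx(x) * fy(y))
    return np.mean(np.log(ratio))


def mi_critical_value(x, y, n_sim=1000, alpha=0.05, seed=None):
    """Approximate the (1 - alpha) critical value of the MI estimator under
    independence: repeatedly permute y to break any dependence with x,
    re-estimate MI, and take the upper quantile of the null distribution."""
    rng = np.random.default_rng(seed)
    null_mi = np.array([kde_mutual_information(x, rng.permutation(y))
                        for _ in range(n_sim)])
    return np.quantile(null_mi, 1.0 - alpha)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = x ** 2 + 0.5 * rng.normal(size=200)  # non-linear dependence on x
    mi = kde_mutual_information(x, y)
    crit = mi_critical_value(x, y, n_sim=200, alpha=0.05, seed=1)
    print(f"MI estimate = {mi:.3f}, 5% critical value = {crit:.3f}")
    print("Reject independence" if mi > crit else "Cannot reject independence")
```

In an input variable selection setting, the same test would be applied to each candidate input in turn, retaining only those whose estimated MI with the output exceeds the simulated critical value.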