The kernel mutual information

A. Gretton, R. Herbrich, A.J. Smola
2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).  
We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information between a discretised approximation of the continuous random variables. We show that Bach and Jordan's kernel generalised variance (KGV) is also an upper bound on the same kernel density estimate, but is looser. Finally, we suggest that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation; indeed, we demonstrate that the KGV can also be thought of as ...
doi:10.1109/icassp.2003.1202784 dblp:conf/icassp/GrettonHS03
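
The following is a minimal numerical sketch, not taken from the paper, of the kind of kernel-based independence contrast the abstract discusses. It implements the regularised KGV of Bach and Jordan (the quantity the KMI is compared against), since its closed form via regularised kernel canonical correlations is standard; the function names, the Gaussian kernel choice, and the parameters sigma and kappa are illustrative assumptions, and the KMI itself is defined differently in the paper.

```python
# Hypothetical sketch (not the paper's code): a KGV-style independence contrast.
# Near independence the contrast is close to zero; it grows with dependence.
import numpy as np

def gaussian_gram(x, sigma=1.0):
    """Gram matrix of a Gaussian kernel on a 1-D sample x of shape (n,)."""
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def centre(K):
    """Centre a Gram matrix in feature space: H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kgv_contrast(x, y, sigma=1.0, kappa=2e-2):
    """Regularised KGV contrast: -1/2 * sum_i log(1 - rho_i^2), where the rho_i
    are regularised kernel canonical correlations between the two samples."""
    n = len(x)
    Kx = centre(gaussian_gram(x, sigma))
    Ky = centre(gaussian_gram(y, sigma))
    # Smoothed operators R = K (K + (n*kappa/2) I)^{-1}, one per variable.
    Rx = Kx @ np.linalg.inv(Kx + 0.5 * n * kappa * np.eye(n))
    Ry = Ky @ np.linalg.inv(Ky + 0.5 * n * kappa * np.eye(n))
    # The kernel canonical correlations are the singular values of Rx @ Ry.
    rho = np.linalg.svd(Rx @ Ry, compute_uv=False)
    rho = np.clip(rho, 0.0, 1.0 - 1e-12)
    return -0.5 * np.sum(np.log(1.0 - rho ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(200)
    y_indep = rng.standard_normal(200)            # independent of x
    y_dep = x + 0.1 * rng.standard_normal(200)    # strongly dependent on x
    print("independent pair:", kgv_contrast(x, y_indep))
    print("dependent pair:  ", kgv_contrast(x, y_dep))
```

As a usage note, the contrast printed for the independent pair should be much smaller than for the dependent pair; the regulariser kappa is exactly the kind of term whose role the abstract says is motivated by the relationship between the KGV and the KMI.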