A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
The file type is application/pdf.
On feature extraction by mutual information maximization
2002 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
We present a method for learning discriminative feature transforms using as a criterion the mutual information between class labels and transformed features. Instead of a commonly used mutual information measure based on Kullback-Leibler divergence, we use a quadratic divergence measure, which admits an efficient non-parametric implementation and requires no prior assumptions about class densities. In addition to linear transforms, we also discuss nonlinear transforms that are […]
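The abstract's core idea, a quadratic (rather than Kullback-Leibler) divergence between class labels and transformed features, estimated non-parametrically, can be sketched in a few lines of NumPy. The sketch below is an illustrative reconstruction, not the paper's implementation: it assumes Gaussian Parzen windows with a fixed width, uses Torkkola's information-potential decomposition of the quadratic mutual information, and optimizes a linear transform `W` by plain numerical gradient ascent. All function names (`gaussian_pairwise`, `quadratic_mi`, `learn_transform`) and parameter choices are mine.

```python
import numpy as np

def gaussian_pairwise(Y, sigma2):
    """Pairwise Gaussian kernel G(y_k - y_l, 2*sigma2*I) between rows of Y.

    The doubled variance comes from the convolution of two Parzen kernels.
    """
    d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    dim = Y.shape[1]
    norm = (4.0 * np.pi * sigma2) ** (dim / 2.0)
    return np.exp(-d2 / (4.0 * sigma2)) / norm

def quadratic_mi(Y, labels, sigma2=1.0):
    """Non-parametric quadratic MI estimate between features Y and labels.

    Information-potential form: V_in + V_all - 2*V_btw, each a sum of
    pairwise Gaussian interactions (no class-density model assumed).
    """
    N = len(Y)
    G = gaussian_pairwise(Y, sigma2)
    total = G.sum()
    v_in = v_all = v_btw = 0.0
    for c in np.unique(labels):
        m = labels == c
        Jc = m.sum()
        v_in += G[np.ix_(m, m)].sum()         # within-class interactions
        v_all += (Jc / N) ** 2 * total        # prior-weighted all-pairs term
        v_btw += (Jc / N) * G[m, :].sum()     # class-vs-all cross term
    return (v_in + v_all - 2.0 * v_btw) / N ** 2

def learn_transform(X, labels, k=1, sigma2=1.0, lr=0.2, steps=60,
                    eps=1e-5, seed=0):
    """Gradient-ascend a linear transform W (d x k) on the quadratic MI.

    A numerical central-difference gradient keeps the sketch short; the
    paper derives analytic gradients instead. Columns are re-normalized
    each step to keep the projection scale fixed.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(size=(d, k))
    W /= np.linalg.norm(W, axis=0)
    obj = lambda Wm: quadratic_mi(X @ Wm, labels, sigma2)
    for _ in range(steps):
        grad = np.zeros_like(W)
        for i in range(d):
            for j in range(k):
                Wp = W.copy(); Wp[i, j] += eps
                Wn = W.copy(); Wn[i, j] -= eps
                grad[i, j] = (obj(Wp) - obj(Wn)) / (2 * eps)
        W = W + lr * grad
        W /= np.linalg.norm(W, axis=0)
    return W
```

On synthetic two-class data separated along one axis, a projection onto the separating axis scores a visibly higher quadratic MI than a projection onto a pure-noise axis, and gradient ascent moves a random one-dimensional transform toward the informative direction. The kernel width `sigma2` matters in practice; a fixed value is an assumption of this sketch.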
doi:10.1109/icassp.2002.5743865
dblp:conf/icassp/Torkkola02
fatcat:uzp6s7pznjbezkzyb2of2o7fvy