A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
FSMJ: Feature Selection with Maximum Jensen-Shannon Divergence for Text Categorization
[article] arXiv pre-print, 2016
In this paper, we present a new wrapper feature selection approach based on Jensen-Shannon (JS) divergence, termed feature selection with maximum JS-divergence (FSMJ), for text categorization. Unlike most existing feature selection approaches, the proposed FSMJ approach is based on real-valued features, which provide more information for discrimination than the binary-valued features used in conventional approaches. We show that FSMJ is a greedy approach and that the JS-divergence monotonically
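The abstract is truncated here, but the core quantity it builds on — the Jensen-Shannon divergence between two discrete distributions — is standard. The sketch below is not the authors' FSMJ algorithm; it is a minimal illustration of computing JS-divergence and using it as a greedy per-feature score, with the function names and the `select_best_feature` helper being hypothetical:

```python
import math

def kl_divergence(p, q):
    # KL(P || Q) for discrete distributions; terms with p_i == 0 contribute 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    # JS(P, Q) = 0.5*KL(P||M) + 0.5*KL(Q||M) with M = (P+Q)/2.
    # Symmetric and bounded in [0, 1] when using log base 2.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def select_best_feature(class_a_dists, class_b_dists):
    # Greedy scoring sketch (hypothetical helper): rank each candidate
    # feature by the JS-divergence between its normalized distributions
    # in the two classes, and return the most discriminative one.
    scores = {f: js_divergence(p, class_b_dists[f])
              for f, p in class_a_dists.items()}
    return max(scores, key=scores.get)
```

For identical distributions the divergence is 0; for distributions with disjoint support it reaches its maximum of 1, which is why a larger JS-divergence indicates a more discriminative feature.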
arXiv:1606.06366v1
fatcat:ruz5e22ggfe3zobejdlmyomd4q