A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL.
Submodular Combinatorial Information Measures with Applications in Machine Learning [article]
2021, arXiv pre-print
Information-theoretic quantities like entropy and mutual information have found numerous uses in machine learning. It is well known that there is a strong connection between these entropic quantities and submodularity, since entropy over a set of random variables is submodular. In this paper, we study combinatorial information measures that generalize independence, (conditional) entropy, (conditional) mutual information, and total correlation, defined over sets of (not necessarily random) variables.
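The submodularity claim in the abstract can be checked numerically: for the entropy set function, adding a variable to a smaller set never yields a smaller gain than adding it to a superset. The sketch below (not from the paper; the joint distribution is an arbitrary illustrative assumption) verifies this diminishing-returns property exhaustively for a joint distribution over three binary variables.

```python
import itertools
import math

# Arbitrary joint distribution over three binary variables (X0, X1, X2).
# Assumption for illustration only; any valid joint distribution works.
joint = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20,
    (1, 1, 0): 0.15, (1, 1, 1): 0.10,
}

def entropy(subset):
    """Shannon entropy (in bits) of the marginal over the given variable indices."""
    if not subset:
        return 0.0
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in sorted(subset))
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

# Submodularity (diminishing returns): for all A ⊆ B and x ∉ B,
#   H(A ∪ {x}) - H(A) >= H(B ∪ {x}) - H(B)
ground = {0, 1, 2}
is_submodular = True
for b_size in range(len(ground)):
    for B_tuple in itertools.combinations(sorted(ground), b_size):
        B = set(B_tuple)
        for a_size in range(len(B) + 1):
            for A_tuple in itertools.combinations(sorted(B), a_size):
                A = set(A_tuple)
                for x in ground - B:
                    gain_small = entropy(A | {x}) - entropy(A)
                    gain_large = entropy(B | {x}) - entropy(B)
                    if gain_small < gain_large - 1e-12:
                        is_submodular = False

print("entropy is submodular on this distribution:", is_submodular)
```

Since joint entropy is submodular for every joint distribution, the check passes regardless of the probabilities chosen; swapping in a different `joint` table gives the same result.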
arXiv:2006.15412v6