The file type is application/pdf.
Precise Minimax Regret for Logistic Regression with Categorical Feature Values
2021
International Conference on Algorithmic Learning Theory
We study logistic regression with binary labels and categorical (discrete) feature values. Our goal is to evaluate precisely the (maximal) minimax regret. We express it as the so-called Shtarkov sum, known in information theory. To the best of our knowledge, such a sum has never been computed in the context of logistic regression. To be more precise, the pointwise regret of an online algorithm is defined as the (excess) loss it incurs over some value of a constant comparator (weight vector) that is …
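As a reading aid, the following is a minimal sketch of the standard definitions the abstract alludes to, assuming log loss, a logistic model $p_w$, and the classical Shtarkov characterization of maximal minimax regret; the notation ($Q$, $w$, $x^n$, $y^n$) is ours, not taken from the paper.

```latex
% Pointwise regret of an online predictor Q against the best constant
% weight vector w (logistic model p_w), under log loss, for a label
% sequence y^n = (y_1,...,y_n) and fixed feature vectors x^n:
%   R(Q, y^n) = -log Q(y^n | x^n) - inf_w [ -log prod_t p_w(y_t | x_t) ].
% The maximal minimax regret then equals the log of the Shtarkov sum:
\[
  r_n^* \;=\; \inf_{Q} \max_{y^n} R(Q, y^n)
        \;=\; \log \sum_{y^n \in \{0,1\}^n} \sup_{w}
              \prod_{t=1}^{n} p_w(y_t \mid x_t),
  \qquad
  p_w(y = 1 \mid x) = \frac{1}{1 + e^{-\langle w, x \rangle}}.
\]
```

Under these assumptions, evaluating the minimax regret precisely amounts to evaluating this sum, which is what the paper undertakes for categorical feature values.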
dblp:conf/alt/JacquetSS21
fatcat:akgjtsvz75aejaju7patsrjkva