A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Improved Rates for Differentially Private Stochastic Convex Optimization with Heavy-Tailed Data
[article] · 2022 · arXiv pre-print
We study stochastic convex optimization with heavy-tailed data under the constraint of differential privacy (DP). Most prior work on this problem is restricted to the case where the loss function is Lipschitz. Instead, as introduced by Wang, Xiao, Devadas, and Xu, we study general convex loss functions with the assumption that the distribution of gradients has bounded k-th moments. We provide improved upper bounds on the excess population risk under concentrated DP for convex and strongly convex loss functions.
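The abstract above describes private optimization when per-sample gradients can be heavy-tailed. A standard ingredient in this setting (not necessarily the exact algorithm of the paper) is to clip each per-sample gradient to a fixed norm before adding Gaussian noise calibrated to that clip norm, so that rare large gradients cannot blow up the sensitivity. A minimal illustrative sketch, with all function names, step sizes, and noise levels chosen here for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def clipped_noisy_sgd(data, theta0, steps=200, lr=0.1, clip=1.0, sigma=0.5):
    """Illustrative DP-style SGD: each per-sample gradient is clipped to
    norm `clip` (bounding sensitivity even for heavy-tailed gradients),
    then Gaussian noise proportional to `clip` is added before the step.
    This is a sketch of the general clip-and-noise recipe, not the
    paper's specific algorithm."""
    theta = np.array(theta0, dtype=float)
    n = len(data)
    for _ in range(steps):
        x = data[rng.integers(n)]
        g = theta - x  # gradient of the squared loss 0.5 * ||theta - x||^2
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)          # clip: now ||g|| <= clip
        g = g + sigma * clip * rng.normal(size=g.shape)  # calibrated noise
        theta = theta - lr * g
    return theta

# Heavy-tailed (Pareto) samples shifted so the mean is ~0; the population
# minimizer of the squared loss is the mean of the distribution.
samples = rng.pareto(3.0, size=(1000, 1)) - 0.5
est = clipped_noisy_sgd(samples, theta0=[0.0])
```

Even though individual Pareto samples can be very large, the clipped updates keep every step bounded, which is exactly why this setting only needs a moment bound on the gradient distribution rather than a Lipschitz loss.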
arXiv:2106.01336v6