A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020.
The file type is application/pdf.
Neither Private Nor Fair: Impact of Data Imbalance on Utility and Fairness in Differential Privacy
[article] arXiv pre-print, 2020
Deployment of deep learning across fields and industries is growing rapidly, driven by its performance, which in turn relies on the availability of data and compute. Data is often crowd-sourced and contains sensitive information about its contributors, which leaks into models trained on it. To achieve rigorous privacy guarantees, differentially private training mechanisms are used. However, it has recently been shown that differential privacy can exacerbate existing biases in the data and [...]
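The differentially private training the abstract refers to is typically DP-SGD-style: each example's gradient is clipped to a fixed norm and calibrated Gaussian noise is added before averaging. The sketch below is a minimal illustration of that mechanism, not the paper's implementation; the function name, clipping constant, and noise multiplier are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """One DP-SGD aggregation step (illustrative sketch):
    clip each per-example gradient to clip_norm, sum them, add Gaussian
    noise with scale noise_multiplier * clip_norm, and average over the batch."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)

# Toy batch of two per-example gradients; the first gets clipped (norm 5 > 1).
grads = [np.array([3.0, 4.0]), np.array([0.3, 0.4])]
noisy_avg = dp_sgd_step(grads)
```

The clipping step bounds each individual's influence on the update, which is exactly what the paper argues interacts badly with underrepresented groups: their (often larger) gradients are disproportionately clipped.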
arXiv:2009.06389v3
fatcat:hq6u7dtcubhingsgsjgj6qpoly