Structure and Sensitivity in Differential Privacy: Comparing K-Norm Mechanisms [article]

Jordan Awan, Aleksandra Slavkovic
2019 arXiv pre-print
Differential privacy (DP) provides a framework for provable privacy protection against arbitrary adversaries while allowing the release of summary statistics and synthetic data. We address the problem of releasing a noisy real-valued statistic vector T, a function of sensitive data, under DP via the class of K-norm mechanisms, with the goal of minimizing the noise added to achieve privacy. First, we introduce the sensitivity space of T, which extends the concepts of sensitivity polytope and sensitivity hull to the setting of arbitrary statistics T. We then propose a framework consisting of three methods for comparing the K-norm mechanisms: 1) a multivariate extension of stochastic dominance, 2) the entropy of the mechanism, and 3) the conditional variance given a direction, to identify the optimal K-norm mechanism. Under all three criteria, the optimal K-norm mechanism is generated by the convex hull of the sensitivity space. Using our methodology, we extend the objective perturbation and functional mechanisms and apply these tools to logistic and linear regression, allowing for private releases of statistical results. Via simulations and an application to a housing price dataset, we demonstrate that our proposed methodology offers a substantial improvement in utility for the same level of risk.
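As a rough illustration of the mechanism class studied in the paper (not its general construction), the Python sketch below samples K-norm noise, with density proportional to exp(-epsilon * ||x||_K / Delta), for the two standard special cases K = L1 (the multivariate Laplace mechanism) and K = L2. The function names, sensitivity value, and example statistic are hypothetical; the paper's optimal mechanism instead uses the norm whose unit ball is the convex hull of the sensitivity space of T.

import numpy as np

def k_norm_noise_l1(dim, sensitivity, epsilon, rng=None):
    # L1-norm mechanism: density ∝ exp(-epsilon * ||x||_1 / Delta),
    # which factors into independent Laplace coordinates.
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return rng.laplace(loc=0.0, scale=scale, size=dim)

def k_norm_noise_l2(dim, sensitivity, epsilon, rng=None):
    # L2-norm mechanism: density ∝ exp(-epsilon * ||x||_2 / Delta).
    # Sample the radius from Gamma(shape=dim, scale=Delta/epsilon)
    # and the direction uniformly on the unit sphere.
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    radius = rng.gamma(shape=dim, scale=scale)
    direction = rng.normal(size=dim)
    direction /= np.linalg.norm(direction)
    return radius * direction

# Hypothetical usage: privatize a 3-dimensional statistic with epsilon = 1.0
# and L2 sensitivity Delta = 0.5.
T = np.array([2.1, -0.4, 1.7])
release = T + k_norm_noise_l2(dim=3, sensitivity=0.5, epsilon=1.0)

The radial Gamma draw in the L2 case follows from writing the density in spherical coordinates, where the radius has density proportional to r^(dim-1) * exp(-epsilon * r / Delta).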
arXiv:1801.09236v3 fatcat:ffab5ea5kreufgji6jcslh4aiq