Old Techniques in Differentially Private Linear Regression

Or Sheffet
2019 International Conference on Algorithmic Learning Theory  
We introduce three novel differentially private algorithms that approximate the 2nd-moment matrix of the data. These algorithms, which in contrast to existing algorithms always output positive-definite matrices, correspond to existing techniques in the linear regression literature. Thus these techniques have an immediate interpretation, and all results known about these techniques are straightforwardly applicable to the outputs of these algorithms. More specifically, we discuss the following techniques. (i) For Ridge Regression, we propose setting the regularization coefficient so that, by approximating the solution using the Johnson-Lindenstrauss transform, we preserve privacy. (ii) We show that adding a batch of d + O(ε^{-2}) random samples to our data preserves differential privacy. (iii) We show that sampling the 2nd-moment matrix from a Bayesian posterior inverse-Wishart distribution is differentially private. We also give utility bounds for our algorithms and compare them with the existing "Analyze Gauss" algorithm of Dwork et al. (2014).
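
As a concrete illustration of technique (ii), the following Python sketch appends a batch of d + O(ε^{-2}) Gaussian rows to the data before forming the 2nd-moment matrix, which makes the output positive-definite by construction. This is a minimal sketch only: the noise scale w, the constant c, and the function name are illustrative assumptions, and the paper's actual calibration of the batch to (ε, δ) and the row-norm bound is not reproduced here.

```python
# Minimal sketch of technique (ii): append random Gaussian rows to the data
# so the resulting 2nd-moment matrix is a positive-definite, noisy version
# of X^T X. The batch size k = d + c/eps^2 and the noise scale w are
# placeholders; the paper derives both from (eps, delta) and a bound on the
# row norms of X.
import numpy as np

def second_moment_with_random_batch(X, eps, delta, w=1.0, c=10.0, rng=None):
    """Approximate X^T X by appending a batch of random rows to X.

    X     : (n, d) data matrix with bounded row norms (assumed preprocessed).
    eps   : privacy parameter controlling the O(eps^{-2}) batch size.
    delta : unused in this placeholder; the paper's choice of w depends on it.
    w, c  : illustrative constants, not the paper's calibration.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    k = d + int(np.ceil(c / eps**2))      # batch size d + O(eps^{-2})
    Z = w * rng.standard_normal((k, d))   # random samples to append
    X_aug = np.vstack([X, Z])             # augmented data set
    A = X_aug.T @ X_aug                   # equals X^T X + Z^T Z
    # Z^T Z is full rank with probability 1 (k >= d), so A is positive definite.
    return A

# Usage: A_priv stands in for X.T @ X, e.g. inside a ridge-regression solver.
X = np.random.default_rng(0).normal(size=(200, 5))
A_priv = second_moment_with_random_batch(X, eps=1.0, delta=1e-5)
```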