
Improving Utility and Security of the Shuffler-based Differential Privacy [article]

Tianhao Wang, Bolin Ding, Min Xu, Zhicong Huang, Cheng Hong, Jingren Zhou, Ninghui Li, Somesh Jha
2020 arXiv   pre-print
This paper investigates the multi-party setting of LDP. We analyze the system model and identify potential adversaries.  ...  When collecting information, local differential privacy (LDP) alleviates privacy concerns of users because their private information is randomized before being sent to the central aggregator.  ...  Crypto-aided Differential Privacy.  ... 
arXiv:1908.11515v3 fatcat:z2hey2wmmrduxfzdkaw7w7jczy
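
As a minimal illustration of the local randomization described in the snippet above (a generic sketch, not the mechanism proposed in the paper), the code below applies classic binary randomized response on the client side and debiases the aggregate on the server side; the names `randomize_bit`, `debias_sum`, and the parameter `epsilon` are illustrative assumptions.

```python
# Sketch of client-side LDP randomization via binary randomized response.
import math
import random

def randomize_bit(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1), otherwise flip it."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def debias_sum(reports: list[int], epsilon: float) -> float:
    """Unbiased estimate of the number of ones from the randomized reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    n = len(reports)
    return (sum(reports) - n * (1.0 - p)) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    true_bits = [1] * 300 + [0] * 700   # toy population, 300 ones
    eps = 1.0
    reports = [randomize_bit(b, eps) for b in true_bits]
    print("true count:", sum(true_bits), "estimate:", round(debias_sum(reports, eps), 1))
```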

RECENT PROGRESS OF DIFFERENTIALLY PRIVATE FEDERATED LEARNING WITH THE SHUFFLE MODEL

Moushira Abdallah Mohamed Ahmed, Shuhui Wu, Laure Deveriane Dushime, Yuanhong Tao
2021 International Journal of Engineering Technologies and Management Research  
Consequently, the modified technique of differentially private federated learning with the shuffle model will explore the gap between privacy and accuracy in both models.  ...  Furthermore, we present two types of shuffle, single shuffle and m shuffles, with a statistical analysis of how each one boosts the privacy amplification of users at the same level of accuracy, by reasoning about the practical results of recent papers.  ... 
doi:10.29121/ijetmr.v8.i11.2021.1028 fatcat:2dlseelznndq3aoau64dcrnaby
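
The shuffle model discussed above can be pictured with a toy pipeline: each client randomizes locally, a shuffler uniformly permutes the anonymous reports, and the analyzer only sees the shuffled multiset. This is a generic sketch, not the single-shuffle or m-shuffle constructions analyzed in the paper; all function names and the flip probability are assumptions.

```python
# Toy shuffle-model pipeline: local randomizer -> shuffler -> analyzer.
import random

def local_randomizer(value: int, flip_prob: float = 0.25) -> int:
    """Toy binary randomizer: flip the bit with probability flip_prob."""
    return 1 - value if random.random() < flip_prob else value

def shuffler(reports: list[int]) -> list[int]:
    """Uniformly permute the reports, breaking the link to their senders."""
    shuffled = list(reports)
    random.shuffle(shuffled)
    return shuffled

def analyzer(shuffled_reports: list[int]) -> float:
    """Aggregate the anonymous reports (here: the fraction of ones)."""
    return sum(shuffled_reports) / len(shuffled_reports)

if __name__ == "__main__":
    random.seed(1)
    client_values = [random.randint(0, 1) for _ in range(1000)]
    reports = [local_randomizer(v) for v in client_values]
    print("fraction of ones among shuffled reports:", analyzer(shuffler(reports)))
```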

PRECAD: Privacy-Preserving and Robust Federated Learning via Crypto-Aided Differential Privacy [article]

Xiaolan Gu, Ming Li, Li Xiong
2021 arXiv   pre-print
In this paper, we develop a framework called PRECAD, which simultaneously achieves differential privacy (DP) and enhances robustness against model poisoning attacks with the help of cryptography.  ...  Existing FL protocol designs have been shown to be vulnerable to attacks that aim to compromise data privacy and/or model robustness.  ...  This is achieved by combining DP with secure multi-party computation (MPC) techniques (secret sharing). PRECAD involves two honest-but-curious and non-colluding servers.  ... 
arXiv:2110.11578v1 fatcat:ndwe2a7g6zhxxlb6clouu7tl3e
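
The two-server secret-sharing idea mentioned in the snippet can be illustrated with plain additive shares: each client splits its value into two random-looking shares modulo a public prime, sends one share to each non-colluding server, and only the recombined aggregate is ever revealed. This is a minimal sketch, not PRECAD's actual protocol; the modulus `Q` and the function names are illustrative.

```python
# Additive secret sharing of client values across two non-colluding servers.
import random

Q = 2**61 - 1  # public prime modulus; a single share reveals nothing about the input

def share(update: int) -> tuple[int, int]:
    """Split a client value into two additive shares modulo Q."""
    r = random.randrange(Q)
    return r, (update - r) % Q

def reconstruct(share0_sum: int, share1_sum: int) -> int:
    """Combine the two servers' aggregated shares into the aggregate value."""
    return (share0_sum + share1_sum) % Q

if __name__ == "__main__":
    client_updates = [3, 5, 7, 11]
    server0, server1 = 0, 0
    for u in client_updates:
        s0, s1 = share(u)
        server0 = (server0 + s0) % Q   # each server only ever sees its own shares
        server1 = (server1 + s1) % Q
    print("aggregate:", reconstruct(server0, server1))  # 3 + 5 + 7 + 11 = 26
```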

Privacy Amplification by Decentralization [article]

Edwige Cyffers, Aurélien Bellet
2021 arXiv   pre-print
Analyzing data owned by several parties while achieving a good trade-off between utility and privacy is a key challenge in federated learning and analytics.  ...  For tasks such as real summation, histogram computation and optimization with gradient descent, we propose simple algorithms on ring and complete topologies.  ...  secure multi-party computation protocol.  ... 
arXiv:2012.05326v3 fatcat:zq45xwdnszdmnfq63ou5d43bqm

Advances and Open Problems in Federated Learning [article]

Peter Kairouz, H. Brendan McMahan, Brendan Avent, Aurélien Bellet, Mehdi Bennis, Arjun Nitin Bhagoji, Kallista Bonawitz, Zachary Charles, Graham Cormode, Rachel Cummings, Rafael G.L. D'Oliveira, Hubert Eichner (+47 others)
2021 arXiv   pre-print
FL embodies the principles of focused data collection and minimization, and can mitigate many of the systemic privacy risks and costs resulting from traditional, centralized machine learning and data science  ...  Motivated by the explosive growth in FL research, this paper discusses recent advances and presents an extensive collection of open problems and challenges.  ...  Acknowledgments The authors would like to thank Alex Ingerman and David Petrou for their useful suggestions and insightful comments during the review process.  ... 
arXiv:1912.04977v3 fatcat:efkbqh4lwfacfeuxpe5pp7mk6a

Encode, Shuffle, Analyze Privacy Revisited: Formalizations and Empirical Evaluation [article]

Úlfar Erlingsson, Vitaly Feldman, Ilya Mironov, Ananth Raghunathan, Shuang Song, Kunal Talwar, Abhradeep Thakurta
2020 arXiv   pre-print
These range from abstract algorithms to comprehensive systems with varying assumptions and built upon local differential privacy mechanisms and anonymity.  ...  We also demonstrate how the ESA notion of fragmentation (reporting data aspects in separate, unlinkable messages) improves privacy/utility tradeoffs both in terms of local and central differential-privacy  ...  Also, these systems' abilities are constrained by the specifics of their construction and mechanisms (e.g., built-in sampling rates and means of multi-party cryptographic computation, as in [11]); some  ... 
arXiv:2001.03618v1 fatcat:r7trb5yhejeb5g7d57lcw2lwya

Towards Distributed Privacy-Preserving Prediction [article]

Lingjuan Lyu, Yee Wei Law, Kee Siong Ng, Shibei Xue, Jun Zhao, Mengmeng Yang, Lei Liu
2020 arXiv   pre-print
First, we introduce the improved Binomial Mechanism and Discrete Gaussian Mechanism to achieve distributed differential privacy.  ...  In privacy-preserving machine learning, individual parties are reluctant to share their sensitive training data due to privacy concerns.  ...  Multi-party Privacy In multi-party scenario where data is sourced from multiple parties and the server is not trustworthy, individual privacy has to be protected.  ... 
arXiv:1910.11478v2 fatcat:xa3m7wzruzfmlhgqlikdop67im

Towards Sparse Federated Analytics: Location Heatmaps under Distributed Differential Privacy with Secure Aggregation [article]

Eugene Bagdasaryan, Peter Kairouz, Stefan Mellem, Adrià Gascón, Kallista Bonawitz, Deborah Estrin, Marco Gruteser
2021 arXiv   pre-print
To achieve this, we revisit the distributed differential privacy concept based on recent results in the secure multiparty computation field and design a scalable and adaptive distributed differential privacy  ...  It aims to ensure differential privacy before data becomes visible to a service provider while maintaining high data accuracy and minimizing resource consumption on users' devices.  ...  Acknowledgments At Cornell Tech, Bagdasaryan is supported in part by a Cornell Digital Life Initiative fellowship and an Apple Scholars in AI/ML fellowship.  ... 
arXiv:2111.02356v1 fatcat:ak4pnrnq5jfgpa3hbp5sycaf2i
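
One common way to realize the distributed differential privacy described above is to let each of n clients add a 1/n share of the noise variance that a central Gaussian mechanism would need, so the securely aggregated sum carries central-level noise while no individual report is released in the clear. The sketch below uses a continuous Gaussian and a stand-in for secure aggregation purely for illustration (the paper works with discrete noise and a real secure-aggregation protocol); names and parameters are assumptions.

```python
# Distributed DP: each client adds sigma/sqrt(n) noise so the sum has total std sigma.
import random

def noisy_client_report(value: float, sigma_total: float, n_clients: int) -> float:
    """Local value plus this client's share of the centrally required Gaussian noise."""
    return value + random.gauss(0.0, sigma_total / n_clients ** 0.5)

def secure_sum(reports: list[float]) -> float:
    """Stand-in for secure aggregation: only the sum is ever revealed."""
    return sum(reports)

if __name__ == "__main__":
    random.seed(2)
    n, sigma = 1000, 8.0
    values = [1.0] * n
    reports = [noisy_client_report(v, sigma, n) for v in values]
    print("noisy aggregate:", round(secure_sum(reports), 2), "true sum:", sum(values))
```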

How to Democratise and Protect AI: Fair and Differentially Private Decentralised Deep Learning

Lingjuan Lyu, Yitong Li, Karthik Nandakumar, Jiangshan Yu, Xingjun Ma
2020 IEEE Transactions on Dependable and Secure Computing  
A novel reputation system is proposed through digital tokens and local credibility to ensure fairness, in combination with differential privacy to guarantee privacy.  ...  local credibility of each party and generate initial tokens; during the update stage, Differentially Private SGD (DPSGD) is used to facilitate collaborative privacy-preserving deep learning, and local  ...  ACKNOWLEDGMENT This work is partially supported by Faculty of Information Technology, Monash University; and an IBM PhD Fellowship.  ... 
doi:10.1109/tdsc.2020.3006287 fatcat:kwhzr3qj5vbtlehkh4areejsoa
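
A generic DP-SGD update step, of the kind referenced in the snippet above, clips each per-example gradient to an L2 bound and adds Gaussian noise scaled by a noise multiplier before averaging. This is a textbook-style sketch, not the exact DPSGD configuration used in the paper; the function name and hyperparameter values are assumptions.

```python
# One DP-SGD step: per-example L2 clipping, Gaussian noise, then averaging.
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0, noise_multiplier=1.1):
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)  # clip each row to clip_norm
    noise = np.random.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_mean = (clipped.sum(axis=0) + noise) / per_example_grads.shape[0]
    return params - lr * noisy_mean

if __name__ == "__main__":
    np.random.seed(0)
    params = np.zeros(4)
    per_example_grads = np.random.normal(size=(32, 4))  # stand-in per-example gradients
    print("updated params:", dp_sgd_step(params, per_example_grads))
```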

Differentially-Private "Draw and Discard" Machine Learning [article]

Vasyl Pihur, Aleksandra Korolova, Frederick Liu, Subhash Sankuratripati, Moti Yung, Dachuan Huang, Ruogu Zeng
2018 arXiv   pre-print
It is motivated by the desire to achieve differential privacy guarantees in the local model of privacy in a way that satisfies all systems constraints using asynchronous client-server communication and  ...  We then analyze the privacy guarantees provided by our approach against several types of adversaries and showcase experimental results that provide evidence for the framework's viability in practical deployments  ...  , latency, and robustness to spam and abuse.  ... 
arXiv:1807.04369v2 fatcat:hupf7s3efba4fiwj3wmdwnswty

Voting-based Approaches For Differentially Private Federated Learning [article]

Yuqing Zhu, Xiang Yu, Yi-Hsuan Tsai, Francesco Pittaluga, Masoud Faraki, Manmohan chandraker, Yu-Xiang Wang
2021 arXiv   pre-print
Theoretically, by applying secure multi-party computation, we could exponentially amplify the (data-dependent) privacy guarantees when the margin of the voting scores is large.  ...  Differentially Private Federated Learning (DPFL) is an emerging field with many applications.  ...  Differential Privacy: A randomized mechanism M : 𝒟 → ℛ with domain 𝒟 and range ℛ satisfies (ε, δ)-differential privacy if, for any two adjacent datasets D, D′ ∈ 𝒟 and for any subset of outputs O ⊆ ℛ,  ... 
arXiv:2010.04851v2 fatcat:ff5qqlgdonhefexhjoc5fep32q
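
For reference, the definition quoted above is truncated in the snippet; in its standard textbook form the (ε, δ)-differential privacy condition reads as follows (this is the usual definition, not a statement specific to this paper):

```latex
% Standard (epsilon, delta)-differential privacy condition:
\Pr[\mathcal{M}(D) \in O] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in O] + \delta
\quad \text{for all adjacent } D, D' \in \mathcal{D} \text{ and all } O \subseteq \mathcal{R}.
```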

Aggregating Votes with Local Differential Privacy: Usefulness, Soundness vs. Indistinguishability [article]

Shaowei Wang, Jiachun Du, Wei Yang, Xinrong Diao, Zichun Liu, Yiwen Nie, Liusheng Huang, Hongli Xu
2019 arXiv   pre-print
amplification attack and view disguise attack.  ...  This work studies the problem of aggregating individuals' voting data under the local differential privacy setting, where usefulness and soundness of the aggregated scores are of major concern.  ...  Secure multi-party computation [11] alleviates these problems by securely aggregating scores, but is still fragile to collusion between the counter and other voters, and may have efficiency issues for  ... 
arXiv:1908.04920v1 fatcat:bubpbg3okvgt3kqqpi4cobmx6a

Improved Differentially Private Decentralized Source Separation for fMRI Data [article]

Hafiz Imtiaz, Jafar Mohammadi, Rogers Silva, Bradley Baker, Sergey M. Plis, Anand D. Sarwate, Vince Calhoun
2021 arXiv   pre-print
This indicates that it is possible to have meaningful utility while preserving privacy.  ...  We show that our algorithm outperforms existing approaches on synthetic and real neuroimaging datasets and demonstrate that it can sometimes reach the same level of utility as the corresponding non-private  ...  Differential privacy provides different guarantees (see [39], [40] for thorough comparisons between Secure Multi-party Computation (SMC) and differential privacy) although we can use MPC protocols  ... 
arXiv:1910.12913v2 fatcat:goebuhpzn5gvjohezl6dzmapme

Local Differential Privacy and Its Applications: A Comprehensive Survey [article]

Mengmeng Yang, Lingjuan Lyu, Jun Zhao, Tianqing Zhu, Kwok-Yan Lam
2020 arXiv   pre-print
We discuss the practical deployment of local differential privacy and explore its application in various domains.  ...  It breaks the shackles of the trusted third party, and allows users to perturb their data locally, thus providing much stronger privacy protection.  ...  Privacy amplification: Privacy amplification refers to the strategy that enhances the privacy level with little or no effect on data utility.  ... 
arXiv:2008.03686v1 fatcat:l7z3gip2ivdmvin7lraxd4vciy

Post-Quantum Era Privacy Protection for Intelligent Infrastructures

Lukas Malina, Petr Dzurenda, Sara Ricci, Jan Hajny, Gautam Srivastava, Raimundas Matulevicius, Abasi-amefon O. Affia, Maryline Laurent, Nazatul Haque Sultan, Qiang Tang
2021 IEEE Access  
This in-depth survey begins with an overview of security and privacy threats in IoT/IIs.  ...  This paper also overviews how PETs can be deployed in practical use cases in the scope of IoT/IIs, and maps some current projects, pilots, and products that deal with PETs.  ...  PRIVACY-ENHANCING COMPUTATIONS AND DATA STORING 1) SECURE MULTI-PARTY COMPUTATIONS Secure Multi-party Computation (SMC) is a cryptographic problem in which n parties collaborate to compute a common value  ... 
doi:10.1109/access.2021.3062201 fatcat:kqcwwqjfjnds7bzlrid7r6gjlu
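
The n-party summation task described in the SMC snippet can be illustrated with pairwise additive masking: every pair of parties agrees on a random mask that one adds and the other subtracts, so all masks cancel in the total and only the common sum is learned. This is a minimal sketch under that assumption, not a protocol from the survey; the function name and modulus are illustrative.

```python
# n-party secure summation via pairwise additive masks that cancel in the total.
import itertools
import random

def masked_reports(secrets: list[int], modulus: int = 2**32) -> list[int]:
    n = len(secrets)
    reports = list(secrets)
    for i, j in itertools.combinations(range(n), 2):
        r = random.randrange(modulus)        # mask agreed between parties i and j
        reports[i] = (reports[i] + r) % modulus
        reports[j] = (reports[j] - r) % modulus
    return reports

if __name__ == "__main__":
    random.seed(3)
    salaries = [40, 55, 62, 48]              # each party's private input
    public = masked_reports(salaries)
    print("sum of masked reports:", sum(public) % 2**32)  # 40 + 55 + 62 + 48 = 205
```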
Showing results 1 — 15 out of 1,033 results