
Decision Tree Classification with Differential Privacy: A Survey [article]

Sam Fletcher, Md Zahidul Islam
2019 arXiv   pre-print
We analyze both greedy and random decision trees, and the conflicts that arise when trying to balance privacy requirements with the accuracy of the model.  ...  In this survey, we focus on one particular data mining algorithm -- decision trees -- and how differential privacy interacts with each of the components that constitute decision tree algorithms.  ...  Differential privacy has become the de-facto privacy standard around the world in recent years, with the U.S.  ... 
arXiv:1611.01919v2 fatcat:dyal7yucujgfffadqqaoyzfene
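
One of the simplest mechanisms this kind of survey covers is adding Laplace noise to the class counts in a decision-tree leaf before choosing its label. The sketch below is a generic illustration of that idea, not the survey's own algorithm; the epsilon value and counts are made up.

```python
import numpy as np

def noisy_majority_label(class_counts, epsilon, rng=np.random.default_rng()):
    """Pick a leaf's label from class counts perturbed with Laplace noise.

    A counting query has sensitivity 1 (adding/removing one record changes
    one count by 1), so Laplace(1/epsilon) noise per count gives
    epsilon-differential privacy for this labelling step.
    """
    counts = np.asarray(class_counts, dtype=float)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    return int(np.argmax(noisy))

# Toy usage: a leaf holding 30 records of class 0 and 25 of class 1.
print(noisy_majority_label([30, 25], epsilon=1.0))
```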

Differentially Private Actor and Its Eligibility Trace

Kanghyeon Seo, Jihoon Yang
2020 Electronics  
In this paper, we confirm the applicability of differential privacy methods to the actors updated using the policy gradient algorithm and discuss the advantages of such an approach with regard to differentially  ...  In addition, we measured the cosine similarity between the differentially private eligibility trace and the non-differentially private eligibility trace to analyze whether their anonymity is appropriately  ...  Wang and Hegde [13] suggested a differentially private Q-learning algorithm in a continuous space to preserve the privacy of the value function approximator by adding Gaussian process noise to the value  ... 
doi:10.3390/electronics9091486 fatcat:3gvviyvubrd4voe3uumkmxyhee
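
The snippet mentions perturbing an eligibility trace with noise and checking cosine similarity between the perturbed and original traces. A minimal sketch of that comparison follows; the trace values and Gaussian noise scale are placeholders, not the paper's settings.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
trace = rng.uniform(size=64)                 # stand-in eligibility trace
sigma = 0.5                                  # placeholder Gaussian noise scale
private_trace = trace + rng.normal(scale=sigma, size=trace.shape)

print(cosine_similarity(trace, private_trace))
```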

pMSE Mechanism: Differentially Private Synthetic Data with Maximal Distributional Similarity [article]

Joshua Snoke, Aleksandra Slavković
2018 arXiv   pre-print
We propose a method that maximizes the distributional similarity of the synthetic data relative to the original data using a measure known as the pMSE, while guaranteeing epsilon-differential privacy.  ...  We also give simulations for the accuracy of linear regression coefficients generated from the synthetic data compared with the accuracy of non-differentially private synthetic data and other differentially  ...  Differential Privacy Preliminaries Differential Privacy is a formal framework for quantifying the disclosure risk associated with the release of statistics or raw data derived from private input data  ... 
arXiv:1805.09392v1 fatcat:djlkzqzdvfecva23yy5jicuova
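
The pMSE (propensity-score mean squared error) measures how distinguishable synthetic rows are from original rows. Below is one common way to compute it, using a logistic-regression propensity model as a stand-in; treat it as an approximation of the measure named in the abstract, not the paper's exact procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pmse(original, synthetic):
    """Propensity-score mean squared error between two samples.

    Stack both samples, fit a classifier to predict 'is synthetic', and
    measure how far its propensity scores drift from the constant
    c = n_synthetic / N; a value near 0 means the samples look alike.
    """
    X = np.vstack([original, synthetic])
    y = np.concatenate([np.zeros(len(original)), np.ones(len(synthetic))])
    c = len(synthetic) / len(X)
    p = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    return float(np.mean((p - c) ** 2))

rng = np.random.default_rng(1)
orig = rng.normal(size=(500, 3))
synth = rng.normal(loc=0.1, size=(500, 3))   # toy "synthetic" sample
print(pmse(orig, synth))
```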

Multinomial Random Forest: Toward Consistency and Privacy-Preservation [article]

Yiming Li, Jiawang Bai, Jiawei Li, Xue Yang, Yong Jiang, Chun Li, Shutao Xia
2020 arXiv   pre-print
Theoretically, we prove the consistency of the proposed MRF and analyze its privacy-preservation within the framework of differential privacy.  ...  Instead of a deterministic greedy split rule or one with simple randomness, the MRF adopts two impurity-based multinomial distributions to randomly select a split feature and a split value, respectively.  ...  Since the distribution of X has a non-zero density, each node has a positive measure with respect to µ_X.  ... 
arXiv:1903.04003v3 fatcat:tzlzd2j575hznc2owyn7vadxm4
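
To illustrate the idea of replacing a greedy argmax split with an impurity-based multinomial draw, one can turn impurity decreases into sampling probabilities, for example with a softmax. This is a schematic of that idea, not the MRF's exact distribution; the temperature and impurity values are invented.

```python
import numpy as np

def sample_split(impurity_decreases, temperature=1.0, rng=np.random.default_rng()):
    """Sample a split index with probability increasing in its impurity decrease.

    A greedy tree would take argmax(impurity_decreases); drawing from a
    multinomial over candidates keeps good splits likely while adding the
    randomness that privacy and consistency analyses rely on.
    """
    scores = np.asarray(impurity_decreases, dtype=float) / temperature
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# Toy usage: Gini decreases for four candidate splits.
print(sample_split([0.02, 0.15, 0.11, 0.05]))
```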

A Practical Approach to Navigating the Tradeoff Between Privacy and Precise Utility [article]

Chandra Sharma, George Amariucai
2020 arXiv   pre-print
for the relative values that each user associates with their own privacy and utility.  ...  This paper discusses the intricacies of our utility model and the corresponding privacy-utility tradeoff, and introduces a heuristic greedy algorithm to solve the problem.  ...  If a variable is saturated, we ignore the variable for the current iteration and continue with the other variables.  ... 
arXiv:2003.04916v1 fatcat:2bl37k456vb3nnu4mizhbow7sm
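
The snippet only hints at the algorithm, so purely to illustrate the "skip saturated variables" idea it mentions, here is a generic greedy allocation loop; the objective, caps, and step size are invented for the sketch and are not the paper's formulation.

```python
def greedy_with_saturation(gains, caps, budget, step=1.0):
    """Repeatedly allocate `step` units to the variable with the best gain,
    skipping any variable that has reached its cap ("saturated")."""
    alloc = [0.0] * len(gains)
    while budget >= step:
        candidates = [i for i in range(len(gains)) if alloc[i] + step <= caps[i]]
        if not candidates:
            break
        best = max(candidates, key=lambda i: gains[i])
        alloc[best] += step
        budget -= step
    return alloc

print(greedy_with_saturation(gains=[3.0, 5.0, 1.0], caps=[2.0, 1.0, 4.0], budget=5.0))
```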

Differentially Private Decomposable Submodular Maximization [article]

Anamay Chaturvedi, Huy Nguyen, Lydia Zakynthinou
2020 arXiv   pre-print
We complement our theoretical bounds with experiments demonstrating empirical performance, which improves over the differentially private algorithms for the general case of submodular maximization and  ...  We extend this work by designing differentially private algorithms for both monotone and non-monotone decomposable submodular maximization under general matroid constraints, with competitive utility guarantees  ...  Algorithm 2 is an adaptation of the Measured Continuous Greedy algorithm introduced by Feldman et al. [2011].  ... 
arXiv:2005.14717v1 fatcat:5eyoews3pfgu5bda6rn3wmpktu
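
A common building block in this line of work is a greedy selection in which each step picks an element via the exponential mechanism over marginal gains. The sketch below shows that textbook recipe, not the paper's Algorithm 2 (which adapts measured continuous greedy); the coverage objective and sensitivity value are assumptions for the example.

```python
import numpy as np

def dp_greedy_submodular(ground_set, marginal_gain, k, epsilon, sensitivity,
                         rng=np.random.default_rng()):
    """Pick k elements, choosing each via the exponential mechanism on marginal gains.

    Splitting the budget as epsilon / k per step and scoring candidates by
    exp(eps_step * gain / (2 * sensitivity)) gives epsilon-DP overall by
    basic composition (sketch only).
    """
    chosen, eps_step = [], epsilon / k
    for _ in range(k):
        rest = [e for e in ground_set if e not in chosen]
        gains = np.array([marginal_gain(chosen, e) for e in rest], dtype=float)
        scores = eps_step * gains / (2.0 * sensitivity)
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        chosen.append(rest[rng.choice(len(rest), p=probs)])
    return chosen

# Toy coverage objective: an element's marginal gain is how many new items it covers.
covers = {0: {1, 2}, 1: {2, 3, 4}, 2: {5}, 3: {1, 5, 6}}
def gain(selected, e):
    already = set().union(*(covers[s] for s in selected)) if selected else set()
    return len(covers[e] - already)

print(dp_greedy_submodular(list(covers), gain, k=2, epsilon=1.0, sensitivity=1.0))
```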

An Efficient Dummy-Based Location Privacy-Preserving Scheme for Internet of Things Services

Yongwen Du, Gang Cai, Xuejun Zhang, Ting Liu, Jinghua Jiang
2019 Information  
Specifically, the Enhanced-DLP adopts an improved greedy scheme to efficiently select dummy locations to form a k-anonymous set.  ...  To protect user privacy, a variety of location privacy-preserving schemes have been recently proposed.  ...  [13] proposed a context-aware location privacy-preserving solution with differential perturbation, which can enhance the user's location privacy without requiring a trusted third party (TTP).  ... 
doi:10.3390/info10090278 fatcat:vuf2e6w7mzd5fetj4tuhjaggvq
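
A rough sketch of a greedy dummy-selection step in the spirit described above (not the Enhanced-DLP algorithm itself): given historical query probabilities per cell, repeatedly add the candidate that maximizes the entropy of the k-anonymous set. The grid and probabilities are toy values.

```python
import math

def entropy(probs):
    """Shannon entropy of a (renormalized) probability list."""
    total = sum(probs)
    return -sum((p / total) * math.log2(p / total) for p in probs if p > 0)

def greedy_dummies(real_cell, query_prob, k):
    """Pick k-1 dummy cells so the real cell hides in a high-entropy set."""
    chosen = [real_cell]
    candidates = set(query_prob) - {real_cell}
    while len(chosen) < k and candidates:
        best = max(candidates,
                   key=lambda c: entropy([query_prob[x] for x in chosen + [c]]))
        chosen.append(best)
        candidates.remove(best)
    return chosen

# Toy grid: per-cell historical query probabilities.
query_prob = {"a": 0.30, "b": 0.28, "c": 0.02, "d": 0.25, "e": 0.15}
print(greedy_dummies("b", query_prob, k=3))
```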

A Practical Framework for Privacy-Preserving Data Analytics

Liyue Fan, Hongxia Jin
2015 Proceedings of the 24th International Conference on World Wide Web - WWW '15  
To alleviate the high perturbation errors introduced by the differential privacy mechanism, we present two methods with different sampling techniques to draw a subset of individual data for analysis.  ...  In this paper, we propose a practical framework for data analytics, while providing differential privacy guarantees to individual data contributors.  ...  The (ε, δ)-probabilistic differential privacy [19] achieves ε-differential privacy with high probability, i.e., ≥ (1 − δ).  ... 
doi:10.1145/2736277.2741122 dblp:conf/www/FanJ15 fatcat:ba74x3s6ergw7bjpv4bps3lm2m
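
The core idea in the snippet, sampling a subset of individuals before applying a differentially private perturbation, can be sketched as follows. The Laplace mechanism on a bounded mean is my stand-in for the paper's analytics tasks, and the sensitivity accounting is deliberately simplified; the sampling rate, epsilon, and value range are placeholders.

```python
import numpy as np

def sampled_dp_mean(values, sample_rate, epsilon, value_range,
                    rng=np.random.default_rng()):
    """Draw a Bernoulli subsample, then release its mean via the Laplace mechanism.

    Sampling reduces how many individuals are touched at all; the noise is then
    calibrated to the (simplified) sensitivity of a bounded mean on the subsample.
    """
    sample = np.array([v for v in values if rng.random() < sample_rate])
    if sample.size == 0:
        return None
    lo, hi = value_range
    sensitivity = (hi - lo) / sample.size
    return float(np.clip(sample, lo, hi).mean()
                 + rng.laplace(scale=sensitivity / epsilon))

ages = np.random.default_rng(2).integers(18, 90, size=1000)
print(sampled_dp_mean(ages, sample_rate=0.2, epsilon=0.5, value_range=(18, 90)))
```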

What Does The Crowd Say About You? Evaluating Aggregation-based Location Privacy

Apostolos Pyrgelis, Carmela Troncoso, Emiliano De Cristofaro
2017 Proceedings on Privacy Enhancing Technologies  
We then use the framework to quantify the privacy loss stemming from aggregate location data, with and without the protection of differential privacy, using two real-world mobility datasets.  ...  Furthermore, to bound information leakage from the aggregates, they can perturb the input of the aggregation or its output to ensure that these are differentially private.  ...  [40] combine encryption with data randomization to achieve differential privacy for time-series data.  ... 
doi:10.1515/popets-2017-0043 dblp:journals/popets/PyrgelisTC17 fatcat:n6o6zhewrzbxfltqrvsg65oubm
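
To make concrete what "perturbing the output of the aggregation" means here, below is a minimal sketch of output perturbation on per-location visit counts. The epsilon, contribution bound, and counts are illustrative, not the values evaluated in the paper.

```python
import numpy as np

def dp_location_aggregate(visit_counts, epsilon, max_locations_per_user=1,
                          rng=np.random.default_rng()):
    """Release noisy per-location counts.

    If each user contributes to at most `max_locations_per_user` cells per
    epoch, that bound is the L1 sensitivity of the count vector, and adding
    Laplace(sensitivity / epsilon) noise to every cell gives epsilon-DP.
    """
    counts = np.asarray(visit_counts, dtype=float)
    noise = rng.laplace(scale=max_locations_per_user / epsilon, size=counts.shape)
    return np.maximum(counts + noise, 0.0)   # clip negatives for readability

print(dp_location_aggregate([120, 40, 3, 0, 77], epsilon=1.0))
```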

What Does The Crowd Say About You? Evaluating Aggregation-based Location Privacy [article]

Apostolos Pyrgelis, Carmela Troncoso, Emiliano De Cristofaro
2017 arXiv   pre-print
We then use the framework to quantify the privacy loss stemming from aggregate location data, with and without the protection of differential privacy, using two real-world mobility datasets.  ...  Furthermore, to bound information leakage from the aggregates, they can perturb the input of the aggregation or its output to ensure that these are differentially private.  ...  [40] combine encryption with data randomization to achieve differential privacy for time-series data.  ... 
arXiv:1703.00366v3 fatcat:ucvjfhgi7rghheuoptcmiq3o2m

DP-XGBoost: Private Machine Learning at Scale [article]

Nicolas Grislain, Joan Gonzalvez
2021 arXiv   pre-print
There have been many works on practical systems to compute statistical queries with Differential Privacy (DP).  ...  One of the many reasons people and organizations did not share as much as expected is the privacy risk associated with data-sharing operations.  ...  Indeed, differential privacy provides a theoretical framework for measuring and bounding the privacy loss occurring when one publishes an analysis on a private dataset.  ... 
arXiv:2110.12770v1 fatcat:g36xkh26mjantd63ocw6fqdvfa
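
Gradient-boosted trees pick splits from histograms of gradient sums, and the usual way to make that step private is to noise those histograms. The sketch below illustrates that generic idea, not DP-XGBoost's actual implementation; the clipping bound, bin edges, and epsilon are assumptions.

```python
import numpy as np

def noisy_gradient_histogram(feature_values, gradients, bin_edges,
                             epsilon, gradient_bound,
                             rng=np.random.default_rng()):
    """Per-bin gradient sums with Laplace noise.

    Gradients are clipped to [-gradient_bound, gradient_bound], so one record
    changes any bin sum by at most `gradient_bound`; Laplace noise with scale
    gradient_bound / epsilon then protects the split-finding statistics.
    """
    g = np.clip(gradients, -gradient_bound, gradient_bound)
    bins = np.digitize(feature_values, bin_edges)
    hist = np.zeros(len(bin_edges) + 1)
    np.add.at(hist, bins, g)
    return hist + rng.laplace(scale=gradient_bound / epsilon, size=hist.shape)

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=200)
grad = rng.normal(size=200)
print(noisy_gradient_histogram(x, grad, bin_edges=np.linspace(0, 10, 9),
                               epsilon=1.0, gradient_bound=1.0))
```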

Enhancement of k-anonymity algorithm for privacy preservation in social media

Aanchal Sharma, Sudhir Pathak
2018 International Journal of Engineering & Technology  
With the knowledge obtained from the users of social networks, attackers can easily attack the privacy of several victims.  ...  The traditional research on privacy-protected data publishing can only deal with relational data and cannot even be applied to social-network data.  ...  [23] proposed 'Greedy k-member' along with a 'Systematic clustering' approach to reduce the loss of information.  ... 
doi:10.14419/ijet.v7i2.27.11747 fatcat:y6f2ovbux5h3rghzzs6bxg2haq
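
The 'Greedy k-member' idea cited here builds k-anonymous groups by growing one cluster at a time. Below is a compressed sketch on numeric quasi-identifiers; the Euclidean distance and the way leftovers are handled are my simplifications, not the cited paper's exact information-loss metric.

```python
import numpy as np

def greedy_k_member(records, k):
    """Partition records into clusters of size >= k, greedily minimizing spread.

    Each cluster starts from the record farthest from the previous cluster's
    centroid and repeatedly absorbs the nearest remaining record until it
    holds k members; leftover records join their nearest cluster.
    """
    X = np.asarray(records, dtype=float)
    remaining = list(range(len(records)))
    clusters, seed = [], remaining[0]
    while len(remaining) >= k:
        cluster = [seed]
        remaining.remove(seed)
        while len(cluster) < k:
            centroid = X[cluster].mean(axis=0)
            nearest = min(remaining, key=lambda i: np.linalg.norm(X[i] - centroid))
            cluster.append(nearest)
            remaining.remove(nearest)
        clusters.append(cluster)
        if remaining:
            centroid = X[cluster].mean(axis=0)
            seed = max(remaining, key=lambda i: np.linalg.norm(X[i] - centroid))
    for i in remaining:   # attach any leftover records to the closest cluster
        best = min(clusters, key=lambda c: np.linalg.norm(X[i] - X[c].mean(axis=0)))
        best.append(i)
    return clusters

print(greedy_k_member([[25, 50], [27, 52], [40, 90], [41, 88], [60, 10]], k=2))
```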

Task Allocation in Mobile Crowd Sensing: State of the Art and Future Opportunities [article]

Jiangtao Wang, Leye Wang, Yasha Wang, Daqing Zhang, Linghe Kong
2018 arXiv   pre-print
Mobile Crowd Sensing (MCS) is a special case of crowdsourcing that leverages smartphones with various embedded sensors and users' mobility to sense diverse phenomena in a city.  ...  first present the unique features of MCS allocation compared to generic crowdsourcing, and then provide a comprehensive review of the diverse problem formulations and allocation algorithms together with  ...  In other words, if an adversary knows in advance that a user has a probability of P of being at a location L, then with ε-differential-privacy protection the adversary's confidence that the user is at L will not  ... 
arXiv:1805.08418v2 fatcat:m7ajfuob6rd7hdxsejsrm5rwmu
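
The truncated sentence points at the standard Bayesian reading of ε-DP: any output's likelihood ratio between "user at L" and "user not at L" is at most e^ε, so the adversary's posterior is bounded in terms of the prior P. The small check below is my formulation of that bound, not necessarily the survey's exact statement.

```python
import math

def posterior_bound(prior, epsilon):
    """Upper bound on the adversary's posterior belief that the user is at L.

    Under an epsilon-DP mechanism the likelihood ratio of any output is at
    most e^epsilon, so by Bayes' rule
    posterior <= prior * e^eps / (prior * e^eps + 1 - prior).
    """
    ratio = math.exp(epsilon)
    return prior * ratio / (prior * ratio + (1.0 - prior))

print(posterior_bound(prior=0.3, epsilon=0.5))   # ~0.41: a bounded gain over the 0.3 prior
```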

Privacy-Preserving Schema Reuse [chapter]

Nguyen Quoc Viet Hung, Do Son Thanh, Nguyen Thanh Tam, Karl Aberer
2014 Lecture Notes in Computer Science  
Instead of showing original schemas, the framework returns an anonymized schema with maximal utility while satisfying these privacy constraints.  ...  Addressing this problem, we develop a framework that enables privacy-preserving schema reuse.  ...  In this experiment, we study the proposed algorithm with two heuristics: without lookahead (Greedy) and with lookahead (Lookahead).  ... 
doi:10.1007/978-3-319-05813-9_16 fatcat:bqcvundquvenhcesi7kx2enfri
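
To make the "Greedy" versus "Lookahead" distinction concrete in generic terms (the actual objective here is anonymized-schema utility, which the snippet does not spell out), the difference is whether each step scores candidates by their immediate gain or by the immediate gain plus the best gain achievable one step further. The gain tables below are invented for illustration.

```python
def greedy_pick(candidates, gain):
    """Plain greedy: choose the candidate with the best immediate gain."""
    return max(candidates, key=gain)

def lookahead_pick(candidates, gain, next_gain):
    """One-step lookahead: score = immediate gain + best follow-up gain."""
    def score(c):
        rest = [x for x in candidates if x != c]
        return gain(c) + (max(next_gain(c, x) for x in rest) if rest else 0.0)
    return max(candidates, key=score)

gains = {"A": 3.0, "B": 2.0, "C": 2.5}
pair = {("A", "B"): 0.5, ("A", "C"): 0.4, ("B", "A"): 3.0, ("B", "C"): 2.0,
        ("C", "A"): 1.0, ("C", "B"): 1.5}
print(greedy_pick(list(gains), gains.get))                               # "A"
print(lookahead_pick(list(gains), gains.get, lambda c, x: pair[(c, x)]))  # "B"
```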

Towards user-oriented privacy for recommender system data: A personalization-based approach to gender obfuscation for user profiles

Manel Slokom, Alan Hanjalic, Martha Larson
2021 Information Processing & Management  
PerBlur is formulated within a user-oriented paradigm of recommender system data privacy that aims at making privacy solutions understandable, unobtrusive, and useful for the user.  ...  In this paper, we propose a new privacy solution for the data used to train a recommender system, i.e., the user-item matrix.  ...  For this reason, next in Table 6 , we omit PerBlur with random removal and we continue with PerBlur with no removal and PerBlur with greedy removal.  ... 
doi:10.1016/j.ipm.2021.102722 fatcat:o6szbxzukzdovpgomg77zdcqcu
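
The comparison referenced in the snippet is between leaving profiles as-is, removing random items, and removing the items that most strongly indicate the protected attribute. Below is a toy sketch of that "greedy removal" step only; the indicativeness scores are invented, and PerBlur's complementary step of adding blurring items is omitted.

```python
def greedy_removal(profile_items, indicativeness, n_remove):
    """Drop the n_remove items whose scores most strongly reveal the
    protected attribute (e.g. gender), keeping the rest of the profile."""
    ranked = sorted(profile_items, key=lambda i: indicativeness.get(i, 0.0),
                    reverse=True)
    to_drop = set(ranked[:n_remove])
    return [i for i in profile_items if i not in to_drop]

profile = ["item_12", "item_7", "item_99", "item_3"]
scores = {"item_12": 0.9, "item_7": 0.1, "item_99": 0.6, "item_3": 0.2}
print(greedy_removal(profile, scores, n_remove=2))   # keeps item_7 and item_3
```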
Showing results 1 — 15 out of 4,532 results