
Learning with User-Level Privacy [article]

Daniel Levy, Ziteng Sun, Kareem Amin, Satyen Kale, Alex Kulesza, Mehryar Mohri, Ananda Theertha Suresh
2021 arXiv   pre-print
We propose and analyze algorithms to solve a range of learning tasks under user-level differential privacy constraints.  ...  We show that for high-dimensional mean estimation, empirical risk minimization with smooth losses, stochastic convex optimization, and learning hypothesis classes with finite metric entropy, the privacy  ...  , PAC learning with user-level privacy [35], and bounding user contributions in ML models [6, 28].  ... 
arXiv:2102.11845v3 fatcat:6rtefzaombdkffbgo2l57dckl4

Enhancing the Privacy of Federated Learning with Sketching [article]

Zaoxing Liu, Tian Li, Virginia Smith, Vyas Sekar
2019 arXiv   pre-print
Federated learning methods run training tasks directly on user devices and do not share the raw user data with third parties.  ...  We evaluate the feasibility of sketching-based federated learning with a prototype on three representative learning models.  ...  Traditional federated learning (e.g., [31]) reaches this level as it does not share the users' raw data, by offloading the training process to user devices. • Level-2: Global privacy.  ... 
arXiv:1911.01812v1 fatcat:mquqgd2ykjepdidtf5bx4rkkpq

Interpretable Machine Learning for Privacy-Preserving Pervasive Systems [article]

Benjamin Baron, Mirco Musolesi
2019 arXiv   pre-print
In this paper, we propose a machine learning interpretability framework that enables users to understand how these generated traces violate their privacy.  ...  Our everyday interactions with pervasive systems generate traces that capture various aspects of human behavior and enable machine learning algorithms to extract latent information about users.  ...  Alternatively, aggregation methods might guarantee a certain level of anonymity with respect to the trace of other users.  ... 
arXiv:1710.08464v6 fatcat:fv66extdtzf65ofz7amyjwhdqq

dpUGC: Learn Differentially Private Representation for User Generated Contents [article]

Xuan-Son Vu, Son N. Tran, Lili Jiang
2019 arXiv   pre-print
Based on it, we propose a user-level approach to learn a personalized differentially private word embedding model on user generated contents (UGC).  ...  To the best of our knowledge, this is the first work on learning a user-level differentially private word embedding model from text for sharing.  ...  User-level word embedding learning: Given a collection of user-level data {D_1, ..., D_u, ..., D_k}, where each user-level data set D_u contains a number of documents about user u.  ... 
arXiv:1903.10453v1 fatcat:xjvrj6pecjgxpaihzqk3fx2x3a

Demonstration of KAPUER: A privacy policy manager on Android

Arnaud Oglaza, Romain Laborde, Pascale Zarate
2016 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC)  
The system includes an authorization recommendation engine that learns the user's privacy preferences and proposes high-level rules to protect their privacy from applications.  ...  ' privacy policies on Android.  ...  This decision allows KAPUER to learn the user's preferences in terms of privacy. Once it has learned preferences through some requests, KAPUER proposes high-level rules to the user.  ... 
doi:10.1109/ccnc.2016.7444781 dblp:conf/ccnc/OglazaLZ16 fatcat:y7rxhhgpgbbfler7o6sqre44xe

Mining Privacy Settings to Find Optimal Privacy-Utility Tradeoffs for Social Network Services

Shumin Guo, Keke Chen
2012 2012 International Conference on Privacy, Security, Risk and Trust and 2012 International Confernece on Social Computing  
We propose a framework for users to conveniently tune the privacy settings towards the user's desired privacy level and social utilities.  ...  It mines the privacy settings of a large number of users in a SNS, e.g., Facebook, to generate latent trait models for the level of privacy concern and the level of utility preference.  ...  We can also learn each user's level of privacy concern θ_i from the data. Figure 8 shows the distribution of the number of users according to the level of privacy concern.  ... 
doi:10.1109/socialcom-passat.2012.22 dblp:conf/socialcom/GuoC12 fatcat:r3ink7s7wjfdpo2lf6257ctnj4

Using students' tracking data in E-learning: Are we always aware of security and privacy concerns?

Madeth May, Sebastien George
2011 2011 IEEE 3rd International Conference on Communication Software and Networks  
This paper presents a study on security and privacy concerns in E-learning.  ...  While the study covers an analysis of some existing research data on security in E-learning and user privacy protection provisions, it helps us gain a broader perspective of utilizing the tracking approach  ...  learning activities with a high-level protection of user privacy.  ... 
doi:10.1109/iccsn.2011.6013764 fatcat:5zidusd7zjglpf55zaqmfboggy

Helping Users Managing Context-based Privacy Preference

Md. Zulfikar Alom, Barbara Carminati, Elena Ferrari
2019 Zenodo  
To cope with this issue, in this paper, we propose a context-based privacy management service that helps users manage their privacy preference settings under different contexts.  ...  Today, users interact with a variety of online services offered by different providers.  ...  Next, we compute the satisfaction level of the users regarding the privacy preference suggestions generated by both approaches using various learning strategies.  ... 
doi:10.5281/zenodo.4675585 fatcat:bvxrl5yndfhspazztsr7j57z5e

Adapting Users' Privacy Preferences in Smart Environments

Md. Zulfikar Alom, Barbara Carminati, Elena Ferrari
2019 Zenodo  
To this aim, we exploit machine learning algorithms to build a classifier, which is able to make decisions on future service requests, by learning which privacy preference components a user is prone to  ...  To cope with this challenge, in this paper, we propose a soft privacy matching mechanism, able to relax, in a controlled way, some conditions of users' privacy preferences so as to match with service providers  ...  They also used a supervised machine learning approach to learn users' privacy preferences by iteratively asking them questions regarding their sharing activities with friends.  ... 
doi:10.5281/zenodo.4675684 fatcat:t7x2uoprynbm3p5htfwobrrx24

Enhancing Differential Privacy for Federated Learning at Scale

Chunghun Baek, Sungwook Kim, Dongkyun Nam, Jihoon Park
2021 IEEE Access  
INDEX TERMS Differential privacy, federated learning, user dropout, noise calibration.  ...  We first observe that user dropouts in an FL network may lead to failure in achieving the desired level of privacy protection, i.e., over-consumption of the privacy budget.  ...  Therefore, the privacy guarantee needs to be ensured at the user level (user-level privacy).  ... 
doi:10.1109/access.2021.3124020 fatcat:7hfedfykqfcapfo3la572ewr7i

Federated Learning with Bayesian Differential Privacy [article]

Aleksei Triastcyn, Boi Faltings
2019 arXiv   pre-print
We consider the problem of reinforcing federated learning with formal privacy guarantees.  ...  budget below 1 at the client level, and below 0.1 at the instance level.  ...  FEDERATED LEARNING WITH BAYESIAN DIFFERENTIAL PRIVACY In this section, we adapt the Bayesian differential privacy framework and its accounting method to guarantee client-level privacy, the level most  ... 
arXiv:1911.10071v1 fatcat:4ysxxmat7rgnteiskb4cyfynki

Stronger Privacy for Federated Collaborative Filtering with Implicit Feedback [article]

Lorenzo Minto, Moritz Haller, Hamed Haddadi, Benjamin Livshits
2021 arXiv   pre-print
To address this shortcoming, we propose a practical federated recommender system for implicit data under user-level local differential privacy (LDP).  ...  Even on the full dataset, we show that it is possible to achieve reasonable utility with HR@10 > 0.5 without compromising user privacy.  ...  We will first show that individual gradient updates satisfy event-level ε-local differential privacy, yielding ε-local differential privacy at the user level.  ... 
arXiv:2105.03941v3 fatcat:l3gpsrcitzeuxj6ahvrcnfckhe

"What if?" Predicting Individual Users' Smart Home Privacy Preferences and Their Changes

Natã M. Barbosa, Joon S. Park, Yaxing Yao, Yang Wang
2019 Proceedings on Privacy Enhancing Technologies  
With this in mind, many developers market their products with a focus on privacy in order to gain user trust, yet privacy tensions arise with the growing adoption of these devices and the risk of inappropriate  ...  Therefore, it is important for developers to consider individual user preferences and how they would change under varying circumstances, in order to identify actionable steps towards developing user trust  ...  We acknowledge Daniel Acuña for his invaluable guidance on the development of our machine learning models, and the people of the SALT Lab in the School of Information Studies at Syracuse University.  ... 
doi:10.2478/popets-2019-0066 dblp:journals/popets/BarbosaPYW19 fatcat:25akvjvb7facbpvwm4xos22f2m

Learning to share: Engineering adaptive decision-support for online social networks

Yasmin Rafiq, Luke Dickens, Alessandra Russo, Arosha K. Bandara, Mu Yang, Avelie Stuart, Mark Levine, Gul Calikli, Blaine A. Price, Bashar Nuseibeh
2017 2017 32nd IEEE/ACM International Conference on Automated Software Engineering (ASE)  
This paper presents a learning to share approach that enables the incorporation of more nuanced privacy controls into OSNs.  ...  Some online social networks (OSNs) allow users to define friendship-groups as reusable shortcuts for sharing information with multiple contacts.  ...  value associated with a privacy breach (a known reshare) at sensitivity level l, and b > 0 controls how risk averse the user is (their risk posture).  ... 
doi:10.1109/ase.2017.8115641 dblp:conf/kbse/RafiqDRBYSLCPN17 fatcat:rpzwzd3ky5gehbls5yg72obvxm

PDMFRec

Erika Duriakova, Elias Z. Tragos, Barry Smyth, Neil Hurley, Francisco J. Peña, Panagiotis Symeonidis, James Geraci, Aonghus Lawlor
2019 Proceedings of the 13th ACM Conference on Recommender Systems - RecSys '19  
We demonstrate the effectiveness of this approach by considering different levels of user privacy in comparison with state-of-the-art alternatives.  ...  This approach introduces an increased risk when it comes to user privacy. In this short paper we propose an alternative, user-centric, privacy enhanced, decentralised approach to MF.  ...  We divide this experiment into two parts, with respect to the two levels of privacy discussed in Section 3.3.  ... 
doi:10.1145/3298689.3347035 dblp:conf/recsys/DuriakovaTSHPSG19 fatcat:awvm3m2jyng6fbcbgmyby2ai54
Showing results 1 — 15 out of 138,556 results