Testing Disjointness of Private Datasets
[chapter]
2005
Lecture Notes in Computer Science
Two parties, say Alice and Bob, possess two sets of elements that belong to a universe of possible values and wish to test whether these sets are disjoint or not. ...
This problem has many applications in commercial settings where two mutually distrustful parties wish to decide with minimum possible disclosure whether there is any overlap between their private datasets ...
Naturally a private disjointness test is a special case of this problem; nevertheless, no efficient protocol construction of a private disjointness test that is entirely independent from generic secure ...
doi:10.1007/11507840_13
fatcat:mqeccr6ehbdazbnf4bdsbhyq2e
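The disjointness test described in this entry is, functionally, a single yes/no question about whether two private sets intersect. Purely as a point of reference, here is a minimal non-private sketch of the functionality such a protocol computes; the names alice_set and bob_set are hypothetical, and a real private disjointness test produces the same bit without ever pooling the raw sets in one place.

```python
# Non-private reference implementation of the disjointness functionality.
# A private disjointness test computes the same output bit while keeping
# each party's set hidden; this sketch only pins down the target output.

def disjointness(alice_set: set, bob_set: set) -> bool:
    """Return True iff the two sets share no element."""
    return len(alice_set & bob_set) == 0

if __name__ == "__main__":
    print(disjointness({1, 2, 3}, {4, 5}))   # True  (disjoint)
    print(disjointness({1, 2, 3}, {3, 5}))   # False (overlap on 3)
```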
Multi Party Distributed Private Matching, Set Disjointness and Cardinality of Set Intersection with Information Theoretic Security
[chapter]
2009
Lecture Notes in Computer Science
Previous solutions for Distributed Private Matching and Cardinality Set Intersection were cryptographically secure and the previous Set Disjointness solution, though information theoretically secure, is ...
In this paper, we focus on the specific problems of Private Matching, Set Disjointness and Cardinality Set Intersection in information theoretic settings. ...
This is an example of private set disjointness test. ...
doi:10.1007/978-3-642-10433-6_2
fatcat:myuw62dbtjckrbuic76ezm6br4
Differentially Private Random Decision Forests using Smooth Sensitivity
[article]
2017
arXiv
pre-print
We propose a new differentially-private decision forest algorithm that minimizes both the number of queries required, and the sensitivity of those queries. ...
To do so, we build an ensemble of random decision trees that avoids querying the private data except to find the majority class label in the leaf nodes. ...
Scope query f to x_t, where x_t is a disjoint subset of x = {x_1, ..., x_τ}; for all leaf nodes L = {L_i} in tree F_t, F = {F_1, ..., F_τ}: f(x_t ∩ L_i) ← scope f(x_t) with all the tests in ...
arXiv:1606.03572v3
fatcat:ydwi43754zgbzonwc4gh4bnhiu
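The abstract above notes that the forest queries the private data only for the majority class label in each leaf. A common way to privatize such a count query is to perturb the per-leaf label counts before taking the majority; the sketch below uses plain Laplace noise with a hypothetical budget epsilon, whereas the cited paper's contribution is a tighter smooth-sensitivity analysis of this step.

```python
import numpy as np

def noisy_majority_label(labels, num_classes, epsilon, rng=None):
    """Pick a leaf's class label from Laplace-perturbed counts.

    `labels` are the training labels falling into one leaf. This is a
    simplified stand-in for the smooth-sensitivity mechanism analysed
    in the paper, not the paper's exact algorithm.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=num_classes)
    return int(np.argmax(noisy))

# Example: labels observed in one leaf, 3 classes, epsilon = 1.0
print(noisy_majority_label([0, 0, 1, 2, 0], num_classes=3, epsilon=1.0))
```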
Separating Computational and Statistical Differential Privacy in the Client-Server Model
[chapter]
2016
Lecture Notes in Computer Science
... in the client-server model that can be efficiently performed with CDP, but is infeasible to perform with information-theoretic differential privacy. ...
Differential privacy is a mathematical definition of privacy for statistical data analysis. ...
We are grateful to an anonymous reviewer for pointing out that our original construction based on non-interactive witness indistinguishable proofs could be modified to accommodate 2-message proofs (zaps ...
doi:10.1007/978-3-662-53641-4_23
fatcat:yqkzdvdty5d3tds4v5mnup3mpy
Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning
[article]
2021
arXiv
pre-print
FedMatch improves upon naive combinations of federated learning and semi-supervised learning approaches with a new inter-client consistency loss and decomposition of the parameters for disjoint learning ...
Thus the private data at each client may be either partly labeled, or completely unlabeled with labeled data being available only at the server, which leads us to a new practical federated learning problem ...
We use the Fashion-MNIST dataset for this setting, and split Fashion-MNIST (70,000) into training (63,000), validation (3,500), and test (3,500) sets. ...
arXiv:2006.12097v3
fatcat:znubc5dbsbcqhaift6rjeeftuu
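FedMatch's inter-client consistency loss, as described in the snippet above, penalizes disagreement between a client's predictions on unlabeled data and the predictions of helper models received from other clients. A minimal sketch of such a loss follows, with hypothetical arrays p_client and p_helpers standing in for softmax outputs; the exact loss and the parameter decomposition in the paper are more involved.

```python
import numpy as np

def inter_client_consistency(p_client, p_helpers, eps=1e-12):
    """Average KL divergence between a client's predicted class
    distribution and those of helper models on the same unlabeled batch.

    p_client:  (batch, classes) softmax outputs of the local model
    p_helpers: list of (batch, classes) softmax outputs from helpers
    """
    loss = 0.0
    for p_h in p_helpers:
        # KL(p_helper || p_client), averaged over the batch
        kl = np.sum(p_h * (np.log(p_h + eps) - np.log(p_client + eps)), axis=1)
        loss += kl.mean()
    return loss / len(p_helpers)

# Toy example with 2 helpers, a batch of 2, and 3 classes
p = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
helpers = [p, np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1]])]
print(inter_client_consistency(p, helpers))
```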
Efficient Private Proximity Testing with GSM Location Sketches
[chapter]
2012
Lecture Notes in Computer Science
Due to the simplicity of private equality testing, our resulting scheme for location tag-based private proximity testing is several orders of magnitude more efficient than previous solutions. ...
A protocol for private proximity testing allows two mobile users communicating through an untrusted third party to test whether they are in close physical proximity without revealing any additional information ...
Private equality testing: Given a location sketch, phones can test for proximity using a private equality test on their sketches. ...
doi:10.1007/978-3-642-32946-3_7
fatcat:2cl54cd54zgqrhpf6uyuj2wx2m
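This entry reduces proximity testing to a private equality test on location sketches built from overheard GSM towers. As a rough, non-private illustration of the sketch-then-compare idea, the snippet below collapses a set of observed tower IDs into a short digest; the actual protocol compares such fingerprints with a cryptographic private equality test rather than in the clear, and the function name and digest length here are illustrative assumptions.

```python
import hashlib

def location_sketch(tower_ids):
    """Collapse the set of overheard cell-tower IDs into a short digest.

    Phones that observe the same towers produce the same sketch, so
    proximity testing reduces to an equality check on two sketches;
    the real protocol runs that check as a *private* equality test.
    """
    canonical = ",".join(str(t) for t in sorted(set(tower_ids)))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

a = location_sketch([101, 202, 303, 404])
b = location_sketch([404, 303, 202, 101])
print(a == b)  # True: same towers, same sketch
```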
New Directions in Distributed Deep Learning: Bringing the Network at Forefront of IoT Design
[article]
2020
arXiv
pre-print
challenges to large-scale adoption of deep learning at the edge: (i) Hardware-constrained IoT devices, (ii) Data security and privacy in the IoT era, and (iii) Lack of network-aware deep learning algorithms for ...
We then provide a unified view targeting three research directions that naturally emerge from the above challenges: (1) Federated learning for training deep networks, (2) Data-independent deployment of ...
Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. ...
arXiv:2008.10805v1
fatcat:kwdkpkt2rjcltbphdlraulqncq
Stochastic gradient descent with differentially private updates
2013
2013 IEEE Global Conference on Signal and Information Processing
In this paper, we derive differentially private versions of stochastic gradient descent, and test them empirically. ...
Differential privacy is a recent framework for computation on sensitive data, which has shown considerable promise in the regime of large datasets. ...
ACKNOWLEDGEMENT KC and SS would like to thank NIH U54-HL108460, the Hellman Foundation, and NSF IIS 1253942 for support. ...
doi:10.1109/globalsip.2013.6736861
dblp:conf/globalsip/SongCS13
fatcat:6hy5t2biwzcivdyrvxbhkcm2nm
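The entry above derives differentially private versions of stochastic gradient descent. A minimal sketch of one noisy update for logistic regression is given below; the clipping bound, the Laplace noise calibration, and the helper names are illustrative assumptions rather than the exact mechanism analysed in the paper.

```python
import numpy as np

def dp_sgd_step(w, X_batch, y_batch, lr, epsilon, clip=1.0, rng=None):
    """One simplified differentially private SGD step for logistic regression.

    Per-example gradients are norm-clipped to `clip`, averaged, and then
    perturbed with Laplace noise. Labels are assumed to be in {-1, +1}.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X_batch.shape
    grads = np.zeros((n, d))
    for i in range(n):
        margin = y_batch[i] * (X_batch[i] @ w)
        g = -y_batch[i] * X_batch[i] / (1.0 + np.exp(margin))
        norm = np.linalg.norm(g)
        grads[i] = g * min(1.0, clip / (norm + 1e-12))
    # Noise scale tied to the clipping bound and batch size
    # (simplified calibration; see the paper for the exact mechanism).
    noise = rng.laplace(scale=2.0 * clip / (n * epsilon), size=d)
    return w - lr * (grads.mean(axis=0) + noise)

# Toy usage with a random batch
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 5)), rng.choice([-1, 1], size=32)
w = dp_sgd_step(np.zeros(5), X, y, lr=0.1, epsilon=1.0, rng=rng)
print(w)
```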
Privacy-Preserving Gradient Boosting Decision Trees
2020
Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference
Sensitivity and privacy budget are two key design aspects for the effectiveness of differentially private models. ...
The Gradient Boosting Decision Tree (GBDT) is a popular machine learning model for various tasks in recent years. ...
The synthetic datasets are generated using scikit-learn (Pedregosa et al. 2011). We show test errors and RMSE (root mean square error) for the classification and regression tasks, respectively. ...
doi:10.1609/aaai.v34i01.5422
fatcat:xdhboypld5bnxcrtp3t56elmme
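Since this abstract highlights sensitivity and privacy budget as the key design aspects, a concrete (and much simplified) illustration is how a single GBDT leaf value could be released under a per-leaf budget: clip each example's gradient contribution and add Laplace noise scaled to the resulting sensitivity. The function and parameter names below are hypothetical and do not reflect the paper's exact algorithm or API.

```python
import numpy as np

def private_leaf_value(gradients, epsilon_leaf, g_max=1.0, lam=1.0, rng=None):
    """Release one GBDT leaf value under a per-leaf privacy budget.

    Each example's gradient is clipped to [-g_max, g_max], bounding how
    much one example can shift the leaf sum; Laplace noise calibrated to
    that bound is added before the regularized average (simplified).
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.clip(np.asarray(gradients, dtype=float), -g_max, g_max)
    noisy_sum = g.sum() + rng.laplace(scale=2.0 * g_max / epsilon_leaf)
    return -noisy_sum / (len(g) + lam)

print(private_leaf_value([0.3, -1.7, 0.8, 0.1], epsilon_leaf=0.5))
```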
DP-XGBoost: Private Machine Learning at Scale
[article]
2021
arXiv
pre-print
In this work we describe and implement a DP fork of a battle-tested ML model: XGBoost. ...
Our approach beats by a large margin previous attempts at the task, in terms of accuracy achieved for a given privacy budget. ...
By leveraging XGBoost capabilities to handle large-scale datasets and its integration with Spark we thus obtained an efficient method to train DP boosted trees for real-world applications. ...
arXiv:2110.12770v1
fatcat:g36xkh26mjantd63ocw6fqdvfa
Privacy-Preserving Gradient Boosting Decision Trees
[article]
2021
arXiv
pre-print
Sensitivity and privacy budget are two key design aspects for the effectiveness of differentially private models. ...
The Gradient Boosting Decision Tree (GBDT) is a popular machine learning model for various tasks in recent years. ...
The authors thank Xuejun Zhao for her discussion and Shun Zhang for his comments to improve the paper. ...
arXiv:1911.04209v4
fatcat:gctry2shgjby5dd73ouv7an2py
Heterogeneous Data-Aware Federated Learning
[article]
2020
arXiv
pre-print
With the industrialization of the FL framework, we identify several problems hampering its successful deployment, such as the presence of non-i.i.d. data, disjoint classes, signal multi-modality across datasets ...
Federated learning (FL) is an appealing concept to perform distributed training of Neural Networks (NN) while keeping data private. ...
We conducted tests with different settings of C ∈ {10, 20, 50} for the i.i.d. and non-i.i.d. settings and C ∈ {10, 20, 31} for the disjoint settings (since we are capped by the number of classes in the latter ...
arXiv:2011.06393v1
fatcat:u7wah6zsjja27m5qkcu62kruma
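This entry evaluates federated training under i.i.d., non-i.i.d., and disjoint-class splits across C clients. The baseline aggregation step common to all of those settings is federated averaging, sketched below with hypothetical per-client weight arrays; the paper's contribution lies in how it adapts training around such data heterogeneity, not in this vanilla step.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """FedAvg aggregation: average client model parameters,
    weighting each client by its number of local examples."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_weights))

# Three clients with differently sized (possibly disjoint-class) datasets
clients = [np.array([1.0, 2.0]), np.array([3.0, 0.0]), np.array([0.0, 1.0])]
sizes = [100, 50, 10]
print(federated_average(clients, sizes))
```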
A One-Pass Private Sketch for Most Machine Learning Tasks
[article]
2020
arXiv
pre-print
Inspired by recent progress toward general-purpose data release algorithms, we propose a private sketch, or small summary of the dataset, that supports a multitude of machine learning tasks including regression ...
We prove competitive error bounds for DP kernel density estimation. Existing methods for DP kernel density estimation scale poorly, often exponentially slower with an increase in dimensions. ...
We use the MATLAB code released by [26] and report the mean squared error (MSE) on a held-out test set in Figure 4. ...
arXiv:2006.09352v1
fatcat:btryq2ftjnad7jyasj7wqrx2oi
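For the private-sketch entry above, the simplest point of comparison for DP kernel density estimation is a noisy histogram: bin the data once, add Laplace noise to the counts, and answer density queries from the perturbed bins without touching the raw data again. The sketch below is that baseline, not the paper's construction; the function name and bin layout are assumptions.

```python
import numpy as np

def dp_histogram_density(data, bins, epsilon, rng=None):
    """One-pass DP density summary over a fixed 1-D binning.

    Each point falls into exactly one bin, so adding Laplace(1/epsilon)
    noise per bin count suffices; the noisy counts are then clipped at
    zero and normalized into bin masses that can be queried repeatedly.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts, edges = np.histogram(data, bins=bins)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    noisy = np.clip(noisy, 0, None)
    return noisy / max(noisy.sum(), 1e-12), edges

data = np.random.default_rng(1).normal(size=1000)
density, edges = dp_histogram_density(data, bins=20, epsilon=0.5)
print(density.round(3))
```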
EdgeAI: A Vision for Deep Learning in IoT Era
[article]
2019
arXiv
pre-print
Overcoming these challenges is crucial for rapid adoption of learning on IoT-devices in order to truly enable EdgeAI. ...
The significant computational requirements of deep learning present a major bottleneck for its large-scale adoption on hardware-constrained IoT-devices. ...
Then, for rapid adoption of deep learning at the edge, an important question is, how can we perform model compression without using the original, private or alternate datasets? ...
arXiv:1910.10356v1
fatcat:6df62csanbcldaf5q6y47wymt4
Privacy-preserving Decentralized Aggregation for Federated Learning
[article]
2020
arXiv
pre-print
Our secure aggregation protocol based on this novel group communication pattern design leads to an efficient algorithm for federated training with privacy guarantees. ...
We evaluate our federated training algorithm on image classification and next-word prediction applications over benchmark datasets with 9 and 15 distributed sites. ...
Fig. 3: Test Accuracy per Round for Training over Different Datasets with 9 and 15 Peers. ...
arXiv:2012.07183v2
fatcat:fazprhigvzeb3algcza2soyf3i
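The decentralized aggregation entry above relies on a secure aggregation protocol so that only the sum of model updates is ever revealed. One standard way to obtain that property is pairwise additive masking: every pair of clients agrees on a random mask that one adds and the other subtracts, so all masks cancel in the sum. The toy sketch below uses a shared seed per pair and makes no claim about the specific group communication pattern designed in the paper.

```python
import numpy as np

def masked_update(client_id, update, peer_ids, pair_seeds, dim):
    """Add pairwise cancelling masks to one client's model update.

    For each pair (i, j) with shared seed s_ij, client min(i, j) adds the
    mask and client max(i, j) subtracts it, so summing every client's
    masked update recovers the true sum of updates.
    """
    masked = update.astype(float).copy()
    for peer in peer_ids:
        if peer == client_id:
            continue
        seed = pair_seeds[tuple(sorted((client_id, peer)))]
        mask = np.random.default_rng(seed).normal(size=dim)
        masked += mask if client_id < peer else -mask
    return masked

# Three clients, pairwise shared seeds, 4-dimensional updates
rng = np.random.default_rng(7)
updates = {i: rng.normal(size=4) for i in range(3)}
seeds = {(i, j): 1000 * i + j for i in range(3) for j in range(i + 1, 3)}
masked = [masked_update(i, updates[i], range(3), seeds, dim=4) for i in range(3)]
print(np.allclose(sum(masked), sum(updates.values())))  # True: masks cancel
```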
Showing results 1 — 15 out of 5,263 results