
Data-dependent PAC-Bayes priors via differential privacy [article]

Gintare Karolina Dziugaite, Daniel M. Roy
2019 arXiv   pre-print
We show how an ϵ-differentially private data-dependent prior yields a valid PAC-Bayes bound, and then show how non-private mechanisms for choosing priors can also yield generalization bounds.  ...  The Probably Approximately Correct (PAC) Bayes framework (McAllester, 1999) can incorporate knowledge about the learning algorithm and (data) distribution through the use of distribution-dependent priors.  ... 
arXiv:1802.09583v2 fatcat:rgppt5chlrcflo7jpi2iizgp5y
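As background on the mechanism behind this result (a schematic reconstruction for intuition, not the paper's exact statement or constants): an ε-differentially private prior mechanism S ↦ P(S) has bounded approximate max-information over m i.i.d. samples, roughly

```latex
\[
  I_\infty^\beta\bigl(S;\,P(S)\bigr)
  \;\le\; \frac{\epsilon^2 m}{2} \;+\; \epsilon\,\sqrt{\frac{m}{2}\,\ln\frac{2}{\beta}},
\]
```

which lets a PAC-Bayes bound that holds for any fixed prior be transferred to the data-dependent prior P(S), at the cost of a correspondingly weakened confidence parameter.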

Entropy-SGD optimizes the prior of a PAC-Bayes bound: Generalization properties of Entropy-SGD and data-dependent priors [article]

Gintare Karolina Dziugaite, Daniel M. Roy
2019 arXiv   pre-print
In order to obtain a valid generalization bound, we rely on a result showing that data-dependent priors obtained by stochastic gradient Langevin dynamics (SGLD) yield valid PAC-Bayes bounds provided the  ...  Entropy-SGD works by optimizing the bound's prior, violating the hypothesis of the PAC-Bayes theorem that the prior is chosen independently of the data.  ...  We then introduce several existing learning bounds that use differential privacy, including the PAC-Bayes bounds outlined above that use data-dependent priors.  ... 
arXiv:1712.09376v3 fatcat:l3fssx5csbhedcrtl2ojaaznle

PAC-Bayes Learning Bounds for Sample-Dependent Priors

Pranjal Awasthi, Satyen Kale, Stefani Karp, Mehryar Mohri
2020 Neural Information Processing Systems  
We present a series of new PAC-Bayes learning guarantees for randomized algorithms with sample-dependent priors.  ...  We also provide a flexible framework for computing PAC-Bayes bounds, under certain stability assumptions on the sample-dependent priors, and show how to use this framework to give more refined bounds when  ...  We now provide our general PAC-Bayes bounds with sample-dependent priors.  ... 
dblp:conf/nips/AwasthiKKM20 fatcat:twba5zfr65fhdfptetooxqz27i

Self-Certifying Classification by Linearized Deep Assignment [article]

Bastian Boll, Alexander Zeilmann, Stefania Petra, Christoph Schnörr
2022 arXiv   pre-print
Building on the recent PAC-Bayes literature and data-dependent priors, this approach enables (i) to use risk bounds as training objectives for learning posterior distributions on the hypothesis space and  ...  We propose a novel class of deep stochastic predictors for classifying metric data on graphs within the PAC-Bayes risk certification paradigm.  ...  Unlike this work, we do not make use of differential privacy to account for sharing data between prior and posterior.  ... 
arXiv:2201.11162v2 fatcat:yzo4mlahf5gydfkj6qbhfkxuty

Tighter risk certificates for neural networks [article]

María Pérez-Ortiz and Omar Rivasplata and John Shawe-Taylor and Csaba Szepesvári
2021 arXiv   pre-print
We further experiment with different types of priors on the weights (both data-free and data-dependent priors) and neural network architectures.  ...  These two training objectives are derived from tight PAC-Bayes bounds.  ...  We rigorously study and illustrate 'PAC-Bayes with Backprop' (PBB), a generic strategy to derive (probabilistic) neural network training methods from PAC-Bayes bounds. 2.  ... 
arXiv:2007.12911v3 fatcat:efoankqx6vbwvdeh34k76mijdm
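Training objectives of the 'PAC-Bayes with Backprop' kind evaluate a bound of this general shape on Gaussian weight distributions. A minimal sketch, assuming diagonal Gaussians and a McAllester-style square-root bound with one common choice of the logarithmic term (function names are mine):

```python
import math

def kl_diag_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """KL(Q || P) for diagonal Gaussians Q = N(mu_q, diag(sigma_q^2)) and
    P = N(mu_p, diag(sigma_p^2)); sums the per-coordinate closed form."""
    kl = 0.0
    for mq, sq, mp, sp in zip(mu_q, sigma_q, mu_p, sigma_p):
        kl += math.log(sp / sq) + (sq ** 2 + (mq - mp) ** 2) / (2 * sp ** 2) - 0.5
    return kl

def mcallester_bound(emp_risk, kl, m, delta):
    """McAllester-style bound on the expected risk of the randomized
    predictor: holds with probability at least 1 - delta over an i.i.d.
    sample of size m (variants of the bound differ in constants)."""
    return emp_risk + math.sqrt((kl + math.log(2 * math.sqrt(m) / delta)) / (2 * m))

# A data-dependent prior centered nearer the learned posterior shrinks
# the KL term and hence the certificate:
loose = mcallester_bound(0.1, kl_diag_gaussians([1.0], [1.0], [0.0], [1.0]), 1000, 0.05)
tight = mcallester_bound(0.1, kl_diag_gaussians([1.0], [1.0], [0.9], [1.0]), 1000, 0.05)
```

Minimizing such a bound in the network weights' means and variances is what makes the risk certificate double as a training objective.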

On the role of data in PAC-Bayes bounds [article]

Gintare Karolina Dziugaite, Kyle Hsu, Waseem Gharbieh, Gabriel Arpino, Daniel M. Roy
2020 arXiv   pre-print
The dominant term in PAC-Bayes bounds is often the Kullback--Leibler divergence between the posterior and prior.  ...  For so-called linear PAC-Bayes risk bounds based on the empirical risk of a fixed posterior kernel, it is possible to minimize the expected value of the bound by choosing the prior to be the expected posterior  ...  Finally, we evaluate minimizing a PAC-Bayes bound with our data-dependent priors as a learning algorithm.  ... 
arXiv:2006.10929v2 fatcat:2nkrcd66efao3fwcmgsamuqug4
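The 'expected posterior' claim in the snippet rests on a standard decomposition of the expected KL term (stated here for intuition; the notation is mine, with Q_S the posterior learned from sample S):

```latex
\[
  \mathbb{E}_S\,\mathrm{KL}\!\left(Q_S \,\|\, P\right)
  \;=\; \mathbb{E}_S\,\mathrm{KL}\!\left(Q_S \,\|\, \bar{Q}\right)
  \;+\; \mathrm{KL}\!\left(\bar{Q} \,\|\, P\right),
  \qquad \bar{Q} \;=\; \mathbb{E}_S\!\left[Q_S\right],
\]
```

so the data-free prior minimizing the expected complexity term of a linear PAC-Bayes bound is exactly the expected posterior \(\bar{Q}\).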

Nonlinear Collaborative Scheme for Deep Neural Networks [article]

Hui-Ling Zhen, Xi Lin, Alan Z. Tang, Zhenhua Li, Qingfu Zhang, Sam Kwong
2018 arXiv   pre-print
gradient descent; (iii) tighter PAC-Bayes bound.  ...  To some extent, we bridge the gap between learning (i.e., minimizing the new objective function) and generalization (i.e., minimizing a PAC-Bayes bound) in the new scheme.  ...  Here, to show that we need not worry about whether the PAC-Bayes prior depends on the samples/observations, we introduce the definition of differential privacy  ... 
arXiv:1811.01316v1 fatcat:34dlmj23crc4hjszq44vwx6ti4
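For reference, the definition invoked in that snippet: a randomized mechanism M is ε-differentially private if, for all pairs of datasets S, S' differing in a single example and all measurable events B,

```latex
\[
  \Pr\bigl[\,M(S) \in B\,\bigr] \;\le\; e^{\epsilon}\,\Pr\bigl[\,M(S') \in B\,\bigr].
\]
```

Applied to a prior-selection mechanism S ↦ P(S), this is the property that makes data-dependent priors admissible in the PAC-Bayes bounds discussed above.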

Efficient hyperparameter optimization by way of PAC-Bayes bound minimization [article]

John J. Cherian, Andrew G. Taube, Robert T. McGibbon, Panagiotis Angelikopoulos, Guy Blanc, Michael Snarski, Daniel D. Richman, John L. Klepeis, David E. Shaw
2020 arXiv   pre-print
Here we present an alternative objective that is equivalent to a Probably Approximately Correct-Bayes (PAC-Bayes) bound on the expected out-of-sample error.  ...  Such methods often yield overfit models, however, leading to poor performance on unseen data.  ...  Data-dependent PAC-Bayes priors via differential privacy. In Proceedings of the 32nd International Conference on Neural Information Processing Systems, NIPS'18, pp. 8440–8450, Red Hook, NY, USA, 2018.  ... 
arXiv:2008.06431v1 fatcat:h5ftzr446jclrg5ibiz5tviwyq

PAC-Bayes Analysis Beyond the Usual Bounds [article]

Omar Rivasplata, Ilja Kuzborskij, Csaba Szepesvari, John Shawe-Taylor
2020 arXiv   pre-print
In this setting the unknown quantity of interest is the expected risk of the data-dependent randomized predictor, for which upper bounds can be derived via a PAC-Bayes analysis, leading to PAC-Bayes bounds  ...  We present three bounds that illustrate the use of data-dependent priors, including one for the unbounded square loss.  ...  Data-dependent PAC-Bayes priors via differential privacy. In Advances in Neural Information Processing Systems (NeurIPS), pages 8430-8441, 2018b. S. N. Ethier and T. G. Kurtz.  ... 
arXiv:2006.13057v3 fatcat:e3abu75fhjeyrgow344obqjv7m

A Limitation of the PAC-Bayes Framework [article]

Roi Livni, Shay Moran
2021 arXiv   pre-print
the PAC-Bayes bound is arbitrarily large.  ...  In this manuscript we present a limitation of the PAC-Bayes framework. We demonstrate an easy learning task that is not amenable to a PAC-Bayes analysis.  ...  Let us note that in the context of pure differential privacy, the connection between PAC-Bayes analysis and privacy has been established in [14].  ... 
arXiv:2006.13508v3 fatcat:xcl2qs3wzvezfd5yolr3enl5ee

Information Complexity and Generalization Bounds [article]

Pradeep Kr. Banerjee, Guido Montúfar
2021 arXiv   pre-print
Moreover, we obtain new bounds for data-dependent priors and unbounded loss functions.  ...  Optimizing the bounds gives rise to variants of the Gibbs algorithm, for which we discuss two practical examples for learning with neural networks, namely, Entropy- and PAC-Bayes-SGD.  ...  Differentially private data-dependent priors A PAC-Bayesian bound such as (3) stipulates that the prior Q be chosen before the draw of the training sample S.  ... 
arXiv:2105.01747v1 fatcat:e3uz3uvqgjfvtfoalec34kfzca

Higher-Order Generalization Bounds: Learning Deep Probabilistic Programs via PAC-Bayes Objectives [article]

Jonathan Warrell, Mark Gerstein
2022 arXiv   pre-print
Here, we offer a framework for representing and learning flexible PAC-Bayes bounds as stochastic programs using DPP-based methods.  ...  We test our framework using single- and multi-task generalization settings on synthetic and biological data, showing improved performance and generalization prediction using flexible DPP model representations  ...  complexity than a PAC-Bayesian data-dependent prior (we note that both methods had access to the same training/validation data during optimization).  ... 
arXiv:2203.15972v1 fatcat:dtwfrfys5ndxzpfp74rsnorlmi

User-friendly introduction to PAC-Bayes bounds [article]

Pierre Alquier
2021 arXiv   pre-print
Since the original PAC-Bayes bounds of D.  ...  Very recently, PAC-Bayes bounds have received considerable attention: for example, there was a workshop on PAC-Bayes at NIPS 2017, "(Almost) 50 Shades of Bayesian Learning: PAC-Bayesian trends and insights",  ...  Acknowledgements I learnt so much about PAC-Bayes bounds and related topics from my PhD advisor, all my co-authors, friends, students and twitter pals... that I will not even try to make a list.  ... 
arXiv:2110.11216v4 fatcat:ck4oiea6c5gd7ejpst37zwgcuu

Hypothesis Set Stability and Generalization [article]

Dylan J. Foster and Spencer Greenberg and Satyen Kale and Haipeng Luo and Mehryar Mohri and Karthik Sridharan
2020 arXiv   pre-print
We present a study of generalization for data-dependent hypothesis sets.  ...  Our main result is a generalization bound for data-dependent hypothesis sets expressed in terms of a notion of hypothesis set stability and a notion of Rademacher complexity for data-dependent hypothesis  ...  More closely related to this paper is the work of Dziugaite and Roy [2018a,b], who develop PAC-Bayes bounds by choosing the prior via a data-dependent differentially private mechanism, and also showed  ... 
arXiv:1904.04755v3 fatcat:q5g4q7mwnjdpllkhtbdu2lvgga

Model-Agnostic Private Learning via Stability [article]

Raef Bassily, Om Thakkar, Abhradeep Thakurta
2018 arXiv   pre-print
We design differentially private learning algorithms that are agnostic to the learning model.  ...  Instead of outputting a model based on the training data, they provide predictions for a set of m feature vectors that arrive online.  ...  In contrast, one would achieve a dependence of roughly √m by using the advanced composition property of differential privacy [DRV10].  ... 
arXiv:1803.05101v1 fatcat:q6ancdv5u5afjo52eq23zb7wsa
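The advanced composition property referenced in that snippet ([DRV10]) states, roughly: for any δ' > 0, the k-fold adaptive composition of (ε, δ)-differentially private mechanisms is (ε', kδ + δ')-differentially private, with

```latex
\[
  \epsilon' \;=\; \epsilon\,\sqrt{2k\,\ln\frac{1}{\delta'}} \;+\; k\,\epsilon\,\bigl(e^{\epsilon}-1\bigr),
\]
```

which is the source of the √k (here, √m) dependence mentioned in the snippet.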
Showing results 1 — 15 out of 162 results