A PAC-Bayes Bound for Tailored Density Estimation
[chapter]
2010
Lecture Notes in Computer Science
In this paper we construct a general method for reporting on the accuracy of density estimation. ...
Using variational methods from statistical learning theory we derive a PAC, algorithm-dependent bound on the distance between the data generating distribution and a learned approximation. ...
Conclusions and Future Work: In this paper we have derived a PAC-Bayes bound for density estimation with a loss function that allows us to tailor the density estimate to a function class of interest. ...
doi:10.1007/978-3-642-16108-7_15
fatcat:c745dpjybjdllkqhakzivtmsra
A Primer on PAC-Bayesian Learning
[article]
2019
arXiv
pre-print
The present paper aims at providing a self-contained survey on the resulting PAC-Bayes framework and some of its main theoretical and algorithmic developments. ...
Generalised Bayesian learning algorithms are increasingly popular in machine learning, due to their PAC generalisation properties and flexibility. ...
The author warmly thanks Omar Rivasplata for his careful reading and suggestions. ...
arXiv:1901.05353v3
fatcat:vy73fwwanvfofbp3azhsrdq5v4
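For reference, the McAllester-style inequality that this framework builds on can be stated as follows (the standard form for a loss bounded in [0, 1]; textbook background, not quoted from the survey). With probability at least 1 - \delta over an i.i.d. sample of size n, simultaneously for all posteriors \rho over hypotheses,

  \mathbb{E}_{h \sim \rho}[R(h)] \le \mathbb{E}_{h \sim \rho}[r_n(h)] + \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln(2\sqrt{n}/\delta)}{2n}},

where \pi is a data-independent prior, r_n the empirical risk and R the risk under the data-generating distribution.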
Risk bounds for aggregated shallow neural networks using Gaussian prior
[article]
2022
arXiv
pre-print
The main contribution is a precise nonasymptotic assessment of the estimation error appearing in the PAC-Bayes bound. ...
Combining bounds on estimation and approximation errors, we establish risk bounds that are sharp enough to lead to minimax rates of estimation over Sobolev smoothness classes. ...
We say then that f_n satisfies a PAC-Bayes inequality in expectation. ...
arXiv:2112.11086v2
fatcat:4vujmvoiffdivnuxbyn3sfa4u4
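As background for the estimator f_n mentioned in this entry, PAC-Bayes risk bounds of this type are usually stated for a Gibbs-posterior aggregate of networks under the Gaussian prior \pi; the generic form below is standard and given only for illustration, not quoted from the paper:

  \hat f_n = \int f_\theta \, \hat\rho_n(\mathrm{d}\theta), \qquad \hat\rho_n(\mathrm{d}\theta) \propto \exp\big(-\lambda \, r_n(\theta)\big)\, \pi(\mathrm{d}\theta),

with r_n(\theta) the empirical risk of the network f_\theta and \lambda > 0 an inverse-temperature parameter.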
Bayesian fractional posteriors
[article]
2016
arXiv
pre-print
Second, we derive a novel Bayesian oracle inequality based on a PAC-Bayes inequality in misspecified models. ...
We also illustrate the theory in Gaussian process regression and density estimation problems. ...
Many previous results on PAC-Bayes type inequalities are specifically tailored to classification (bounded loss, [11, 12, 51] ) or regression (squared loss, [16, 27, 39, 51] ) problems. ...
arXiv:1611.01125v2
fatcat:ucscj4we7ja55hvyrbkdy5efgq
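The fractional (\alpha-)posterior studied in this entry is obtained by tempering the likelihood in Bayes' rule; in its standard form,

  \pi_{n,\alpha}(\theta \mid x_{1:n}) \propto \pi(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\alpha}, \qquad \alpha \in (0, 1),

which recovers the ordinary posterior at \alpha = 1 and is the object to which the PAC-Bayes-style oracle inequalities are applied.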
Meta-Learning by Adjusting Priors Based on Extended PAC-Bayes Theory
[article]
2019
arXiv
pre-print
We present a framework for meta-learning that is based on generalization error bounds, allowing us to extend various PAC-Bayes bounds to meta-learning. ...
Learning takes place through the construction of a distribution over hypotheses based on the observed tasks, and its utilization for learning a new task. ...
ACKNOWLEDGMENTS We thank Asaf Cassel, Guy Tennenholtz, Baruch Epstein, Daniel Soudry, Elad Hoffer and Tom Zahavy for helpful discussions of this work, and the anonymous reviewers for their helpful comment ...
arXiv:1711.01244v8
fatcat:uvbegjw6erezjebbinjrrejzpe
PAC-Bayesian Generalisation Error Bounds for Gaussian Process Classification
2003
Journal of Machine Learning Research
In this paper, by applying the PAC-Bayesian theorem of McAllester (1999a), we prove distribution-free generalisation error bounds for a wide range of approximate Bayesian GP classification techniques. ...
As is shown in experiments on a real-world task, the bounds can be very tight for moderate training sample sizes. ...
PAC-Bayesian bound for Bayes classifiers: Very recently, Meir and Zhang (2003) obtained a strong PAC-Bayesian result for Bayes classifiers which can be written as expectations over a uniformly bounded ...
doi:10.1162/153244303765208386
fatcat:apqdzkzodjfwphsfjbm7mcw3ge
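To make the flavour of such bounds concrete, here is a minimal Python sketch that evaluates a McAllester-style PAC-Bayes bound when the posterior and prior over weights are both Gaussian, so the KL term has a closed form. It is an illustration only, not the GP-specific bound proved in the paper; the empirical risk, dimensions and prior below are made up.

import numpy as np

def gaussian_kl(mu_q, cov_q, mu_p, cov_p):
    """KL(N(mu_q, cov_q) || N(mu_p, cov_p)) between multivariate Gaussians."""
    d = mu_q.shape[0]
    cov_p_inv = np.linalg.inv(cov_p)
    diff = mu_p - mu_q
    return 0.5 * (np.trace(cov_p_inv @ cov_q) + diff @ cov_p_inv @ diff - d
                  + np.log(np.linalg.det(cov_p) / np.linalg.det(cov_q)))

def pac_bayes_bound(emp_risk, kl, n, delta=0.05):
    """McAllester-style bound on the expected (Gibbs) risk for a [0, 1]-valued loss."""
    return emp_risk + np.sqrt((kl + np.log(2 * np.sqrt(n) / delta)) / (2 * n))

# toy numbers: 2-D Gaussian posterior against a standard-normal prior
mu_q, cov_q = np.array([0.3, -0.1]), 0.5 * np.eye(2)
mu_p, cov_p = np.zeros(2), np.eye(2)
print(pac_bayes_bound(emp_risk=0.12, kl=gaussian_kl(mu_q, cov_q, mu_p, cov_p), n=1000))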
Editors' Introduction
[chapter]
2013
Lecture Notes in Computer Science
He formulates a general framework for a large class of learning algorithms for such languages and, using this framework, he reviews Angluin's classical LSTAR algorithm and compares it with various contemporary ...
He also studied the learnability of regular languages and context-free languages; a sample result, obtained in collaboration with Franck Thollard, is that the class of regular languages can be PAC-learned ...
In A PAC-Bayes Bound for Tailored Density Estimation, Matthew Higgs and John Shawe-Taylor consider the problem of density estimation with an unusual twist: they want their solution to be tailored to the ...
doi:10.1007/978-3-642-40935-6_1
fatcat:pchrsvhjezfbvh6dfplqhxhgcy
On some recent advances on high dimensional Bayesian statistics
2015
ESAIM Proceedings and Surveys
On the theoretical side, we describe some recent advances in Bayesian consistency for a nonparametric hidden Markov model as well as new PAC-Bayesian results for different models of high dimensional regression ...
After giving some brief motivations in a short introduction, we describe new advances in the understanding of Bayes posterior computation as well as theoretical contributions in nonparametric and high ...
The PAC-Bayesian paradigm: the PAC theory consists in deriving risk bounds on randomized estimators (see for example [Val84]). ...
doi:10.1051/proc/201551016
fatcat:ydqnd43mlrgk5je4hh7ywvfwd4
PAC-Bayes and Domain Adaptation
[article]
2018
arXiv
pre-print
a new tighter domain adaptation bound for the target risk. ...
We provide two main contributions in PAC-Bayesian theory for domain adaptation where the objective is to learn, from a source distribution, a well-performing majority vote on a different, but related, ...
In this scenario, one may estimate the values of β_q(T_X‖S_X), and even η_{T\S}, by using unsupervised density estimation methods. ...
arXiv:1707.05712v2
fatcat:sgvyai2sczavhb2kxda2qsoosi
Information-Theoretic Local Minima Characterization and Regularization
[article]
2020
arXiv
pre-print
We provide theoretical analysis including a generalization bound and empirically demonstrate the success of our approach in both capturing and improving the generalizability of DNNs. ...
Experiments are performed on CIFAR-10, CIFAR-100 and ImageNet for various network architectures. ...
We provide its theoretical analysis, primarily a generalization bound based on PAC-Bayes (McAllester, 1999b; a) . ...
arXiv:1911.08192v2
fatcat:7ufib6jxrjejpjpr5tas3wadoi
Bayes Point Machines
2000
Journal of Machine Learning Research
We suggest the Bayes point machine as a well-founded improvement which approximates the Bayes-optimal decision by the centre of mass of version space. ...
Kernel-classifiers comprise a powerful class of non-linear decision functions for binary classification. ...
Special thanks go to Patrick Haffner for pointing out the speed improvement by exploiting sparsity of the MNIST and USPS images and to Jun Liao for pointing out a mistake in Algorithm 2. ...
doi:10.1162/153244301753683717
fatcat:v6rvmmko2ffm5m4cu3wnkw3dyy
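The centre-of-mass idea in this entry can be sketched crudely in Python: run the perceptron over independently shuffled passes through a separable training set, treat each resulting unit weight vector as a rough sample from version space, and average them. This toy sketch only illustrates the concept; it is not the kernel billiard algorithm of the paper, and all names and constants are ad hoc.

import numpy as np

def perceptron(X, y, rng, max_epochs=100):
    """Perceptron with a randomly permuted pass order; returns a unit-norm weight vector."""
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for i in rng.permutation(len(y)):
            if y[i] * (X[i] @ w) <= 0:
                w += y[i] * X[i]
                mistakes += 1
        if mistakes == 0:
            break
    return w / (np.linalg.norm(w) + 1e-12)

def bayes_point(X, y, n_runs=50, seed=0):
    """Average of unit weight vectors from independent perceptron runs (crude centre of mass)."""
    rng = np.random.default_rng(seed)
    w_bp = np.mean([perceptron(X, y, rng) for _ in range(n_runs)], axis=0)
    return w_bp / np.linalg.norm(w_bp)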
Pessimistic Model-based Offline Reinforcement Learning under Partial Coverage
[article]
2021
arXiv
pre-print
., realizability in the function class), CPPO has a PAC guarantee with offline data only providing partial coverage, i.e., it can learn a policy that competes against any policy that is covered by the ...
; (2) factored MDP where the partial coverage condition is defined using density ratio based concentrability coefficients associated with individual factors. ...
Acknowledgement The authors would like to thank Nan Jiang, Tengyang Xie for valuable feedback. ...
arXiv:2107.06226v2
fatcat:samqjfn7crgmxaeqhf3yx3snbi
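For context, density-ratio-based concentrability coefficients of the kind mentioned in this entry are commonly written in the following generic single-policy form (notation illustrative, not taken from the paper):

  C^{\pi} = \sup_{s,a} \frac{d^{\pi}(s,a)}{\mu(s,a)},

where d^{\pi} is the discounted state-action occupancy measure of a comparator policy \pi and \mu is the distribution that generated the offline data; partial coverage means this quantity is finite only for some comparator policies.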
PAC-Bayes and Domain Adaptation
2019
Neurocomputing
domain adaptation bound for the target risk. ...
We provide two main contributions in PAC-Bayesian theory for domain adaptation where the objective is to learn, from a source distribution, a well-performing majority vote on a different, but related, ...
In this scenario, one may estimate the values of β_q(T_X‖S_X), and even η_{T\S}, by using unsupervised density estimation methods. ...
doi:10.1016/j.neucom.2019.10.105
fatcat:kxc2yfvawvhnnnef5bdwxxlif4
Rademacher Complexity Bounds for a Penalized Multi-class Semi-supervised Algorithm (Extended Abstract)
2018
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
We propose Rademacher complexity bounds for multi-class classifiers trained with a two-step semi-supervised model. ...
fixed threshold stands for clustering consistency. ...
On another level and under the PAC-Bayes setting, [Kääriäinen, 2005] showed that in the realizable case where the hypothesis set contains the Bayes classifier, the obtained excess risk bound takes the ...
doi:10.24963/ijcai.2018/800
dblp:conf/ijcai/MaximovAH18
fatcat:ey5qlhwnsfafrfyxqy447wu4o4
On Generalization Error Bounds of Noisy Gradient Methods for Non-Convex Learning
[article]
2020
arXiv
pre-print
We develop a new framework, termed Bayes-Stability, for proving algorithm-dependent generalization error bounds. ...
We obtain new generalization bounds for the continuous Langevin dynamic in this setting by developing a new Log-Sobolev inequality for the parameter distribution at any time. ...
We develop a new method for proving generalization bounds, termed as Bayes-Stability, by incorporating ideas from the PAC-Bayesian theory into the stability framework. ...
arXiv:1902.00621v4
fatcat:4jqqzwnq6ral3lci4zmmjbtutm
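The noisy gradient methods in this entry are of the gradient Langevin dynamics family; the following minimal Python sketch shows one generic SGLD update (unit inverse temperature, made-up quadratic loss), not the exact algorithm analysed in the paper.

import numpy as np

def sgld_step(w, grad, step_size, rng):
    """One stochastic gradient Langevin dynamics update: a gradient step
    perturbed by Gaussian noise of variance 2 * step_size."""
    noise = rng.normal(size=w.shape) * np.sqrt(2.0 * step_size)
    return w - step_size * grad + noise

# usage sketch on the gradient of ||w - w_star||^2
rng = np.random.default_rng(0)
w, w_star = np.zeros(3), np.array([1.0, -0.5, 0.2])
for _ in range(100):
    w = sgld_step(w, grad=2.0 * (w - w_star), step_size=1e-2, rng=rng)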
Showing results 1 — 15 out of 193 results