1,282 Hits in 5.3 sec

Rademacher and Gaussian Complexities: Risk Bounds and Structural Results [chapter]

Peter L. Bartlett, Shahar Mendelson
2001 Lecture Notes in Computer Science  
We investigate the use of certain data-dependent estimates of the complexity of a function class, called Rademacher and Gaussian complexities.  ...  We consider function classes that can be expressed as combinations of functions from basis classes and show how the Rademacher and Gaussian complexities of such a function class can be bounded in terms  ...  Thanks to Arthur Gretton, Jonathan Baxter and Gábor Lugosi for helpful discussions, to the anonymous reviewers for suggestions, and especially to the reviewer who suggested a substantially simpler proof  ... 
doi:10.1007/3-540-44581-1_15 fatcat:vh4ohkuy5vgv3mwhelpcssoppu
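
For readers who want the central quantity of these results in concrete terms, here is a minimal Monte Carlo sketch of the empirical Rademacher complexity of a finite function class. This is the standard definition only; the toy data and class size are invented, and nothing here reproduces the Bartlett–Mendelson bounds.

    import numpy as np

    def empirical_rademacher(fn_values, n_draws=2000, seed=0):
        # Monte Carlo estimate of R_hat(F) = E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ],
        # where fn_values[j, i] = f_j(x_i) for a finite class {f_1, ..., f_k}
        # evaluated on a fixed sample x_1, ..., x_n.
        rng = np.random.default_rng(seed)
        _, n = fn_values.shape
        total = 0.0
        for _ in range(n_draws):
            sigma = rng.choice([-1.0, 1.0], size=n)   # i.i.d. Rademacher signs
            total += np.max(fn_values @ sigma) / n    # sup over the finite class
        return total / n_draws

    # toy example: 50 random {-1,+1}-valued functions on a sample of size 200
    vals = np.sign(np.random.default_rng(1).standard_normal((50, 200)))
    print(empirical_rademacher(vals))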

Rademacher complexity of stationary sequences [article]

Daniel J. McDonald, Cosma Rohilla Shalizi
2017 arXiv   pre-print
Our proof and the result are simpler than previous analyses with dependent data or stochastic adversaries which use sequential Rademacher complexities rather than the expected Rademacher complexity for  ...  We also derive empirical Rademacher results without mixing assumptions, resulting in fully calculable upper bounds.  ...  We first present our main result, which bounds E_Z[Γ_n(H)] with the Rademacher complexity, and discuss its proof.  ... 
arXiv:1106.0730v2 fatcat:axiwkmen2rbgnaxfmzkdi6hej4

Discussion: Local Rademacher complexities and oracle inequalities in risk minimization

Peter L. Bartlett, Shahar Mendelson
2006 Annals of Statistics  
Discussion of "2004 IMS Medallion Lecture: Local Rademacher complexities and oracle inequalities in risk minimization" by V. Koltchinskii [arXiv:0708.0083]  ...  F r = {f ∈ F : P f = r}, DISCUSSION OF LOCAL RADEMACHER COMPLEXITIES 5  ...  We have seen that we can obtain a data-dependent version of the local Rademacher bounds that can be used as complexity penalties in model selection methods.  ... 
doi:10.1214/009053606000001028 fatcat:fy5scao32zgx7lah6ut5qasfjq

Hyperparameter Learning for Conditional Kernel Mean Embeddings with Rademacher Complexity Bounds [article]

Kelvin Hsu, Richard Nock, Fabio Ramos
2018 arXiv   pre-print
For conditional kernel mean embeddings with categorical targets and arbitrary inputs, we propose a hyperparameter learning framework based on Rademacher complexity bounds to prevent overfitting by balancing  ...  data fit against model complexity.  ...  Bache, K. and Lichman, M. (2013). UCI machine learning repository. Bartlett, P. L. and Mendelson, S. (2002). Rademacher and Gaussian complexities: Risk bounds and structural results.  ... 
arXiv:1809.00175v3 fatcat:3gp53t3x6bct3kavbuynpi4wc4

Semi-supervised Vector-valued Learning: From Theory to Algorithm [article]

Jian Li, Yong Liu, Weiping Wang
2021 arXiv   pre-print
Using local Rademacher complexity and unlabeled data, we derive novel data-dependent excess risk bounds for learning vector-valued functions in both the kernel space and linear space.  ...  Motivated by our theoretical analysis, we propose a unified framework for learning vector-valued functions, incorporating both local Rademacher complexity and Laplacian regularization.  ...  The framework combines the structural risk minimization (SRM) framework with two additional terms to bound local Rademacher complexity and makes use of unlabeled samples.  ... 
arXiv:1909.04883v3 fatcat:rlqh5hiaczggplnna7mcgvm4p4

Rademacher complexity and spin glasses: A link between the replica and statistical theories of learning [article]

Alia Abbara, Benjamin Aubin, Florent Krzakala, Lenka Zdeborová
2020 arXiv   pre-print
Statistical learning theory provides bounds on the generalization gap, using in particular the Vapnik-Chervonenkis dimension and the Rademacher complexity.  ...  Using this connection, one may reinterpret many results of the literature as rigorous Rademacher bounds in a variety of models in the high-dimensional statistics limit.  ...  The results for Rademacher variable y and with ϕ(z) = sign(z) are depicted in Fig. 1. Interestingly, the bounds on the Rademacher complexity also have consequences for spin glass physics.  ... 
arXiv:1912.02729v2 fatcat:2ps5xmzbjjg4ln7kol7ysekd34

On the Rademacher Complexity of Linear Hypothesis Sets [article]

Pranjal Awasthi, Natalie Frank, Mehryar Mohri
2020 arXiv   pre-print
We give both upper and lower bounds on the Rademacher complexity of these families and show that our bounds improve upon or match existing bounds, which are known only for 1 ≤ p ≤ 2.  ...  We present a tight analysis of the empirical Rademacher complexity of the family of linear hypothesis classes with weight vectors bounded in ℓ_p-norm for any p ≥ 1.  ...  Peter L. Bartlett and Shahar Mendelson. Rademacher and Gaussian complexities: Risk bounds and structural results.  ... 
arXiv:2007.11045v1 fatcat:cbekymbumvddrcmlsx5zn4kvzq
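
Restricting to the p = 2 case covered by the entry above, duality gives the empirical Rademacher complexity of an ℓ_2-norm-bounded linear class in closed form as (W/n)·E_σ‖Σ_i σ_i x_i‖_2, which can be checked numerically against the classical W‖X‖_F/n upper bound. A small sketch with made-up data (the variable names and sample sizes are illustrative only):

    import numpy as np

    def rademacher_l2_linear(X, W=1.0, n_draws=2000, seed=0):
        # For F = {x -> <w, x> : ||w||_2 <= W}, duality gives
        # R_hat_S(F) = (W/n) * E_sigma || sum_i sigma_i x_i ||_2,
        # estimated here by averaging over random sign vectors.
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
        return W * np.mean(np.linalg.norm(sigma @ X, axis=1)) / n

    X = np.random.default_rng(1).standard_normal((500, 20))
    mc_estimate = rademacher_l2_linear(X, W=1.0)
    frobenius_bound = np.linalg.norm(X) / X.shape[0]   # W * ||X||_F / n
    print(mc_estimate, frobenius_bound)                # the estimate sits below the bound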

A Distribution Dependent and Independent Complexity Analysis of Manifold Regularization [article]

Alexander Mey, Tom Viering, Marco Loog
2019 arXiv   pre-print
By viewing Manifold regularization as a kernel method we then derive Rademacher bounds which allow for a distribution-dependent analysis.  ...  Here, we derive sample complexity bounds based on pseudo-dimension for models that add a convex data-dependent regularization term to a supervised learning process, as is in particular done in Manifold  ...  For this we derive computationally feasible upper and lower bounds on the Rademacher complexity of MR.  ... 
arXiv:1906.06100v2 fatcat:k644o3i3e5fdtfht25k7ytthou

Robust learning and complexity dependent bounds for regularized problems [article]

Geoffrey Chinot
2019 arXiv   pre-print
We obtain bounds on the L_2-estimation error and the excess risk that depend on ϕ(f^*), where f^* is the minimizer of the risk over a class F.  ...  Results for the RERM are derived under weak assumptions on the outputs and a sub-Gaussian assumption on the class { (f-f^*)(X), f ∈ F }.  ...  Acknowledgements: I would like to thank Guillaume Lecué and Matthieu Lerasle for their valuable advice on this work.  ... 
arXiv:1902.02238v3 fatcat:qqegsnbtojhizncrqatp7tq5pq

The Statistical Complexity of Early-Stopped Mirror Descent [article]

Tomas Vaškevičius, Varun Kanade, Patrick Rebeschini
2020 arXiv   pre-print
By completing an inequality that characterizes convexity for the squared loss, we identify an intrinsic link between offset Rademacher complexities and potential-based convergence analysis of mirror descent  ...  We apply our theory to recover, in a clean and elegant manner via rather short proofs, some of the recent results in the implicit regularization literature, while also showing how to improve upon them  ...  Our main results, in a short and transparent way, yield bounds on the excess risk of the iterates of (both continuous-time and discrete-time) mirror descent using offset Rademacher complexities.  ... 
arXiv:2002.00189v2 fatcat:mue26qfbqvfa3exrxw5ymd6d4y
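
The iterates whose excess risk this entry bounds follow the standard mirror descent update. As a point of reference only (this is not the paper's analysis or setup), here is a minimal discrete-time sketch with the negative-entropy mirror map on the probability simplex, where "early stopping" simply means picking an intermediate iterate; the objective and step size are made up.

    import numpy as np

    def mirror_descent_simplex(grad, x0, eta=0.1, steps=100):
        # Discrete-time mirror descent with the negative-entropy mirror map
        # (exponentiated gradient) on the probability simplex. Returns the
        # whole iterate path so an early-stopping rule can pick any x_t.
        x = np.asarray(x0, dtype=float)
        path = [x.copy()]
        for _ in range(steps):
            x = x * np.exp(-eta * grad(x))   # dual step, then mirror back
            x = x / x.sum()                  # normalization onto the simplex
            path.append(x.copy())
        return path

    # toy quadratic objective f(x) = 0.5 * ||x - target||^2 on the simplex
    target = np.array([0.7, 0.2, 0.1])
    grad = lambda x: x - target
    iterates = mirror_descent_simplex(grad, x0=np.ones(3) / 3, eta=0.5, steps=50)
    print(iterates[10])   # an "early-stopped" iterate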

On the Sample Complexity of Predictive Sparse Coding [article]

Nishant A. Mehta, Alexander G. Gray
2012 arXiv   pre-print
and 2) the high- or infinite-dimensional setting, where only dimension-free bounds are useful.  ...  In the overcomplete setting, we present an estimation error bound that decays as Õ(√(dk/m)) with respect to d and k.  ...  Theorem 3 then essentially implies a bound on the Rademacher complexity of each smaller subclass. The union bound over [N] and the two ε-covers finishes the result.  ... 
arXiv:1202.4050v2 fatcat:efk6xoszizarxmvodtcngriqkm

Transductive Rademacher Complexity and its Applications

R. El-Yaniv, D. Pechyony
2009 The Journal of Artificial Intelligence Research  
Our technique is based on a novel general error bound for transduction in terms of transductive Rademacher complexity, together with a novel bounding technique for Rademacher averages for particular algorithms  ...  We develop a technique for deriving data-dependent error bounds for transductive learning algorithms based on transductive Rademacher complexity.  ...  We thank Yair Wiener for useful comments. In the bound (26) the meaning of R_{m+u}(B̃_{g,g̃(q)}) is as follows. For any q let A = g̃(q) and R_{m+u}(B̃_{g,g̃(q)}) = R_{m+u}(B̃_{g,A}).  ... 
doi:10.1613/jair.2587 fatcat:u562zo7i7vfijj3qppazmozmui

Majorizing Measures, Sequential Complexities, and Online Learning [article]

Adam Block, Yuval Dagan, Sasha Rakhlin
2021 arXiv   pre-print
We introduce the technique of generic chaining and majorizing measures for controlling sequential Rademacher complexity.  ...  Finally, we establish a tight contraction inequality for worst-case sequential Rademacher complexity.  ...  Rademacher and Gaussian complexities: Risk bounds and structural results. Journal of Machine Learning Research, 3(Nov), 463-482. Bartlett, Peter L, Long, Philip M, & Williamson, Robert C. 1996.  ... 
arXiv:2102.01729v1 fatcat:5vfwvxkbsrdftcb4zwnoluvmoq

Some Local Measures of Complexity of Convex Hulls and Generalization Bounds [article]

Olivier Bousquet, Vladimir Koltchinskii, Dmitry Panchenko
2004 arXiv   pre-print
We investigate measures of complexity of function classes based on continuity moduli of Gaussian and Rademacher processes.  ...  We also obtain new bounds on generalization error in terms of localized Rademacher complexities.  ...  We combine this result with some new bounds on the generalization error in function learning problems based on localized Rademacher complexities.  ... 
arXiv:math/0405340v1 fatcat:pcttxyyiefebfkizk2uwubx6u4

Risk Bounds and Rademacher Complexity in Batch Reinforcement Learning [article]

Yaqi Duan, Chi Jin, Zhiyuan Li
2021 arXiv   pre-print
However, with completeness assumptions, the excess risk of FQI and a minimax-style algorithm can again be bounded by the Rademacher complexity of the corresponding function classes. (3) Fast statistical  ...  Concretely, we view the Bellman error as a surrogate loss for the optimality gap and prove the following: (1) In the double sampling regime, the excess risk of the Empirical Risk Minimizer (ERM) is bounded by  ...  In the following Theorem 5.2, we upper bound the excess risk of the output of FQI in terms of Rademacher complexity. Theorem 5.2 (FQI, Rademacher complexity).  ... 
arXiv:2103.13883v1 fatcat:7opjsmiwcvewvmgdb3b36ewewu
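
Since the entry above treats the Bellman error as a surrogate loss and analyzes FQI through Rademacher complexity, a bare-bones sketch of a single fitted Q-iteration step with a linear function class may help fix ideas. The feature map, data format, and toy problem below are invented for illustration and are not the paper's setting.

    import numpy as np

    def fqi_step(phi, data, q_prev, gamma=0.99):
        # One fitted Q-iteration step for Q(s, a) = <theta, phi(s, a)>.
        # data holds (s, a, r, s_next, actions) transitions; q_prev is the
        # previous weight vector. Regression targets use the Bellman backup
        # r + gamma * max_a' Q_prev(s_next, a').
        features, targets = [], []
        for s, a, r, s_next, actions in data:
            backup = r + gamma * max(q_prev @ phi(s_next, ap) for ap in actions)
            features.append(phi(s, a))
            targets.append(backup)
        X, y = np.array(features), np.array(targets)
        # least-squares fit = empirical risk minimization of the squared Bellman surrogate
        theta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return theta

    # toy usage: a 2-state, 2-action problem with one-hot features
    phi = lambda s, a: np.eye(4)[2 * s + a]
    data = [(0, 0, 1.0, 1, [0, 1]), (1, 1, 0.0, 0, [0, 1])]
    print(fqi_step(phi, data, q_prev=np.zeros(4)))
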
Showing results 1 — 15 out of 1,282 results