141 Hits in 4.6 sec

Fat-shattering and the learnability of real-valued functions

Peter L. Bartlett, Philip M. Long, Robert C. Williamson
1994 Proceedings of the seventh annual conference on Computational learning theory - COLT '94  
, the fat-shattering function, introduced by Kearns and Schapire.  ...  We consider the problem of learning real-valued functions from random examples when the function values are corrupted with noise.  ...  agnostic learnability of a class F of real-valued functions.  ... 
doi:10.1145/180139.181158 dblp:conf/colt/BartlettLW94 fatcat:5klfi6ingjfbna5x7idh7ufooy

Fat-Shattering and the Learnability of Real-Valued Functions

Peter L. Bartlett, Philip M. Long, Robert C. Williamson
1996 Journal of computer and system sciences (Print)  
, the fat-shattering function, introduced by Kearns and Schapire.  ...  We consider the problem of learning real-valued functions from random examples when the function values are corrupted with noise.  ...  agnostic learnability of a class F of real-valued functions.  ... 
doi:10.1006/jcss.1996.0033 fatcat:jf5t2kakt5cy3b3xdhb5jibrf4

Generalization of Elman networks [chapter]

Barbara Hammer
1997 Lecture Notes in Computer Science  
Here, we find constructions leading to lower bounds for the fat shattering dimension that are linear resp. of order log 2 in the input length even in the case of limited weights and inputs.  ...  Since finiteness of this magnitude is equivalent to learnability, there is no a priori guarantee for the generalization capability of Elman networks.  ...  1; 1], an equivalent condition to learnability is the finiteness of the fat shattering dimension: The ε-fat shattering dimension fat_ε(F) is the largest size of a set S = {x_1, ..., x_n} such that reference  ... 
doi:10.1007/bfb0020189 fatcat:43bzgimw25bgda3najw775cemi

PAC learnability under non-atomic measures: a problem by Vidyasagar [article]

Vladimir Pestov
2012 arXiv   pre-print
Similar results are obtained for function learning in terms of fat-shattering dimension modulo countable sets, but, just like in the classical distribution-free case, the finiteness of this parameter is  ...  The uniform Glivenko--Cantelli property with respect to non-atomic measures is no longer a necessary condition, and consistent learnability cannot in general be expected.  ...  Acknowledgements The author is most grateful to two anonymous referees for their thorough reading of the paper and numerous useful suggestions which have helped to improve the presentation considerably  ... 
arXiv:1105.5669v3 fatcat:hzakh5nqqvhlrcl5rbbmvtvdwm

On efficient agnostic learning of linear combinations of basis functions

Wee Sun Lee, Peter L. Bartlett, Robert C. Williamson
1995 Proceedings of the eighth annual conference on Computational learning theory - COLT '95  
We also show that the sample complexity for learning the linear combinations grows polynomially if and only if a combinatorial property of the class of basis functions, called the fat-shattering function  ...  With the quadratic loss function, we show that the class of linear combinations of a set of basis functions is efficiently agnostically learnable if and only if the class of basis functions is efficiently  ...  Acknowledgements We would like to thank Pascal Koiran for numerous discussions which resulted in substantial improvements in several of the results in this paper.  ... 
doi:10.1145/225298.225343 dblp:conf/colt/LeeBW95 fatcat:vs6gf45o3rcj5iunj4nfbq67au

Learnability of Bipartite Ranking Functions [chapter]

Shivani Agarwal, Dan Roth
2005 Lecture Notes in Computer Science  
The problem of ranking, in which the goal is to learn a real-valued ranking function that induces a ranking or ordering over an instance space, has recently gained attention in machine learning.  ...  Our first main result provides a sufficient condition for the learnability of a class of ranking functions F : we show that F is learnable if its bipartite rank-shatter coefficients, which measure the  ...  The exposition in this paper is influenced in large parts by the excellent text of Anthony and Bartlett [16] .  ... 
doi:10.1007/11503415_2 fatcat:k3kfcixtjnhkxdzh2ywmj25gz4

Bounding the Fat Shattering Dimension of a Composition Function Class Built Using a Continuous Logic Connective [article]

Hubert Haoyang Duan
2011 arXiv   pre-print
Using results by Mendelson-Vershynin and Talagrand, we bound the Fat Shattering dimension of scale ǫ of this new function class in terms of the Fat Shattering dimensions of the collection's classes.  ...  Two combinatorial parameters, the Vapnik-Chervonenkis (VC) dimension and its generalization, the Fat Shattering dimension of scale ǫ, are explained and a few examples of their calculations are given with  ...  If F can ǫ-shatter arbitrarily large finite subsets, then the Fat Shattering dimension of scale ǫ of F is defined to be ∞. When the function class F consists of only functions taking values in {0, 1}, then  ... 
arXiv:1105.4618v1 fatcat:57ulccjb6jfllnpx7wl62l37yu

Page 5898 of Mathematical Reviews Vol. , Issue 98I [page]

1998 Mathematical Reviews  
(SGP-SING-IS; Singapore) ; Williamson, Robert C. (5-ANU-EG; Canberra) Fat-shattering and the learnability of real-valued functions.  ...  The authors study the problem of learning real-valued functions in the presence of noise.  ... 

On Learnability, Complexity and Stability [chapter]

Silvia Villa, Lorenzo Rosasco, Tomaso Poggio
2013 Empirical Inference  
We consider the fundamental question of learnability of a hypothesis class in the supervised learning setting and in the general learning setting introduced by Vladimir Vapnik.  ...  We survey classic results characterizing learnability in terms of suitable notions of complexity, as well as more recent results that establish the connection between learnability and stability of a learning  ...  For the square and absolute loss functions and Y compact, the characterization of learnability in terms of γ-fat shattering dimension can be used.  ... 
doi:10.1007/978-3-642-41136-6_7 dblp:conf/birthday/VillaRP13 fatcat:672cm6k3rnfnbd2azofttt54au

The Learnability of Unknown Quantum Measurements [article]

Hao-Chung Cheng, Min-Hsiu Hsieh, Ping-Cheng Yeh
2015 arXiv   pre-print
Our main result in the paper is that, for learning an unknown quantum measurement, the upper bound, given by the fat-shattering dimension, is linearly proportional to the dimension of the underlying Hilbert  ...  function.  ...  Let F be a set of real-valued functions on a domain X .  ... 
arXiv:1501.00559v1 fatcat:5wtqou5pi5e6hgcu54n4rphyce

On Learnability, Complexity and Stability [article]

Silvia Villa, Lorenzo Rosasco, Tomaso Poggio
2013 arXiv   pre-print
We consider the fundamental question of learnability of a hypothesis class in the supervised learning setting and in the general learning setting introduced by Vladimir Vapnik.  ...  We survey classic results characterizing learnability in terms of suitable notions of complexity, as well as more recent results that establish the connection between learnability and stability of a learning  ...  For the square and absolute loss functions and Y compact, the characterization of learnability in terms of γ-fat shattering dimension can be used.  ... 
arXiv:1303.5976v1 fatcat:irmbuvhr4zaafmgz7w2xjpi47m

Generalization ability of folding networks

B. Hammer
2001 IEEE Transactions on Knowledge and Data Engineering  
We find bounds on the VC, pseudo-, and fat shattering dimension of folding networks with various activation functions. As a consequence, valid generalization of folding networks can be guaranteed.  ...  We propose two approaches which take the specific distribution into account and allow us to derive explicit bounds on the deviation of the empirical error from the real error of a learning algorithm: The  ...  For a real valued function class F the ε-fat shattering dimension fat_ε(F) is the largest size of a set that is ε-fat shattered by F, i.e. for the points x_1, ..., x_n there exist reference points r_1,  ... 
doi:10.1109/69.917560 fatcat:n7huvhyquvdpbibijsim372jda
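The ε-fat shattering definition quoted in the snippet above can be made concrete with a small brute-force check. The sketch below is illustrative and not from any of the listed papers: it assumes a *finite* function class given as Python callables (the names `fat_shatters` and the toy class at the end are invented for this example). A set of points is γ-fat shattered if reference levels r_i exist such that every ±1 labeling is realized by some function with margin at least γ on every point.

```python
from itertools import product

def fat_shatters(function_class, points, gamma):
    """Brute-force check that `function_class` (a finite list of
    callables) gamma-fat-shatters `points`: do reference levels r_i
    exist so that for every labeling b in {+1, -1}^n some f satisfies
    b_i * (f(x_i) - r_i) >= gamma for all i?"""
    vals = [[f(x) for x in points] for f in function_class]
    n = len(points)
    # For a finite class it suffices to try, at each point, the
    # midpoints of pairs of observed function values as candidate
    # reference levels.
    cands = []
    for i in range(n):
        col = sorted({v[i] for v in vals})
        cands.append(sorted({(a + b) / 2 for a in col for b in col}))
    for refs in product(*cands):
        if all(
            any(all(b * (v[i] - refs[i]) >= gamma for i, b in enumerate(lab))
                for v in vals)
            for lab in product([1, -1], repeat=n)
        ):
            return True
    return False

# Toy class: all four {0,1}-valued assignments on the two points 0 and 1.
funcs = [lambda x, t=t: t[x] for t in product([0.0, 1.0], repeat=2)]
print(fat_shatters(funcs, [0, 1], 0.5))   # → True  (margin 1/2 suffices)
print(fat_shatters(funcs, [0, 1], 0.6))   # → False (values are only 1/2 from r_i)
```

The search over candidate reference levels is exponential in the number of points, which is fine here: the point is only to make the quantifier structure of the definition (∃ refs, ∀ labelings, ∃ function) explicit.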

On Learnability under General Stochastic Processes [article]

A. Philip Dawid, Ambuj Tewari
2022 arXiv   pre-print
We provide two natural notions of learnability of a function class under a general stochastic process. We show that both notions are in fact equivalent to online learnability.  ...  Statistical learning theory under independent and identically distributed (iid) sampling and online learning theory for worst case individual sequences are two of the best developed branches of learning  ...  Acknowledgments Thanks to the organizers and attendees of the Fifth Bayesian, Fiducial, and Frequentist Conference (BFF5) held from May 6-9, 2018 in Ann Arbor, MI, USA.  ... 
arXiv:2005.07605v3 fatcat:wp75fhdnd5a4xls4znfw3b32ka

Pseudo-dimension of quantum circuits [article]

Matthias C. Caro, Ishaun Datta
2020 arXiv   pre-print
We prove pseudo-dimension bounds on the output probability distributions of quantum circuits; the upper bounds are polynomial in circuit depth and number of gates.  ...  are PAC-learnable.  ...  Note that, trivially, fat_F(γ) ≤ Pdim(F) holds for every γ > 0 and for every real-valued function class F.  ... 
arXiv:2002.01490v2 fatcat:fjyrdij74nfmjcth4wip5qtaka
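The closing remark in the snippet, fat_F(γ) ≤ Pdim(F), follows directly from the definitions. A brief sketch (notation as in the fat-shattering definition quoted in the other entries, with r_i the reference levels):

```latex
% gamma-fat shattering of S = {x_1, ..., x_n} implies pseudo-shattering
% of the same set with the same reference levels:
\forall\, b \in \{\pm 1\}^n \ \exists\, f \in F:\quad
  b_i \bigl( f(x_i) - r_i \bigr) \;\ge\; \gamma \;>\; 0
  \qquad (i = 1, \dots, n),
% so f(x_i) > r_i exactly when b_i = +1, which is the (margin-free)
% pseudo-shattering condition; hence every gamma-fat-shattered set is
% pseudo-shattered, and
\mathrm{fat}_F(\gamma) \;\le\; \mathrm{Pdim}(F)
  \qquad \text{for every } \gamma > 0 .
```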

On the Equivalence between Online and Private Learnability beyond Binary Classification [article]

Young Hun Jung, Baekjin Kim, Ambuj Tewari
2021 arXiv   pre-print
Our extension involves studying a novel variant of the Littlestone dimension that depends on a tolerance parameter and on an appropriate generalization of the concept of threshold functions beyond binary  ...  Alon et al. [2019] and Bun et al. [2020] recently showed that online learnability and private PAC learnability are equivalent in binary classification.  ...  Acknowledgments and Disclosure of Funding We acknowledge the support of NSF via grants CAREER IIS-1452099 and IIS-2007055.  ... 
arXiv:2006.01980v3 fatcat:2evrzmik5jgnba4yy7xy2pj3ea
Showing results 1 — 15 out of 141 results