
The sample complexity of agnostic learning under deterministic labels

Shai Ben-David, Ruth Urner
2014 Annual Conference on Computational Learning Theory  
However, agnostic learning of deterministic labels has so far received little research attention.  ...  First, we show that the sample complexity of learning a binary hypothesis class (with respect to deterministic labeling functions) is not fully determined by the VC-dimension of the class.  ...  Acknowledgments We thank Peter Bartlett for suggesting the classes H_d for Theorem 9. This work was supported in part by AFOSR grant FA9550-09-1-0538.  ... 
dblp:conf/colt/Ben-DavidU14 fatcat:pe6k2gwedvhc5bk3nkpjrxsrxy
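
Editorial note (standard background, not a claim of the entry above): in the usual agnostic PAC model the sample complexity of a class H is fully characterized by its VC-dimension d = VCdim(H), via the classical bound

    m_H(ε, δ) = Θ( (d + log(1/δ)) / ε² ).

The result summarized above shows that this characterization by d alone breaks down once the labeling functions are restricted to be deterministic.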

The sample complexity of agnostic learning with deterministic labels

Shai Ben-David, Ruth Urner
2014 International Symposium on Artificial Intelligence and Mathematics  
We investigate agnostic learning when there is no noise in the labeling function, that is, the labels are deterministic.  ...  We introduce a new combinatorial parameter of a class of binary valued functions and show that it provides a full combinatorial characterization of the sample complexity of deterministic-label agnostic  ...  Acknowledgement We would like to thank Peter Bartlett for suggesting the classes H_d for Theorem 6.  ... 
dblp:conf/isaim/Ben-DavidU14 fatcat:jgzamnuswrb2jjuf6m6z3ooskm

Active Learning - Modern Learning Theory [chapter]

Maria-Florina Balcan, Ruth Urner
2014 Encyclopedia of Algorithms  
Both in the realizable and agnostic settings, there is a well-developed theory of sample complexity [13], quantifying in terms of the so-called VC-dimension (a measure of complexity of a concept class  ...  Cross-References: PAC learning; Sample complexity; Computational complexity of learning. Figure 1: The margin-based active learning algorithm after iteration k.  ... 
doi:10.1007/978-3-642-27848-8_769-2 fatcat:5hcjmipp4rcb7pn2654b2zn5se
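
Editorial note: the Figure 1 caption in the snippet above refers to a margin-based active learning algorithm. Below is a minimal pool-based sketch of that general idea (a toy illustration, not the algorithm from the chapter; the synthetic data, seed size, budget, and all other parameter choices are assumptions made here): in each round the learner queries labels only for the pool points closest to its current linear decision boundary.

    # Toy sketch of margin-based (uncertainty) active learning over a pool of unlabeled points.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic pool in R^2, labeled by a hidden halfspace w_true (illustrative assumption).
    n_pool = 2000
    X = rng.normal(size=(n_pool, 2))
    w_true = np.array([1.0, -0.5])
    y = (X @ w_true > 0).astype(int)

    # Seed with a few labeled points from each class.
    pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
    labeled = list(rng.choice(pos, 5, replace=False)) + list(rng.choice(neg, 5, replace=False))

    budget_per_round, n_rounds = 10, 10
    clf = LogisticRegression()
    for k in range(n_rounds):
        clf.fit(X[labeled], y[labeled])
        # Distance-to-boundary scores under the current linear hypothesis.
        margins = np.abs(clf.decision_function(X))
        margins[labeled] = np.inf          # never re-query an already labeled point
        # Query labels for the points with the smallest margin.
        query = np.argsort(margins)[:budget_per_round]
        labeled.extend(query.tolist())

    clf.fit(X[labeled], y[labeled])
    print("pool error after", len(labeled), "label queries:",
          np.mean(clf.predict(X) != y))

In the theory the entry summarizes, carefully shrinking the sampling region around the boundary is what yields label-complexity savings over passive learning; the sketch only illustrates the querying pattern.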

Active Learning – Modern Learning Theory [chapter]

Maria-Florina Balcan, Ruth Urner
2016 Encyclopedia of Algorithms  
Both in the realizable and agnostic settings, there is a well-developed theory of sample complexity [13], quantifying in terms of the so-called VC-dimension (a measure of complexity of a concept class  ...  Cross-References: PAC learning; Sample complexity; Computational complexity of learning. Figure 1: The margin-based active learning algorithm after iteration k.  ... 
doi:10.1007/978-1-4939-2864-4_769 fatcat:amdxmajq6zcnnn3zlhrtmeoddq

Realizable Learning is All You Need [article]

Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan
2021 arXiv   pre-print
The equivalence of realizable and agnostic learnability is a fundamental phenomenon in learning theory.  ...  More generally, we argue that the equivalence of realizable and agnostic learning is actually a special case of a broader phenomenon we call property generalization: any desirable property of a learning  ...  Acknowledgements The authors would like to thank Shay Moran, Russell Impagliazzo, and Omar Montasser for enlightening discussions.  ... 
arXiv:2111.04746v1 fatcat:h3kx6pf6azeyfcd5yqn3cqtijq
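
Editorial note (standard background, not from the paper above): for binary classification with a hypothesis class H of VC-dimension d, the classical sample-complexity bounds

    realizable:  m_H(ε, δ) = O( (d·log(1/ε) + log(1/δ)) / ε )
    agnostic:    m_H(ε, δ) = Θ( (d + log(1/δ)) / ε² )

are both finite exactly when d is finite; this is the textbook instance of the realizable/agnostic equivalence that the paper above revisits and extends to other settings.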

Access to Unlabeled Data can Speed up Prediction Time

Ruth Urner, Shai Shalev-Shwartz, Shai Ben-David
2011 International Conference on Machine Learning  
Semi-supervised learning (SSL) addresses the problem of training a classifier using a small number of labeled examples and many unlabeled examples.  ...  from much fewer labeled examples than without such a sample.  ...  Shai Shalev-Shwartz acknowledges the support of the Israeli Science Foundation grant number 598-10.  ... 
dblp:conf/icml/UrnerSB11 fatcat:oazgr5dk2rhpniy4xgdjwmauqy

On Communication Complexity of Classification Problems [article]

Daniel M. Kane, Roi Livni, Shay Moran, Amir Yehudayoff
2018 arXiv   pre-print
This work studies distributed learning in the spirit of Yao's model of communication complexity: consider a two-party setting, where each of the players gets a list of labelled examples and they communicate  ...  The derivation of these results hinges on a type of decision problem we term "realizability problems" where the goal is deciding whether a distributed input sample is consistent with an hypothesis from  ...  Consider the problem of agnostically learning under the promise that the input sample is consistent with some target function. Is there a learning protocol in this case with sample complexity o(1/ε)?  ... 
arXiv:1711.05893v3 fatcat:iidwoxfatnf5fongszb633kygm

Selective Sampling on Probabilistic Data [chapter]

Peng Peng, Raymond Chi-Wing Wong
2014 Proceedings of the 2014 SIAM International Conference on Data Mining  
In the literature of supervised learning, most existing studies assume that the labels provided by the labelers are deterministic, which may easily introduce noise in many real-world applications.  ...  We prove that in our setting the label complexity can be reduced dramatically. Finally, we conducted comprehensive experiments in order to verify the effectiveness of our proposed labeling framework.  ...  Acknowledgements: The research is supported by grant FSGRF13EG27.  ... 
doi:10.1137/1.9781611973440.4 dblp:conf/sdm/PengW14 fatcat:e7qhtfd63fg3xm2zuupulbm6fe

Sample Complexity Bounds on Differentially Private Learning via Communication Complexity

Vitaly Feldman, David Xiao
2015 SIAM Journal on Computing (Print)  
Sample complexity of private PAC and agnostic learning was studied in a number of prior works starting with [Kasiviswanathan et al., 2011].  ...  We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity  ...  , and for valuable discussions regarding the sample complexity of privately learning threshold functions.  ... 
doi:10.1137/140991844 fatcat:cpslhvmun5a33gdyrc3iqlxe3a

Sample Complexity Bounds on Differentially Private Learning via Communication Complexity [article]

Vitaly Feldman, David Xiao
2015 arXiv   pre-print
We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity  ...  For any t, there exists a class C such that the sample complexity of (pure) α-differentially private PAC learning is Ω(t/α) but the sample complexity of the relaxed (α,β)-differentially private PAC learning  ...  , and for valuable discussions regarding the sample complexity of privately learning threshold functions.  ... 
arXiv:1402.6278v4 fatcat:fgcdgmdy3ncstnn7ba4rhs2voe
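
Editorial note, recalling the standard definitions behind the notation in the two entries above: an algorithm A is α-differentially private (pure DP) if for every pair of input samples S, S′ differing in a single example and every event E,

    Pr[A(S) ∈ E] ≤ e^α · Pr[A(S′) ∈ E],

and (α, β)-differentially private (the relaxed variant) if

    Pr[A(S) ∈ E] ≤ e^α · Pr[A(S′) ∈ E] + β.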

Distribution-Independent Reliable Learning [article]

Varun Kanade, Justin Thaler
2014 arXiv   pre-print
We study several questions in the reliable agnostic learning framework of Kalai et al. (2009), which captures learning tasks in which one type of error is costlier than others.  ...  Our algorithms also satisfy strong attribute-efficiency properties, and provide smooth tradeoffs between sample complexity and running time.  ...  This research was carried out while the authors were at the Simons Institute for the Theory of Computing at the University of California, Berkeley.  ... 
arXiv:1402.5164v1 fatcat:tzmjjvdloveshhtm43rjgepjei
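
Editorial note, an informal gloss of the framework named above (paraphrased; the exact formulation in Kalai et al. may differ in details): in positive-reliable agnostic learning the returned hypothesis h must satisfy, with respect to the data distribution,

    false-positive rate of h ≤ ε,  and
    false-negative rate of h ≤ min{ false-negative rate of c : c in the class with zero false-positive rate } + ε,

so one error type is driven to nearly zero while the other competes with the best such classifier in the class.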

Agnostic Domain Adaptation [chapter]

Alexander Vezhnevets, Joachim M. Buhmann
2011 Lecture Notes in Computer Science  
The supervised learning paradigm assumes in general that both training and test data are sampled from the same distribution.  ...  When this assumption is violated, we are in the setting of transfer learning or domain adaptation: Here, given training data from a source domain, the aim is to learn a classifier which performs well on a target domain  ...  Acknowledgements This work has been supported by the Swiss National Science Foundation under grant #200021-117946.  ... 
doi:10.1007/978-3-642-23123-0_38 fatcat:dzyisdlcenhz3nwzaeknbwptku

On Basing Lower-Bounds for Learning on Worst-Case Assumptions

Benny Applebaum, Boaz Barak, David Xiao
2008 49th Annual IEEE Symposium on Foundations of Computer Science  
Our results hold even in the stronger model of agnostic learning.  ...  These results are obtained by showing that lower bounds for improper learning are intimately related to the complexity of zero-knowledge arguments and to the existence of weak cryptographic primitives.  ...  Such reductions interact with the learner by supplying it with distributions of labeled examples, and obtaining from the learner hypotheses predicting the labels under the distributions (if such predictors  ... 
doi:10.1109/focs.2008.35 dblp:conf/focs/ApplebaumBX08 fatcat:44nkrplugncare53zwbzd26lwu

Adaptive Learning with Robust Generalization Guarantees [article]

Rachel Cummings, Katrina Ligett, Kobbi Nissim, Aaron Roth, Zhiwei Steven Wu
2016 arXiv   pre-print
We prove that every hypothesis class that is PAC learnable is also PAC learnable in a robustly generalizing fashion, with almost the same sample complexity.  ...  The traditional notion of generalization, i.e., learning a hypothesis whose empirical error is close to its true error, is surprisingly brittle.  ...  Acknowledgements We thank Adam Smith and Raef Bassily for helpful comments about adaptive composition of perfectly generalizing mechanisms, and for pointing out an error in an earlier version of this paper  ... 
arXiv:1602.07726v2 fatcat:tz6ltogqlrbdlmpdlr4fxoxwk4
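
Editorial note, stating the quoted notion symbolically (standard definition): a hypothesis h generalizes from a sample S drawn from distribution D if its empirical error is close to its true error,

    | err_S(h) − err_D(h) | ≤ ε;

the robust variants studied in the paper above strengthen this requirement, since (as the snippet notes) the traditional notion is surprisingly brittle.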

Domain adaptation–can quantity compensate for quality?

Shai Ben-David, Ruth Urner
2013 Annals of Mathematics and Artificial Intelligence  
We show this under the assumptions of covariate shift as well as a bound on the ratio of the probability weights between the source (training) and target (test) distribution.  ...  generated sample by a (possibly larger) sample generated by a different distribution without worsening the error guarantee on the learned classifier.  ...  Then, the H-proper Domain Adaptation problem w.r.t. the class W can be (1, ε, δ, m(ε/3, δ/2), n(ε/3, δ/2) + m(ε/3, δ/2))-solved, where m is the sample complexity function for agnostically learning H.  ... 
doi:10.1007/s10472-013-9371-9 fatcat:3ved5a3sarhkxe5gwroo4ntyky
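
Editorial note on the two assumptions named in the snippet above (standard formulations; the exact definition of the weight ratio in the paper may differ in its details): covariate shift means the source and target distributions share the labeling rule while their marginals over inputs may differ,

    P_S(y | x) = P_T(y | x)  for all x,

and the weight-ratio assumption bounds how differently the two distributions weight regions of the input space, roughly P_T(b) / P_S(b) ≤ C for sets b in a suitable collection.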
Showing results 1 — 15 out of 4,908 results