A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.

### The sample complexity of agnostic learning under deterministic labels

2014 · *Annual Conference Computational Learning Theory*

dblp:conf/colt/Ben-DavidU14
fatcat:pe6k2gwedvhc5bk3nkpjrxsrxy

However, agnostic learning of deterministic labels has so far received little research attention. ... First, we show that the sample complexity of learning a binary hypothesis class (with respect to deterministic labeling functions) is not fully determined by the VC-dimension of the class. ... Acknowledgments: We thank Peter Bartlett for suggesting the classes H_d for Theorem 9. This work was supported in part by AFOSR grant FA9550-09-1-0538. ...
### The sample complexity of agnostic learning with deterministic labels

2014 · *International Symposium on Artificial Intelligence and Mathematics*

dblp:conf/isaim/Ben-DavidU14
fatcat:jgzamnuswrb2jjuf6m6z3ooskm

We investigate agnostic learning when there is no noise in the labeling function, that is, the labels are deterministic. ... We introduce a new combinatorial parameter of a class of binary valued functions and show that it provides a full combinatorial characterization of the sample complexity of deterministic-label agnostic ... Acknowledgement: We would like to thank Peter Bartlett for suggesting the classes H_d for Theorem 6. ...
### Active Learning - Modern Learning Theory
[chapter]

2014 · *Encyclopedia of Algorithms*

doi:10.1007/978-3-642-27848-8_769-2
fatcat:5hcjmipp4rcb7pn2654b2zn5se

Both in the realizable and agnostic settings, there is a well-developed theory of sample complexity [13], quantifying in terms of the so-called VC-dimension (a measure of complexity of a concept class ... Cross-References: PAC learning; Sample complexity; Computational complexity of learning. Figure 1: The margin-based active learning algorithm after iteration k. ...
### Active Learning – Modern Learning Theory
[chapter]

2016 · *Encyclopedia of Algorithms*

doi:10.1007/978-1-4939-2864-4_769
fatcat:amdxmajq6zcnnn3zlhrtmeoddq

Both in the realizable and agnostic settings, there is a well-developed theory of sample complexity [13], quantifying in terms of the so-called VC-dimension (a measure of complexity of a concept class ... Cross-References: PAC learning; Sample complexity; Computational complexity of learning. Figure 1: The margin-based active learning algorithm after iteration k. ...
### Realizable Learning is All You Need
[article]

2021 · *arXiv* pre-print

The equivalence of realizable and agnostic learnability is a fundamental phenomenon in learning theory. ... More generally, we argue that the equivalence of realizable and agnostic learning is actually a special case of a broader phenomenon we call property generalization: any desirable property of a learning ... Acknowledgements: The authors would like to thank Shay Moran, Russell Impagliazzo, and Omar Montasser for enlightening discussions. ...
### Access to Unlabeled Data can Speed up Prediction Time

2011 · *International Conference on Machine Learning*

dblp:conf/icml/UrnerSB11
fatcat:oazgr5dk2rhpniy4xgdjwmauqy

Semi-supervised learning (SSL) addresses the problem of training a classifier using a small number of labeled examples and many unlabeled examples. ... from much fewer labeled examples than without such a sample. ... Shai Shalev-Shwartz acknowledges the support of the Israeli Science Foundation grant number 598-10. ...
### On Communication Complexity of Classification Problems
[article]

2018 · *arXiv* pre-print

arXiv:1711.05893v3
fatcat:iidwoxfatnf5fongszb633kygm

This work studies distributed learning in the spirit of Yao's model of communication complexity: consider a two-party setting, where each of the players gets a list of labelled examples and they communicate ... The derivation of these results hinges on a type of decision problems we term "realizability problems", where the goal is deciding whether a distributed input sample is consistent with an hypothesis from ... Consider the problem of agnostically learning under the promise that the input sample is consistent with some target function. Is there a learning protocol in this case with sample complexity o(1/ε)? ...
### Selective Sampling on Probabilistic Data
[chapter]

2014 · *Proceedings of the 2014 SIAM International Conference on Data Mining*

doi:10.1137/1.9781611973440.4
dblp:conf/sdm/PengW14
fatcat:e7qhtfd63fg3xm2zuupulbm6fe

In the literature of supervised learning, most existing studies assume that the labels provided by the labelers are deterministic, which may introduce noise easily in many real-world applications. ... We prove that in our setting the label complexity can be reduced dramatically. Finally, we conducted comprehensive experiments in order to verify the effectiveness of our proposed labeling framework. ... Acknowledgements: The research is supported by grant FSGRF13EG27. ...
### Sample Complexity Bounds on Differentially Private Learning via Communication Complexity

2015 · *SIAM Journal on Computing (Print)*

Sample complexity of private PAC and agnostic learning was studied in a number of prior works starting with [Kasiviswanathan et al., 2011]. ... We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity ... , and for valuable discussions regarding the sample complexity of privately learning threshold functions. ...
### Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
[article]

2015 · *arXiv* pre-print

arXiv:1402.6278v4
fatcat:fgcdgmdy3ncstnn7ba4rhs2voe

We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity ... For any t, there exists a class C such that the sample complexity of (pure) α-differentially private PAC learning is Ω(t/α) but the sample complexity of the relaxed (α, β)-differentially private PAC learning ... , and for valuable discussions regarding the sample complexity of privately learning threshold functions. ...
### Distribution-Independent Reliable Learning
[article]

2014 · *arXiv* pre-print

arXiv:1402.5164v1
fatcat:tzmjjvdloveshhtm43rjgepjei

We study several questions in the reliable agnostic learning framework of Kalai et al. (2009), which captures learning tasks in which one type of error is costlier than others. ... Our algorithms also satisfy strong attribute-efficiency properties, and provide smooth tradeoffs between sample complexity and running time. ... This research was carried out while the authors were at the Simons Institute for the Theory of Computing at the University of California, Berkeley. ...
### Agnostic Domain Adaptation
[chapter]

2011 · *Lecture Notes in Computer Science*

The supervised learning paradigm assumes in general that both training and test data are sampled from the same distribution. ... When this assumption is violated, we are in the setting of transfer learning or domain adaptation: here, training data from a source domain aim to learn a classifier which performs well on a target domain ... Acknowledgements: This work has been supported by the Swiss National Science Foundation under grant #200021-117946. ...
### On Basing Lower-Bounds for Learning on Worst-Case Assumptions

2008 · *2008 49th Annual IEEE Symposium on Foundations of Computer Science*

doi:10.1109/focs.2008.35
dblp:conf/focs/ApplebaumBX08
fatcat:44nkrplugncare53zwbzd26lwu

Our results hold even in the stronger model of agnostic learning. ... These results are obtained by showing that lower bounds for improper learning are intimately related to the complexity of zero-knowledge arguments and to the existence of weak cryptographic primitives. ... Such reductions interact with the learner by supplying it with distributions of labeled examples, and obtaining from the learner hypotheses predicting the labels under the distributions (if such predictors ...
### Adaptive Learning with Robust Generalization Guarantees
[article]

2016 · *arXiv* pre-print

arXiv:1602.07726v2
fatcat:tz6ltogqlrbdlmpdlr4fxoxwk4

We prove that every hypothesis class that is PAC learnable is also PAC learnable in a robustly generalizing fashion, with almost the same sample complexity. ... The traditional notion of generalization (i.e., learning a hypothesis whose empirical error is close to its true error) is surprisingly brittle. ... Acknowledgements: We thank Adam Smith and Raef Bassily for helpful comments about adaptive composition of perfectly generalizing mechanisms, and for pointing out an error in an earlier version of this paper ...
### Domain adaptation – can quantity compensate for quality?

2013 · *Annals of Mathematics and Artificial Intelligence*

doi:10.1007/s10472-013-9371-9
fatcat:3ved5a3sarhkxe5gwroo4ntyky

We show this under the assumptions of covariate shift as well as a bound on the ratio of the probability weights between the source (training) and target (test) distribution. ... generated sample by a (possibly larger) sample generated by a different distribution without worsening the error guarantee on the learned classifier. ... Then, the H-proper Domain Adaptation problem w.r.t. the class W can be (1, ε, δ, m(ε/3, δ/2), n(ε/3, δ/2) + m(ε/3, δ/2))-solved, where m is the sample complexity function for agnostically learning H. ...
*Showing results 1 — 15 out of 4,908 results*