The sample complexity of agnostic learning with deterministic labels
2014
International Symposium on Artificial Intelligence and Mathematics
We show that in this setting, in contrast to the fully agnostic learning setting (with possibly noisy labeling functions), the sample complexity of learning a binary hypothesis class is not fully determined ...
We investigate agnostic learning when there is no noise in the labeling function, that is, the labels are deterministic. ...
Acknowledgement: We would like to thank Peter Bartlett for suggesting the classes H_d for Theorem 6. ...
dblp:conf/isaim/Ben-DavidU14
fatcat:jgzamnuswrb2jjuf6m6z3ooskm
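For context on the contrast drawn in the abstract above (a standard fact about the fully agnostic model, not a claim of this paper): for a binary hypothesis class H of VC-dimension d, the fully agnostic sample complexity is

    m_H(\epsilon, \delta) = \Theta\!\left( \frac{d + \log(1/\delta)}{\epsilon^2} \right),

so with possibly noisy labels the VC-dimension determines the sample complexity up to constants. The result above shows that this characterization breaks down once the labeling function is required to be deterministic.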
The sample complexity of agnostic learning under deterministic labels
2014
Annual Conference Computational Learning Theory
First, we show that the sample complexity of learning a binary hypothesis class (with respect to deterministic labeling functions) is not fully determined by the VC-dimension of the class. ...
However, agnostic learning of deterministic labels has so far received little research attention. ...
Acknowledgments: We thank Peter Bartlett for suggesting the classes H_d for Theorem 9. This work was supported in part by AFOSR grant FA9550-09-1-0538. ...
dblp:conf/colt/Ben-DavidU14
fatcat:pe6k2gwedvhc5bk3nkpjrxsrxy
Active Learning – Modern Learning Theory
[chapter]
2014
Encyclopedia of Algorithms
Both in the realizable and agnostic settings, there is a well-developed theory of sample complexity [13], quantifying it in terms of the so-called VC-dimension (a measure of complexity of a concept class ...
Margin-based active learning While the disagreement-based active learning line of work provided the first general understanding of the sample complexity benefits with active learning for arbitrary concept ...
doi:10.1007/978-3-642-27848-8_769-2
fatcat:5hcjmipp4rcb7pn2654b2zn5se
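A concrete instance of the label savings this chapter surveys (a textbook example, not code from the entry): thresholds on the line have VC-dimension 1, so passive realizable learning needs on the order of 1/ε labels, while an active learner can binary-search an unlabeled pool using only logarithmically many label queries. A minimal sketch, assuming a hypothetical noise-free labeling oracle query_label that returns 0 left of the true threshold and 1 at or above it:

    # Active learning of a 1-D threshold by binary search over a pool.
    def learn_threshold(unlabeled, query_label):
        points = sorted(unlabeled)
        lo, hi = 0, len(points) - 1
        first_pos = len(points)              # index of the first 1-label
        while lo <= hi:
            mid = (lo + hi) // 2
            if query_label(points[mid]) == 1:
                first_pos = mid              # threshold is at or left of mid
                hi = mid - 1
            else:
                lo = mid + 1
        # any threshold between the last 0 and the first 1 is consistent
        return points[first_pos] if first_pos < len(points) else float("inf")

On a pool of n points this issues at most about log2(n) + 1 label queries, versus labeling the entire pool in the passive setting.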
Active Learning – Modern Learning Theory
[chapter]
2016
Encyclopedia of Algorithms
Both in the realizable and agnostic settings, there is a well-developed theory of sample complexity [13], quantifying it in terms of the so-called VC-dimension (a measure of complexity of a concept class ...
Margin-based active learning While the disagreement-based active learning line of work provided the first general understanding of the sample complexity benefits with active learning for arbitrary concept ...
doi:10.1007/978-1-4939-2864-4_769
fatcat:amdxmajq6zcnnn3zlhrtmeoddq
Realizable Learning is All You Need
[article]
2021
arXiv
pre-print
The equivalence of realizable and agnostic learnability is a fundamental phenomenon in learning theory. ...
More generally, we argue that the equivalence of realizable and agnostic learning is actually a special case of a broader phenomenon we call property generalization: any desirable property of a learning ...
Acknowledgements The authors would like to thank Shay Moran, Russell Impagliazzo, and Omar Montasser for enlightening discussions. ...
arXiv:2111.04746v1
fatcat:h3kx6pf6azeyfcd5yqn3cqtijq
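To convey the blackbox flavor of the reductions studied in this paper (a simplified, exponential-time sketch under my own naming, not the paper's exact construction): given a learner A that succeeds on realizable samples, run A on subsamples of the training set and validate the resulting hypotheses on held-out data.

    from itertools import combinations

    def err(h, sample):
        # empirical 0-1 error of hypothesis h on a labeled sample
        return sum(h(x) != y for x, y in sample) / len(sample)

    def agnostic_from_realizable(A, train, holdout, k):
        # run the realizable learner on every size-k subsample, then keep
        # whichever output hypothesis looks best on the held-out sample
        candidates = [A(list(sub)) for sub in combinations(train, k)]
        return min(candidates, key=lambda h: err(h, holdout))

Real constructions must also handle subsamples that are not themselves realizable (A may misbehave on them); that subtlety is elided here.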
On Communication Complexity of Classification Problems
[article]
2018
arXiv
pre-print
This work studies distributed learning in the spirit of Yao's model of communication complexity: consider a two-party setting, where each of the players gets a list of labelled examples and they communicate ...
The derivation of these results hinges on a type of decision problem we term "realizability problems", where the goal is deciding whether a distributed input sample is consistent with a hypothesis from ...
What is the sample complexity of c-agnostically learning a class of VC dimension d? ...
arXiv:1711.05893v3
fatcat:iidwoxfatnf5fongszb633kygm
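Spelling out the realizability problems named in this snippet (a paraphrase of the standard formulation): the players hold labeled samples S_A and S_B and must decide whether the joint sample is consistent with the class H, i.e., compute

    \mathrm{REAL}_H(S_A, S_B) = 1 \iff \exists\, h \in H \text{ such that } h(x) = y \text{ for all } (x, y) \in S_A \cup S_B.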
Selective Sampling on Probabilistic Data
[chapter]
2014
Proceedings of the 2014 SIAM International Conference on Data Mining
In the literature of supervised learning, most existing studies assume that the labels provided by the labelers are deterministic, which may easily introduce noise in many real-world applications. ...
In many applications like crowdsourcing, however, many labelers may simultaneously label the same group of instances and thus the label of each instance is associated with a probability. ...
Acknowledgements: The research is supported by grant FSGRF13EG27. ...
doi:10.1137/1.9781611973440.4
dblp:conf/sdm/PengW14
fatcat:e7qhtfd63fg3xm2zuupulbm6fe
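To make the probabilistic-label setting concrete (an illustrative sketch, not the paper's model or code): when several crowd workers label the same instance, their votes induce a label probability rather than a single deterministic label.

    def soft_label(votes):
        # votes: 0/1 labels from different workers for one instance;
        # the fraction of positive votes estimates P(label = 1)
        return sum(votes) / len(votes)

    soft_label([1, 1, 0, 1])  # -> 0.75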
Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
2015
SIAM journal on computing (Print)
Sample complexity of private PAC and agnostic learning was studied in a number of prior works starting with [Kasiviswanathan et al., 2011]. ...
We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity ...
... and for valuable discussions regarding the sample complexity of privately learning threshold functions. ...
doi:10.1137/140991844
fatcat:cpslhvmun5a33gdyrc3iqlxe3a
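A concrete example of how large that gap can be (a known result in this line of work on private learning, stated here from memory rather than quoted from the paper): threshold functions over a linearly ordered domain of size N satisfy

    \mathrm{VCdim}(\mathrm{THRESH}_N) = 1 \quad\text{yet}\quad m_{\text{pure-DP}}(\mathrm{THRESH}_N) = \Theta(\log N),

so the pure differentially private sample complexity grows without bound with the domain size while the non-private sample complexity stays constant.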
Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
[article]
2015
arXiv
pre-print
We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity ...
Sample complexity of private PAC and agnostic learning was studied in a number of prior works starting with (Kasiviswanathan et al., 2008) but a number of basic questions still remain open, most notably ...
... and for valuable discussions regarding the sample complexity of privately learning threshold functions. ...
arXiv:1402.6278v4
fatcat:fgcdgmdy3ncstnn7ba4rhs2voe
Access to Unlabeled Data can Speed up Prediction Time
2011
International Conference on Machine Learning
Semi-supervised learning (SSL) addresses the problem of training a classifier using a small number of labeled examples and many unlabeled examples. ...
... from far fewer labeled examples than without such a sample. ...
Shai Shalev-Shwartz acknowledges the support of the Israeli Science Foundation grant number 598-10. ...
dblp:conf/icml/UrnerSB11
fatcat:oazgr5dk2rhpniy4xgdjwmauqy
Domain-Adversarial and Conditional State Space Model for Imitation Learning
[article]
2021
arXiv
pre-print
We experimentally evaluated the model predictive control performance via imitation learning for continuous control of sparse reward tasks in simulators and compared it with the performance of the existing ...
To remove domain-dependent information from the states, the model is trained with domain discriminators in an adversarial manner, and the reconstruction is conditioned on domain labels. ...
ACKNOWLEDGMENTS Most of the experiments were conducted in ABCI (AI Bridging Cloud Infrastructure), built by the National Institute of Advanced Industrial Science and Technology, Japan. ...
arXiv:2001.11628v2
fatcat:guvlnnn66jdsllsvqangahnmoa
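One standard way to realize the adversarial training against domain discriminators described above is a gradient-reversal layer in the style of DANN (a sketch under that assumption; the paper's own architecture and training code may differ):

    import torch

    class GradReverse(torch.autograd.Function):
        # identity in the forward pass; flips and scales the gradient in the
        # backward pass, so the upstream feature extractor learns to fool
        # the domain discriminator
        @staticmethod
        def forward(ctx, x, lam):
            ctx.lam = lam
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lam * grad_output, None

    def grad_reverse(x, lam=1.0):
        return GradReverse.apply(x, lam)

Features pass through grad_reverse before the domain discriminator; minimizing the discriminator's loss then pushes the features toward domain invariance.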
Distribution-Independent Reliable Learning
[article]
2014
arXiv
pre-print
We study several questions in the reliable agnostic learning framework of Kalai et al. (2009), which captures learning tasks in which one type of error is costlier than others. ...
Our algorithms also satisfy strong attribute-efficiency properties, and provide smooth tradeoffs between sample complexity and running time. ...
This research was carried out while the authors were at the Simons Institute for the Theory of Computing at the University of California, Berkeley. ...
arXiv:1402.5164v1
fatcat:tzmjjvdloveshhtm43rjgepjei
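The one-sided-error setting referenced here can be stated as follows (the positive-reliable case of Kalai et al., paraphrased from standard definitions): writing \mathrm{fp}(h) = \Pr[h(x) = 1 \wedge y = -1] and \mathrm{fn}(h) = \Pr[h(x) = -1 \wedge y = 1] over the data distribution, the learner must output h with

    \mathrm{fp}(h) \le \epsilon \quad\text{and}\quad \mathrm{fn}(h) \le \min\{\, \mathrm{fn}(g) : g \in H,\ \mathrm{fp}(g) = 0 \,\} + \epsilon,

i.e., it competes only against hypotheses that make essentially no false-positive errors.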
Multiclass learnability and the ERM principle
[article]
2014
arXiv
pre-print
We study the sample complexity of multiclass prediction in several learning settings. ...
... under permutations of label names. ...
Amit Daniely is a recipient of the Google Europe Fellowship in Learning Theory, and this research is supported in part by this Google Fellowship. ...
arXiv:1308.2893v2
fatcat:c3lyqopxfbhibpk2onqwhtjzq4
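For reference (standard background rather than a quote from the abstract): the multiclass analogue of the VC-dimension used in this line of work is the Natarajan dimension. A set S is N-shattered by H if there exist f_1, f_2 : S \to Y with f_1(x) \neq f_2(x) for all x \in S such that for every T \subseteq S some h \in H agrees with f_1 on T and with f_2 on S \setminus T; \mathrm{Ndim}(H) is the largest size of an N-shattered set. It lower-bounds the sample complexity of multiclass prediction but, unlike the binary case, does not determine it, and the paper shows that different ERM learners for the same class can have very different sample complexities.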
MTL-NAS: Task-Agnostic Neural Architecture Search Towards General-Purpose Multi-Task Learning
2020
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
The code of our paper is available at https://github.com/bhpfelix/MTLNAS. ...
This is realized with a minimum entropy regularization on the architecture weights during the search phase, which makes the architecture weights converge to near-discrete values and therefore achieves ...
This work was partially supported by NSFC 61773295 and 61771201, NSF of Hubei Province 2019CFA037, and Guangdong R&D key project of China 2019B010155001. ...
doi:10.1109/cvpr42600.2020.01156
dblp:conf/cvpr/GaoBJMJL20
fatcat:g2fsuhizsbetfh2l26fnocmdie
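The minimum-entropy regularization mentioned in both versions of this entry can be sketched as follows (illustrative, assuming softmax-parameterized architecture logits alpha_logits; the authors' actual implementation is in the linked repository):

    import torch
    import torch.nn.functional as F

    def entropy_regularizer(alpha_logits):
        # entropy of the softmax over candidate operations; adding this
        # term to the loss pushes the architecture weights toward a
        # near-discrete (one-hot) choice as the search converges
        p = F.softmax(alpha_logits, dim=-1)
        return -(p * torch.log(p + 1e-8)).sum(dim=-1).mean()

    # total_loss = task_loss + lam * entropy_regularizer(alpha_logits)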
MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
[article]
2020
arXiv
pre-print
The code of our paper is available at https://github.com/bhpfelix/MTLNAS. ...
This is realized with a minimum entropy regularization on the architecture weights during the search phase, which makes the architecture weights converge to near-discrete values and therefore achieves ...
This work was partially supported by NSFC 61773295 and 61771201, NSF of Hubei Province 2019CFA037, and Guangdong R&D key project of China 2019B010155001. ...
arXiv:2003.14058v1
fatcat:eqkyyoz2s5dc5gk3g3ysbhqcey
Showing results 1 — 15 out of 5,277 results