
On the Information Complexity of Proper Learners for VC Classes in the Realizable Case [article]

Mahdi Haghifam, Gintare Karolina Dziugaite, Shay Moran, Daniel M. Roy
2020 arXiv   pre-print
In fact, we exhibit VC classes for which the CMI of any proper learner cannot be bounded by any real-valued function of the VC dimension only.  ...  We provide a negative resolution to a conjecture of Steinke and Zakynthinou (2020a), by showing that their bound on the conditional mutual information (CMI) of proper learners of Vapnik–Chervonenkis (VC) classes  ...  In this short note, we provide a counterexample to this conjecture for proper learners in the realizable case.  ... 
arXiv:2011.02970v1 fatcat:bjt4w5jtwzhuherj3qxpj66d44
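For reference, the quantity at issue above is the conditional mutual information (CMI) of a learner, introduced by Steinke and Zakynthinou (2020). A standard rendering of their definition, in our notation:

```latex
% CMI of a (possibly randomized) learner A on distribution D
% (Steinke–Zakynthinou). \tilde{Z} \in \mathcal{Z}^{n \times 2} holds
% 2n i.i.d. draws from D, S \sim \mathrm{Uniform}(\{0,1\}^n) selects one
% example from each pair, and \tilde{Z}_S is the selected n-example
% training set.
\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\bigl(A(\tilde{Z}_S);\, S \,\big|\, \tilde{Z}\bigr)
```

The conjecture refuted here asked whether, for proper learners of a VC class, this quantity can always be bounded by a function of the VC dimension alone.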

Open Problem: Information Complexity of VC Learning

Thomas Steinke, Lydia Zakynthinou
2020 Annual Conference Computational Learning Theory  
Steinke and Zakynthinou (2020) prove Conjecture 8 for the special case of threshold functions on the real line. The conjectures are stated for proper learners.  ...  Unfortunately, Bassily et al. (2018) showed that any proper and consistent learner for threshold functions on an unbounded domain must have unbounded mutual information (for worst-case distributions)  ... 
dblp:conf/colt/SteinkeZ20a fatcat:nds74bmvkbc2beu4hvnk2rrzda
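As a concrete reference point, here is a minimal sketch (function name ours) of a proper, consistent learner for threshold functions h_t(x) = 1[x >= t] on the real line; learners of exactly this form are the ones whose information complexity the cited lower bound of Bassily et al. (2018) concerns.

```python
def learn_threshold(sample):
    """Proper, consistent learner for thresholds h_t(x) = 1 if x >= t else 0.
    `sample` is a list of (x, y) pairs assumed realizable by some threshold.
    Returns a threshold t, i.e. a hypothesis from the class itself (proper),
    that labels every training point correctly (consistent)."""
    neg = [x for x, y in sample if y == 0]   # points that must fall below t
    pos = [x for x, y in sample if y == 1]   # points that must satisfy x >= t
    if not pos:
        return float("inf")                  # the all-negative hypothesis
    if not neg:
        return min(pos)                      # smallest consistent threshold
    # any t in (max(neg), min(pos)] is consistent; take the midpoint
    return (max(neg) + min(pos)) / 2
```

Note that any consistent choice of t must track the sample finely near the decision boundary, which is the mechanism behind the unbounded mutual information on worst-case distributions mentioned in the snippet.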

Towards a Unified Information-Theoretic Framework for Generalization [article]

Mahdi Haghifam, Gintare Karolina Dziugaite, Shay Moran, Daniel M. Roy
2021 arXiv   pre-print
We further show that an inherent limitation of proper learning of VC classes contradicts the existence of a proper learner with constant CMI, and it implies a negative resolution to an open problem of Steinke and Zakynthinou (2020).  ...  In this work, we investigate the expressiveness of the "conditional mutual information" (CMI) framework of Steinke and Zakynthinou (2020) and the prospect of using it to provide a unified framework for  ...  Acknowledgments The authors would like to thank Blair Bilodeau, Mufan Bill Li, and Jeffery Negrea for feedback on drafts of this work.  ... 
arXiv:2111.05275v2 fatcat:ji25ndu3kfatbocbuirlq3sbwy

Computable PAC Learning of Continuous Features

Nathanael Ackerman, Julian Asilis, Jieqi Di, Cameron Freer, Jean-Baptiste Tristan
2022 Proceedings of the 37th Annual ACM/IEEE Symposium on Logic in Computer Science  
We also give a presentation of a hypothesis class that does not admit any proper computable PAC learner with computable sample function, despite the underlying class being PAC learnable.  ...  We provide sufficient conditions on a hypothesis class to ensure that an empirical risk minimizer (ERM) is computable, and bound the strong Weihrauch degree of an ERM under more general conditions.  ...  ACKNOWLEDGMENTS The authors would like to thank Caleb Miller for valuable discussion on the topic, particularly in helping refine the notion of computable PAC learning and in describing the computable  ... 
doi:10.1145/3531130.3533330 fatcat:uosarl2vtjbh5g2hubkryi4fhi
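To make the ERM object concrete, here is a minimal sketch (ours, for a finite, explicitly enumerated class; the paper's concern is the far subtler continuous setting):

```python
def erm(hypotheses, sample):
    """Empirical risk minimizer over a finite, explicitly listed class.
    `hypotheses` is a list of callables h: X -> {0, 1}; `sample` is a list
    of (x, y) pairs. Returns a hypothesis of minimal empirical risk."""
    def empirical_risk(h):
        return sum(1 for x, y in sample if h(x) != y) / len(sample)
    return min(hypotheses, key=empirical_risk)
```

With continuous features even this one-liner stops being straightforward: evaluating h(x) on a computable real and comparing risks are themselves operations whose computability must be established, which is what the paper's sufficient conditions address.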

Black-box Certification and Learning under Adversarial Perturbations [article]

Hassan Ashtiani, Vinayak Pathak, Ruth Urner
2022 arXiv   pre-print
We analyze a PAC-type framework of semi-supervised learning and identify possibility and impossibility results for proper learning of VC-classes in this setting.  ...  We also consider the viewpoint of a black-box adversary that aims at finding adversarial examples, showing that the existence of an adversary with polynomial query complexity can imply the existence of  ...  Acknowledgements We thank the Vector Institute for providing us with the meeting space in which this work was developed! Ruth Urner and Hassan Ashtiani were supported by NSERC Discovery Grants.  ... 
arXiv:2006.16520v2 fatcat:zbbxo3rimzfanoaqzok75ovpbe
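The "black-box adversary with polynomial query complexity" can be illustrated by the classic binary-search attack. The sketch below (one-dimensional, names ours) locates a point within tol of a decision boundary using only O(log((x_pos - x_neg)/tol)) label queries:

```python
def boundary_search(f, x_neg, x_pos, tol=1e-6):
    """Black-box attack sketch: f is a classifier queried only for labels,
    with f(x_neg) == 0, f(x_pos) == 1, and x_neg < x_pos. Binary search
    over the segment returns a positively-labelled point within `tol` of
    the decision boundary, using O(log((x_pos - x_neg)/tol)) queries."""
    lo, hi = x_neg, x_pos
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) == 1:
            hi = mid        # a boundary lies in [lo, mid]
        else:
            lo = mid        # a boundary lies in [mid, hi]
    return hi
```

For example, `boundary_search(lambda x: int(x >= 0.3), 0.0, 1.0)` returns a point within 1e-6 of 0.3 after about 20 queries.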

Average-Case Information Complexity of Learning [article]

Ido Nachum, Amir Yehudayoff
2018 arXiv   pre-print
There exists a proper learning algorithm that reveals O(d) bits of information for most concepts in the class. This result is a special case of a more general phenomenon we explore.  ...  How many bits of information are revealed by a learning algorithm for a concept class of VC-dimension d?  ...  In some cases, for any consistent and proper algorithm, there is always a scenario in which a large amount of information is revealed.  ... 
arXiv:1811.09923v1 fatcat:orqzmdicsvdw3m7zpgpz4szxga
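The "bits of information revealed" in this line of work is the mutual information between the training sample and the learner's output; in standard notation (ours):

```latex
% Information revealed by a (possibly randomized) learner A on a sample
% S = ((x_1, c(x_1)), \dots, (x_m, c(x_m))) labelled by a concept c:
I\bigl(A(S);\, S\bigr)
% The cited result: for most concepts c in a class of VC dimension d,
% there is a proper learner achieving I(A(S); S) = O(d).
```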

On computable learning of continuous features [article]

Nathanael Ackerman and Julian Asilis and Jieqi Di and Cameron Freer and Jean-Baptiste Tristan
2021 arXiv   pre-print
We also give a presentation of a hypothesis class that does not admit any proper computable PAC learner with computable sample function, despite the underlying class being PAC learnable.  ...  We provide sufficient conditions for learners that are empirical risk minimizers (ERM) to be computable, and bound the strong Weihrauch degree of an ERM learner under more general conditions.  ...  Acknowledgements The authors would like to thank Caleb Miller for valuable discussion on the topic, particularly in helping refine the notion of computable PAC learning and in describing the computable  ... 
arXiv:2111.14630v1 fatcat:bd4rfuldhfat5p4rxtqr7keply

On Communication Complexity of Classification Problems [article]

Daniel M. Kane and Roi Livni and Shay Moran and Amir Yehudayoff
2018 arXiv   pre-print
For example, we provide combinatorial characterizations of the classes that can be learned with efficient communication in the proper case as well as in the improper case.  ...  This work studies distributed learning in the spirit of Yao's model of communication complexity: consider a two-party setting, where each of the players gets a list of labelled examples and they communicate  ...  Acknowledgements We thank Abbas Mehrabian and Ruth Urner for insightful discussions.  ... 
arXiv:1711.05893v3 fatcat:iidwoxfatnf5fongszb633kygm
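Informally, the two-party model in question can be rendered as follows (a schematic of ours; the paper studies several variants, e.g. proper vs. improper and deterministic vs. randomized protocols):

```latex
% Two-party learning protocol in the spirit of Yao's model:
% Alice holds a labelled sample S_A, Bob holds S_B, both labelled by an
% unknown h^* \in \mathcal{H}. They exchange bits according to a protocol
% \pi and each outputs a hypothesis that correctly labels S_A \cup S_B.
% The communication complexity of \mathcal{H} is the worst-case number of
% bits exchanged by the best protocol:
\mathrm{CC}(\mathcal{H}) \;=\; \min_{\pi}\; \max_{S_A,\, S_B}\; \bigl|\pi(S_A, S_B)\bigr|
```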

Learners that Use Little Information [article]

Raef Bassily, Shay Moran, Ido Nachum, Jonathan Shafer, Amir Yehudayoff
2018 arXiv   pre-print
On the other hand, we show that in the distribution-dependent setting every VC class has empirical risk minimizers that do not reveal a lot of information.  ...  We discuss an approach that allows us to prove upper bounds on the amount of information that algorithms reveal about their inputs, and also provide a lower bound by showing a simple concept class for  ...  A Lower Bound on Information In this section we show that any proper consistent learner for the class of thresholds cannot use only little information with respect to all realizable distributions D.  ... 
arXiv:1710.05233v3 fatcat:qyyh72a4fbeizgczpsa5m6ix4m

Limits of Private Learning with Access to Public Data [article]

Noga Alon, Raef Bassily, Shay Moran
2019 arXiv   pre-print
We study the limits of learning in this setting in terms of private and public sample complexities.  ...  We show that any hypothesis class of VC-dimension d can be agnostically learned up to an excess error of α using only (roughly) d/α public examples and d/α^2 private labeled examples.  ...  One can show that in this setting every VC class can be learned privately with (roughly) the same sample complexity as in the standard, non-private, case.  ... 
arXiv:1910.11519v1 fatcat:nv7pcfvylrhhrlm4ro4awelcoe
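Plugging numbers into the stated bounds makes the asymmetry concrete (constants and logarithmic factors suppressed):

```latex
% Public/private sample complexities for agnostically learning a class of
% VC dimension d up to excess error \alpha (up to constants and logs):
m_{\mathrm{pub}} = \tilde{O}\!\left(\frac{d}{\alpha}\right), \qquad
m_{\mathrm{priv}} = \tilde{O}\!\left(\frac{d}{\alpha^{2}}\right)
% Illustration: d = 10, \alpha = 0.1 gives roughly 100 public examples and
% 1000 private labelled examples -- public data is cheaper by 1/\alpha.
```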

Sample-Efficient Learning of Mixtures [article]

Hassan Ashtiani, Shai Ben-David, Abbas Mehrabian
2018 arXiv   pre-print
Our mixture learning algorithm has the property that, if the F-learner is proper/agnostic, then the F^k-learner would be proper/agnostic as well.  ...  This general result enables us to improve the best known sample complexity upper bounds for a variety of important mixture classes.  ...  We would like to thank the reviewers of the ALT conference and also Yaoliang Yu for pointing out mistakes in earlier versions of this paper.  ... 
arXiv:1706.01596v3 fatcat:lcxyplmqzzhsxpi4dabke6esty
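Here F is a class of distributions and F^k denotes its k-mixtures; in the usual notation (ours):

```latex
% k-mixtures of a class F of distributions:
\mathcal{F}^{k} \;=\; \Bigl\{\, \textstyle\sum_{i=1}^{k} w_i f_i \;:\;
  f_1, \dots, f_k \in \mathcal{F},\ w_i \ge 0,\ \textstyle\sum_{i=1}^{k} w_i = 1 \,\Bigr\}
```

The result above says a learner for F lifts to a learner for F^k with modest overhead, preserving properness and agnosticity.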

VC Classes are Adversarially Robustly Learnable, but Only Improperly [article]

Omar Montasser, Steve Hanneke, Nathan Srebro
2019 arXiv   pre-print
The requirement of being improper is necessary as we exhibit examples of hypothesis classes H with finite VC dimension that are not robustly PAC learnable with any proper learning rule.  ...  We study the question of learning an adversarially robust predictor. We show that any hypothesis class H with finite VC dimension is robustly PAC learnable with an improper learning rule.  ...  And so, even though vc(H) = ∞, a single example suffices to inform the learner of whether to produce the all-positive or all-negative function.  ... 
arXiv:1902.04217v2 fatcat:r2dmtirarnaoxhs4ljojlohcsm
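Robust PAC learnability here is with respect to the adversarial risk; a standard formulation (perturbation set U(x), e.g. an l_p-ball around x; notation ours):

```latex
% Adversarially robust risk of a hypothesis h under distribution D and
% perturbation map U:
R_{U}(h; \mathcal{D}) \;=\;
  \Pr_{(x,y)\sim \mathcal{D}}\bigl[\, \exists\, x' \in U(x) : h(x') \neq y \,\bigr]
```

The paper's separation says this risk can be driven down by some learning rule for every finite-VC class H, but in general only by rules whose output falls outside H.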

Learning Privately with Labeled and Unlabeled Examples [article]

Amos Beimel, Kobbi Nissim, Uri Stemmer
2015 arXiv   pre-print
The first construction is of learners where the labeled sample complexity is proportional to the VC dimension of the concept class; however, the unlabeled sample complexity of the algorithm is as big as  ...  In addition, we show that in some settings the labeled sample complexity does not depend on the privacy parameters of the learner.  ...  We thank Aryeh Kontorovich, Adam Smith, and Salil Vadhan for helpful discussions of ideas in this work.  ... 
arXiv:1407.2662v3 fatcat:lo6zgyoi5bef7dg5lecwqeodwe

Closure Properties for Private Classification and Online Prediction [article]

Noga Alon, Amos Beimel, Shay Moran, Uri Stemmer
2020 arXiv   pre-print
The improved bounds on the sample complexity of private learning are derived algorithmically via transforming a private learner for the original class to a private learner for the composed class H'.  ...  Using the same ideas we show that any (proper or improper) private algorithm that learns a class of functions in the realizable case (i.e., when the examples are labeled by some function in the class)  ... 
arXiv:2003.04509v3 fatcat:czvzmrnvkvbpxgkzlvyew23tyy
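The "composed class" is obtained by applying a fixed aggregation rule to members of the base classes; schematically (notation ours, with g e.g. a conjunction, disjunction, or majority):

```latex
% Composition of classes H_1, ..., H_k under an aggregator g:
\mathcal{H}' \;=\; \bigl\{\, x \mapsto g\bigl(h_1(x), \dots, h_k(x)\bigr) \;:\;
  h_i \in \mathcal{H}_i \,\bigr\}, \qquad g : \{0,1\}^{k} \to \{0,1\}
```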

Boosting Simple Learners [article]

Noga Alon and Alon Gonen and Elad Hazan and Shay Moran
2022 arXiv   pre-print
Formally, we assume the class of weak hypotheses has a bounded VC dimension.  ...  (ii) Expressivity: Which tasks can be learned by boosting weak hypotheses from a bounded VC class? Can complex concepts that are "far away" from the class be learned?  ...  Acknowledgements We thank Yoav Freund and Rob Schapire for useful discussions.  ... 
arXiv:2001.11704v4 fatcat:a3opfqebajceldksxlgcfsgfrm
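The canonical way to boost a bounded-VC weak class is AdaBoost over, e.g., decision stumps (a class of constant VC dimension); a minimal self-contained sketch (ours, not the paper's algorithm):

```python
import math

def best_stump(sample, w):
    """Weak learner: exhaustively pick the threshold stump s * sign(x - t)
    with least weighted error. Stumps form a class of constant VC dimension."""
    best, best_err = None, float("inf")
    for t, _ in sample:                        # candidate thresholds at data points
        for s in (1, -1):
            h = lambda x, t=t, s=s: s if x >= t else -s
            err = sum(wi for wi, (x, y) in zip(w, sample) if h(x) != y)
            if err < best_err:
                best, best_err = h, err
    return best

def adaboost(sample, rounds=20):
    """AdaBoost: combine weak stumps into a weighted majority vote.
    `sample` is a list of (x, y) with x real and y in {-1, +1}."""
    n = len(sample)
    w = [1.0 / n] * n
    ensemble = []                              # list of (alpha, stump) pairs
    for _ in range(rounds):
        h = best_stump(sample, w)
        err = sum(wi for wi, (x, y) in zip(w, sample) if h(x) != y)
        err = min(max(err, 1e-12), 1 - 1e-12)  # guard the log below
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # re-weight: up-weight mistakes, down-weight correct points
        w = [wi * math.exp(-alpha * y * h(x)) for wi, (x, y) in zip(w, sample)]
        total = sum(w)
        w = [wi / total for wi in w]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```

The expressivity question quoted above asks which target concepts such weighted majority votes over a bounded-VC base class can represent, including concepts "far away" from the base class.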
Showing results 1–15 of 954.