10 Hits in 5.5 sec

PAC learnability of a concept class under non-atomic measures: a problem by Vidyasagar [article]

Vladimir Pestov
2010 arXiv   pre-print
Vidyasagar, we state a necessary and sufficient condition for distribution-free PAC learnability of a concept class C under the family of all non-atomic (diffuse) measures on the domain Ω.  ...  Besides, learnability of C under non-atomic measures does not imply the uniform Glivenko-Cantelli property with regard to non-atomic measures.  ...  are PAC learnable under the family P na (Ω) of all non-atomic probability measures on Ω.  ... 
arXiv:1006.5090v1 fatcat:3qkto4dfr5cavhya5wlzo7vuny

PAC learnability under non-atomic measures: a problem by Vidyasagar [article]

Vladimir Pestov
2012 arXiv   pre-print
In response to a 1997 problem of M. Vidyasagar, we state a criterion for PAC learnability of a concept class C under the family of all non-atomic (diffuse) measures on the domain Ω.  ...  sufficient but not necessary for PAC learnability under non-atomic measures.  ...  Of course the remaining imperfections are all author's own.  ... 
arXiv:1105.5669v3 fatcat:hzakh5nqqvhlrcl5rbbmvtvdwm

A Note on Sample Complexity of Learning Binary Output Neural Networks under Fixed Input Distributions

V Pestov
2010 2010 Eleventh Brazilian Symposium on Neural Networks  
We show that the learning sample complexity of a sigmoidal neural network constructed by Sontag (1992) required to achieve a given misclassification error under a fixed purely atomic distribution can grow  ...  The rate can be superexponential, a non-recursive function, etc. We further observe that Sontag's ANN is not Glivenko-Cantelli under any input distribution having a non-atomic part.  ...  Does there exist a non-atomic probability measure on R under which the Sontag ANN is PAC learnable? Problem 3.  ... 
doi:10.1109/sbrn.2010.10 dblp:conf/sbrn/Pestov10 fatcat:tuq5na7rendbdl5mdvt2qua5xu

Guest Editors' foreword

Marcus Hutter, Frank Stephan, Vladimir Vovk, Thomas Zeugmann
2013 Theoretical Computer Science  
The topic of Pestov's paper is already explained by its title, PAC learnability under non-atomic measures: A problem by Vidyasagar.  ...  In 1997 Vidyasagar posed the problem of characterizing learnability under non-atomic distributions, where a distribution D is said to be non-atomic if every set A with D(A) > 0 has a subset B with 0 <  ... 
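The non-atomicity condition quoted in this foreword is the standard measure-theoretic one; written out in full (a restatement, not taken from the paper):

```latex
D \text{ is non-atomic} \iff
\forall A \ \bigl( D(A) > 0 \implies \exists B \subseteq A : \ 0 < D(B) < D(A) \bigr).
```

Equivalently, no single point (atom) carries positive mass that cannot be split; any continuous distribution on $\mathbb{R}$, such as the uniform distribution on $[0,1]$, is non-atomic, while any purely discrete distribution is not.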
doi:10.1016/j.tcs.2012.10.007 fatcat:ciit7zasgzgytfxx66tzb5onku

Editors' Introduction [chapter]

Sanjay Jain, Rémi Munos, Frank Stephan, Thomas Zeugmann
2013 Lecture Notes in Computer Science  
He also studied the learnability of regular languages and context-free languages; a sample result, obtained in collaboration with Franck Thollard, is that the class of regular languages can be PAC-learned  ...  using a polynomial amount of data and processing time, provided that the distributions of the samples are restricted to be generated by one of a large family of related probabilistic deterministic finite  ...  The purpose of Vladimir Pestov's work is explained already in its title, PAC Learnability of a Concept Class under Non-Atomic Measures: A Problem by Vidyasagar.  ... 
doi:10.1007/978-3-642-40935-6_1 fatcat:pchrsvhjezfbvh6dfplqhxhgcy

Predictive PAC Learning and Process Decompositions [article]

Cosma Rohilla Shalizi, Aryeh Kontorovich
2013 arXiv   pre-print
We informally call a stochastic process learnable if it admits a generalization error approaching zero in probability for any concept class with finite VC-dimension (IID processes are the simplest example  ...  In particular, we give a novel PAC generalization bound for mixtures of learnable processes with a generalization error that is not worse than that of each mixture component.  ...  The latter is an immediate consequence of the VC-dimension characterization of PAC learnability: Theorem 1 Suppose that the concept class H is PAC learnable from IID samples.  ... 
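The snippet invokes the VC-dimension characterization of PAC learnability (a class is distribution-free PAC learnable iff its VC dimension is finite). As a toy illustration, unrelated to the paper's specific processes, the VC dimension of a small hypothesis class can be checked by brute-force shattering; the threshold class and domain below are illustrative assumptions:

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True iff the hypothesis class realizes all 2^|points| labelings of points."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def vc_dimension(hypotheses, domain, max_d=5):
    """Largest k <= max_d such that some k-subset of domain is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
    return d

# Threshold classifiers on the line: h_t(x) = 1 iff x >= t.
thresholds = [lambda x, t=t: int(x >= t) for t in range(-1, 12)]
domain = list(range(10))
print(vc_dimension(thresholds, domain))  # thresholds shatter singletons but no pair -> 1
```

Thresholds can realize both labels on any single point but never the labeling (1, 0) on an ordered pair, so the brute-force search reports VC dimension 1, matching the textbook value for this class.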
arXiv:1309.4859v1 fatcat:xurmzcu7cbhk7faopcedo4c5wi

An Approach to One-Bit Compressed Sensing Based on Probably Approximately Correct Learning Theory [article]

Mehmet Eren Ahsen, Mathukumalli Vidyasagar
2017 arXiv   pre-print
In this paper, the problem of one-bit compressed sensing (OBCS) is formulated as a problem in probably approximately correct (PAC) learning.  ...  By coupling this estimate with well-established results in PAC learning theory, we show that a consistent algorithm can recover a k-sparse vector with O(k log(n/k)) measurements, given only the signs of the  ...  The concept class C is said to be PAC learnable under the family of probability measures P if there exists a PAC algorithm.  ... 
arXiv:1710.07973v1 fatcat:dt6pv4rijfg3hobxia4fjukms4

Policy Transforms and Learning Optimal Policies [article]

Thomas M. Russell
2020 arXiv   pre-print
We characterize learnability of a set of policy options by the existence of a decision rule that closely approximates the maximin optimal value of the policy transform with high probability.  ...  We study the problem of choosing optimal policy rules in uncertain environments using models that may be incomplete and/or partially identified.  ...  Definition A.4 (Agnostic PAC Learnability).  ... 
arXiv:2012.11046v1 fatcat:z2awheu3gnewdaocb24w2jf544

Predictive PAC Learning and Process Decompositions

Cosma Rohilla Shalizi, Aryeh Kontorovich
Advances in Neural Information Processing Systems  
We informally call a stochastic process learnable if it admits a generalization error approaching zero in probability for any concept class with finite VC-dimension (IID processes are the simplest example  ...  In particular, we give a novel PAC generalization bound for mixtures of learnable processes with a generalization error that is not worse than that of each mixture component.  ...  The latter is an immediate consequence of the VC-dimension characterization of PAC learnability: Theorem 1 Suppose that the concept class H is PAC learnable from IID samples.  ... 
pmid:26321855 pmcid:PMC4551412 fatcat:giynhtyuufbnhmlayzbsrhbani

Learning probabilistic automata with SMT solving

Dario Veltri, Joshua Moerman, Joost-Pieter Katoen, Thomas Noll
2021
Our algorithm then encodes the learning problem in the existential theory of the reals, to ensure that the result is itself a probabilistic automaton.  ...  We present an active learning algorithm capable of finding a probabilistic automaton computing the distribution yielded by a stochastic system, given that it can be modelled as such.  ...  Learning Weighted Finite Automata In this section we will explore the learnability of weighted finite automata, a more general class of automata.  ... 
doi:10.18154/rwth-2021-03668 fatcat:7xcczh5mufamvh3wizu4m3bpiy