Learning logic programs with random classification noise
[chapter]
1997
Lecture Notes in Computer Science
We review the polynomial PAC learnability of nonrecursive, determinate, constant-depth Horn clauses in the presence of such noise. ...
Also, we show that arbitrary nonrecursive Horn clauses with forest background knowledge remain polynomially PAC learnable in the presence of noise. ...
Kearns' noise-tolerant learning algorithms in the random classification model. ...
doi:10.1007/3-540-63494-0_63
fatcat:ecttkvrzo5eytjdjnpidqai5lm
Uniform-distribution attribute noise learnability
1999
Proceedings of the twelfth annual conference on Computational learning theory - COLT '99
We study the problem of PAC-learning Boolean functions with random attribute noise under the uniform distribution. ...
In fact, we show that this style algorithm is the best possible for learning in the presence of attribute noise. ...
They showed that if the product noise rates p_i are unknown, then no PAC-learning algorithm exists that can tolerate a noise rate higher than 2ϵ, where ϵ is the required-accuracy parameter for PAC learning ...
doi:10.1145/307400.307414
dblp:conf/colt/BshoutyJT99
fatcat:vjcvnk6ngvazfinlp7fxrb2fay
Uniform-distribution attribute noise learnability
2003
Information and Computation
We study the problem of PAC-learning Boolean functions with random attribute noise under the uniform distribution. ...
In fact, we show that this style algorithm is nearly the best possible for learning in the presence of attribute noise. ...
They showed that if the product noise rates p_i are unknown, then no PAC-learning algorithm exists that can tolerate a noise rate higher than 2ϵ, where ϵ is the required-accuracy parameter for PAC-learning ...
doi:10.1016/s0890-5401(03)00135-4
fatcat:422zu2rhvfev5ov7syq4akv7xm
On Noise-Tolerant Learning of Sparse Parities and Related Problems
[chapter]
2011
Lecture Notes in Computer Science
For learning parities on r out of n variables, we give an algorithm that runs in time poly(log(1/δ), 1/(1−2η)) · n^((1 + (2η)^2 + o(1)) r/2) and uses only r log(n/δ) ω(1)/(1−2η)^2 samples in the random noise setting ...
Even though efficient algorithms for learning sparse parities in the presence of noise would have major implications to learning other hypothesis classes, our work is the first to give a bound better than ...
While the parity problem can be solved efficiently in the noise-free PAC setting using Gaussian elimination, the search of an efficient noise-tolerant algorithm has run into serious barriers. ...
doi:10.1007/978-3-642-24412-4_32
fatcat:znayc6hykre4tc776e3oaokd4u
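The snippet above notes that, without noise, the parity problem is solvable efficiently by Gaussian elimination in the PAC setting. As a hedged illustration of that noise-free baseline (a minimal sketch, not code from the listed paper; the function name `learn_parity_gf2` is my own), here is Gaussian elimination over GF(2) recovering a consistent parity vector from clean examples:

```python
import numpy as np

def learn_parity_gf2(X, y):
    """Recover a parity vector w over GF(2) from noise-free labeled
    examples by Gaussian elimination, so (X @ w) % 2 == y on the data."""
    n_samples, n_vars = X.shape
    # Augment the system [X | y] and row-reduce over GF(2).
    A = np.hstack([X % 2, (y % 2).reshape(-1, 1)])
    row, pivots = 0, []
    for col in range(n_vars):
        # Find a pivot row with a 1 in this column.
        hits = np.nonzero(A[row:, col])[0]
        if hits.size == 0:
            continue
        pivot = row + hits[0]
        A[[row, pivot]] = A[[pivot, row]]
        # Eliminate this column from every other row (mod 2).
        for r in range(n_samples):
            if r != row and A[r, col] == 1:
                A[r] = (A[r] + A[row]) % 2
        pivots.append(col)
        row += 1
    # Read off one consistent solution (free variables set to 0).
    w = np.zeros(n_vars, dtype=int)
    for r, col in enumerate(pivots):
        w[col] = A[r, n_vars]
    return w
```

With random label noise this linear-algebraic shortcut breaks down, which is exactly the barrier the paper above addresses.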
Attribute-Efficient Learning of Halfspaces with Malicious Noise: Near-Optimal Label Complexity and Noise Tolerance
[article]
2021
arXiv
pre-print
Though recent works have established attribute-efficient learning algorithms under various types of label noise (e.g. bounded noise), it remains an open question when and how s-sparse halfspaces can be ...
Our algorithm can be straightforwardly tailored to the passive learning setting, and we show that the sample complexity is Õ((1/ϵ) s^2 log^5 d), which also enjoys attribute efficiency. ...
Our algorithm leverages the well-established margin-based active learning framework, with a particular treatment on attribute efficiency, label complexity, and noise tolerance. ...
arXiv:2006.03781v5
fatcat:naszgqupqvgtfp3a5jczlc5hfm
On the Power of Localized Perceptron for Label-Optimal Learning of Halfspaces with Adversarial Noise
[article]
2021
arXiv
pre-print
Prior to this work, existing online algorithms designed for tolerating the adversarial noise are subject to either label complexity polynomial in 1/ϵ, or suboptimal noise tolerance, or restrictive marginal ...
where ϵ∈ (0, 1) is the target error rate, our algorithm PAC learns the underlying halfspace with near-optimal label complexity of Õ(d · polylog(1/ϵ)) and sample complexity of Õ(d/ϵ). ...
We have presented the first attribute-efficient, label-efficient, and noise-tolerant algorithm in the online setting, under the general isotropic log-concave marginal distributions. ...
arXiv:2012.10793v3
fatcat:4w3t4m3ij5es3hddwbcxkohhpm
Efficient active learning of sparse halfspaces with arbitrary bounded noise
[article]
2021
arXiv
pre-print
Our active learning algorithm and its theoretical guarantees also immediately translate to new state-of-the-art label and sample complexity results for full-dimensional active and passive halfspace learning ...
efficient algorithm with label complexity Õ((s ln d/ϵ)^(2^poly(1/(1−2η)))), which is label-efficient only when the noise rate η is a fixed constant. ...
Main Algorithm We present Algorithm 1, our noise-tolerant attribute-efficient active learning algorithm, in this section. ...
arXiv:2002.04840v3
fatcat:zsakvyusvjdpzp63rh43jhmcoa
Learning juntas in the presence of noise
2007
Theoretical Computer Science
It is shown that large classes of Boolean concepts that depend on a small number of variables-so-called juntas-can be learned efficiently from random examples corrupted by random attribute and classification ...
For the attribute noise, we have to assume that it is generated by a product distribution since otherwise fault-tolerant learning is in general impossible: we construct a noise distribution P and a concept ...
Finally, we prove that without restricting the attribute noise distributions (for example to product distributions), noise-tolerant learning is in general impossible: we construct an attribute noise distribution ...
doi:10.1016/j.tcs.2007.05.014
fatcat:z43lsv55xbfgrhh6gxpn3vikiy
Learning Juntas in the Presence of Noise
[chapter]
2006
Lecture Notes in Computer Science
It is shown that large classes of Boolean concepts that depend on a small number of variables-so-called juntas-can be learned efficiently from random examples corrupted by random attribute and classification ...
For the attribute noise, we have to assume that it is generated by a product distribution since otherwise fault-tolerant learning is in general impossible: we construct a noise distribution P and a concept ...
Finally, we prove that without restricting the attribute noise distributions (for example to product distributions), noise-tolerant learning is in general impossible: we construct an attribute noise distribution ...
doi:10.1007/11750321_37
fatcat:yjdmdrxe45f7vg4rarvzb2x3wq
Specification and Simulation of Statistical Query Algorithms for Efficiency and Noise Tolerance
1998
Journal of Computer and System Sciences
The advantage of specifying learning algorithms in this model is that SQ algorithms can be simulated in the probably approximately correct (PAC) model, both in the absence and in the presence of noise. ...
In this paper, we introduce a new method for specifying statistical query algorithms based on a type of relative error and provide simulations in the noise-free and noise-tolerant PAC models which yield ...
Specifically, an SQ algorithm can be simulated in the PAC model in the presence of classification noise, malicious errors, attribute noise and even hybrid models combining these different noises [6 8, ...
doi:10.1006/jcss.1997.1558
fatcat:fzb5m5crrvbepo6qlcmw6jjrim
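The core idea behind simulating an SQ algorithm under classification noise is that a noisy estimate of a query can be corrected when the flip rate η is known. A minimal sketch of that correction for a single query (my own function name and framing, not the paper's implementation): under random classification noise, E_noisy[χ(x, ỹ)] = (1−η)·E[χ(x, f(x))] + η·E[χ(x, 1−f(x))], and the label-independent sum S = E[χ(x, 0) + χ(x, 1)] is directly estimable, so the clean value is (E_noisy − η·S)/(1 − 2η):

```python
import numpy as np

def simulate_sq_under_rcn(chi, xs, noisy_labels, eta):
    """Estimate the clean statistical query E[chi(x, f(x))] from examples
    whose labels were flipped independently with known rate eta < 1/2.

    Uses E_noisy = (1-eta)*Q + eta*(S - Q), where Q is the clean query
    value and S = E[chi(x,0) + chi(x,1)] does not depend on labels,
    giving Q = (E_noisy - eta*S) / (1 - 2*eta)."""
    noisy_est = np.mean([chi(x, y) for x, y in zip(xs, noisy_labels)])
    s_est = np.mean([chi(x, 0) + chi(x, 1) for x in xs])
    return (noisy_est - eta * s_est) / (1 - 2 * eta)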
The Power of Localization for Efficiently Learning Linear Separators with Noise
[article]
2018
arXiv
pre-print
We introduce a new approach for designing computationally efficient learning algorithms that are tolerant to noise, and demonstrate its effectiveness by designing algorithms with improved noise tolerance ...
that can tolerate a nearly information-theoretically optimal noise rate of η = Ω(ϵ). ...
algorithm's computation), we design an efficient algorithm that can learn with accuracy 1 − ϵ while tolerating an Ω(ϵ) noise rate. ...
arXiv:1307.8371v9
fatcat:t5p72u5hebfozdroj2x6y2hrui
Robust Learning under Strong Noise via SQs
[article]
2020
arXiv
pre-print
In this model, we show that every SQ learnable class admits an efficient learning algorithm with OPT + ϵ misclassification error for a broad class of noise models. ...
First, we build on a recent result showing noise tolerance of distribution-independently evolvable concept classes under Massart noise. ...
As a result, most RCN-tolerant PAC learning algorithms were either derived from the SQ model, or can be easily cast into it. ...
arXiv:2010.09106v1
fatcat:lpw5v37jv5ezzojb2vm5hynvyq
Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov noise
[article]
2021
arXiv
pre-print
We give a computationally-efficient PAC active learning algorithm for d-dimensional homogeneous halfspaces that can tolerate Massart noise (Massart and Nédélec, 2006) and Tsybakov noise (Tsybakov, 2004) ...
learning algorithms. ...
Selective sampling algorithms for cost-sensitive multiclass prediction. In International Conference on Machine Learning, pages 1220-1228. PMLR, 2013. ...
arXiv:2102.05312v2
fatcat:dao5abrlw5apfmiyzqo66iwmri
Noise Tolerant Variants of the Perceptron Algorithm
2007
Journal of Machine Learning Research
In particular the perceptron with margin is an effective method for tolerating noise and stabilizing the algorithm. ...
One type of algorithm aims for noise tolerance by replacing the last hypothesis of the perceptron with another hypothesis or a vote among hypotheses. ...
This is captured by the agnostic PAC learning model (Kearns et al., 1994) . ...
dblp:journals/jmlr/KhardonW07
fatcat:us7ye4vvzja57gxs5tnsdhqpla
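The "perceptron with margin" idea mentioned in this entry can be sketched briefly: instead of updating only on mistakes, update whenever the functional margin falls below a threshold, which stabilizes the hypothesis against occasional noisy labels. A minimal sketch under my own naming and parameter choices (not the paper's exact variant):

```python
import numpy as np

def margin_perceptron(X, y, margin=0.5, lr=0.1, epochs=50):
    """Perceptron with margin: update not only on misclassifications
    (y_i * <w, x_i> < 0) but on any margin violation
    (y_i * <w, x_i> < margin), a common noise-tolerance heuristic."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) < margin:  # margin violation (includes mistakes)
                w += lr * yi * xi
    return w
```

Setting `margin=0` recovers the classical perceptron; a positive margin forces extra updates near the boundary, trading more iterations for a more stable separator.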
Maximum Margin Coresets for Active and Noise Tolerant Learning
2007
International Joint Conference on Artificial Intelligence
learning in the presence of outlier noise. ...
We show various applications including a novel coreset based analysis of large margin active learning and a polynomial time (in the number of input data and the amount of noise) algorithm for agnostic ...
While there are many definitions of noise, such as attribute, label, and malicious noise, we consider a very general model, outlier noise. ...
dblp:conf/ijcai/Har-PeledRZ07
fatcat:xjj32d65gbbd3mk3dinpj74idi