A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2008; you can also visit the original URL. The file type is `application/pdf`.

### Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning
[chapter] · 2002 · *Lecture Notes in Computer Science*
doi:10.1007/3-540-36169-3_10
fatcat:nu6xunw5jfhfxam3d32k36auka

This allows adaptively solving problems whose solution is based on smooth boosting (like noise-tolerant boosting and DNF membership learning), while preserving the original (non-adaptive) solution's complexity ... We derive a lower bound for the final error achievable by boosting in the agnostic model and show that our algorithm actually achieves that accuracy (within a constant factor). ... Our upper bound on the final error is (1/(1/2 − β)) · err_D(F) + ζ, where ζ is any real such that the time complexity of the solution is polynomial in 1/ζ. ...
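For orientation while skimming these results: a booster is "smooth" when every distribution it hands the weak learner stays polynomially close to the base distribution, i.e. D_t(x) ≤ κ·D(x) for some κ polynomial in the learning parameters. The sketch below, my own illustration and not the algorithm from the entry above, enforces such a cap over a finite sample with a uniform base distribution:

```python
import numpy as np

def cap_distribution(weights, kappa):
    """Project a nonnegative weight vector onto the set of
    distributions over n sample points whose per-point mass is at
    most kappa/n, i.e. kappa-smooth w.r.t. the uniform distribution.

    Illustrative only: actual smooth boosters obtain smoothness from
    their weighting rule itself rather than by post-hoc capping.
    """
    n = len(weights)
    assert kappa >= 1, "kappa < 1 leaves no feasible distribution"
    cap = kappa / n
    p = np.asarray(weights, dtype=float)
    p /= p.sum()
    capped = np.zeros(n, dtype=bool)
    # Clip mass above the cap and hand the excess to uncapped points;
    # the capped set grows every round, so this ends in <= n rounds.
    while True:
        over = (p > cap + 1e-15) & ~capped
        if not over.any():
            return p
        excess = (p[over] - cap).sum()
        p[over] = cap
        capped |= over
        p[~capped] += excess / (~capped).sum()
```

With `kappa = 1` the only feasible output is the uniform distribution itself; larger `kappa` lets the booster concentrate proportionally more mass on hard examples while keeping the weak learner's job tractable.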
### Page 4863 of Mathematical Reviews Vol. , Issue 2004f
[page] · 2004 · *Mathematical Reviews*

(IL-TECH-C; Haifa); Gavinsky, Dmitry (IL-TECH-C; Haifa)

On boosting with polynomially bounded distributions. (English summary) J. Mach. Learn. Res. 3 (2002), Spec. Issue Comput. Learn. ... A smooth boosting algorithm constructs only distributions D_i for the weak learner which are polynomially near to the original distribution D, i.e. ...
### On Boosting with Optimal Poly-Bounded Distributions
[chapter] · 2001 · *Lecture Notes in Computer Science*
doi:10.1007/3-540-44581-1_32
fatcat:hcmhfchnyjc7rhtuu5krgiwfcu

We construct a framework which allows an algorithm to turn the distributions produced by some boosting algorithms into polynomially smooth distributions (w.r.t. the PAC oracle's distribution), with minimal ... Further, we explore the case of Freund and Schapire's AdaBoost algorithm, bounding its distributions to polynomially smooth. ... Definitions and Notation: We call a boosting algorithm producing only polynomially near-D distributions polynomially near-D (the word "polynomially" will be omitted sometimes). ...
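As a point of reference for the AdaBoost discussion above, here is the textbook AdaBoost distribution update (a sketch of the standard rule, not the paper's smoothed variant). The multiplicative reweighting is exactly why AdaBoost's distributions can drift exponentially far from the starting one:

```python
import numpy as np

def adaboost_reweight(d, y, h):
    """One round of the textbook AdaBoost distribution update.

    d : current distribution over the examples (sums to 1)
    y : true labels in {-1, +1}
    h : weak hypothesis predictions in {-1, +1}

    Returns (alpha, d_next). Misclassified points are up-weighted by
    exp(alpha) and correct ones down-weighted by exp(-alpha), so
    repeated rounds can pile mass onto a few hard examples -- the
    non-smoothness that work like the entry above aims to bound.
    """
    err = d[y != h].sum()  # weighted error of the weak hypothesis
    assert 0 < err < 0.5, "weak hypothesis must beat random guessing"
    alpha = 0.5 * np.log((1 - err) / err)
    d_next = d * np.exp(-alpha * y * h)
    return alpha, d_next / d_next.sum()
```

For instance, starting from the uniform distribution over four examples with one misclassified point, a single round already moves half of the total mass onto that one point.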
### An Efficient Membership-Query Algorithm for Learning DNF with Respect to the Uniform Distribution
1997 · *Journal of computer and system sciences (Print)*
doi:10.1006/jcss.1997.1533
fatcat:pys4awlasbd7zhijra7f2u7ffa

We present a membership-query algorithm for efficiently learning DNF with respect to the uniform distribution. ... The algorithm utilizes one of Freund's boosting techniques and relies on the fact that boosting does not require a completely distribution-independent weak learner. ... Yoav Freund graciously provided unpublished details about his boosting algorithm. Avrim Blum and Merrick Furst provided many useful comments on an early draft of this paper. ...
### Agnostic Boosting
[chapter] · 2001 · *Lecture Notes in Computer Science*
doi:10.1007/3-540-44581-1_33
fatcat:wc7x7pes6fcjlaldhpwajdtsae

error of an hypothesis from F under the distribution P (note that for some distributions the bound may exceed a half). ... While this generalization guarantee is significantly weaker than the one resulting from the known PAC boosting algorithms, one should note that the assumption required for a -weak agnostic learner is much ... We therefore work out an upper bound on the rate at which the boosting distributions may change. ...
### General Bounds on Statistical Query Learning and PAC Learning with Noise via Hypothesis Boosting
1998 · *Information and Computation*
doi:10.1006/inco.1998.2664
fatcat:ekl7hiex3rhpnbghyqzlw7662q

We derive general bounds on the complexity of learning in the Statistical Query model and in the PAC model with classification noise. ... The boosting is efficient and is used to show our main result of the first general upper bounds on the complexity of strong SQ learning. ... We may then use the boosting technique to arrive at a strong SQ learning algorithm with nearly optimal dependence on ε. ... bounded the number of queries and lower-bounded the minimum tolerance in terms of ...
### Boosting in the presence of noise
2003 · *Proceedings of the thirty-fifth ACM symposium on Theory of computing - STOC '03*
doi:10.1145/780542.780573
dblp:conf/stoc/KalaiS03
fatcat:ebwkgtcxqzgsrj4o3drr4cnoli

We also give a matching lower bound by showing that no efficient black-box boosting algorithm can boost accuracy beyond the noise rate (assuming that one-way functions exist). ... Boosting algorithms are procedures that "boost" low-accuracy weak learning algorithms to achieve arbitrarily high accuracy. ... Theorem 13 gives a lower bound on the accuracy level which any polynomial-time black-box boosting algorithm can achieve. ...
### Boosting in the presence of noise
2005 · *Journal of computer and system sciences (Print)*
doi:10.1016/j.jcss.2004.10.015
fatcat:yrv5mk3cnvdo5ojrar4zi7vxbe

We also give a matching lower bound by showing that no efficient black-box boosting algorithm can boost accuracy beyond the noise rate (assuming that one-way functions exist). ... Boosting algorithms are procedures that "boost" low-accuracy weak learning algorithms to achieve arbitrarily high accuracy. ... Theorem 13 gives a lower bound on the accuracy level which any polynomial-time black-box boosting algorithm can achieve. ...
### Distribution-Specific Agnostic Boosting
[article] · 2009 · *arXiv* pre-print
arXiv:0909.2927v1
fatcat:4ckz5ryasngmzmu4bxsbgietam

This allows boosting a distribution-specific weak agnostic learner to a strong agnostic learner with respect to the same distribution. ... ., 2008) follow the same strategy as boosting algorithms in the PAC model: the weak learner is executed on the same target function but over different distributions on the domain. ... Theorem 1.2: If C is efficiently agnostically learnable with respect to distribution D, then TH(W, C) is efficiently PAC learnable over D for any W upper-bounded by a polynomial in the learning parameters. ...
### Cryptographic hardness of distribution-specific learning
1993 · *Proceedings of the twenty-fifth annual ACM symposium on Theory of computing - STOC '93*
doi:10.1145/167088.167197
dblp:conf/stoc/Kharitonov93
fatcat:xkpthlaatzhthgqvnd63fbr6lq

Fortunately, there exist known boosting algorithms with the property that if the initial distribution is uniform, then all distributions constructed by the boosting algorithm will remain close to uniform. ... We show that the size bound for this parity - and hence the running time of our algorithm - depends directly on the extent of D's deviation from the uniform distribution. ...
### Page 4956 of Mathematical Reviews Vol. , Issue 99g
[page] · 1999 · *Mathematical Reviews*

From the proposed boosting method for the SQ model, general bounds on the complexity of SQ learning are derived. ... PAC learning with noise via hypothesis boosting. ...
### Algorithms and hardness results for parallel large margin learning
2011 · *Neural Information Processing Systems*
dblp:conf/nips/ServedioL11
fatcat:zpw3jhgzfne7xnel3r35qexcli

Our main negative result deals with boosting, which is a standard approach to learning large-margin halfspaces. ... In contrast, naive parallel algorithms that learn a γ-margin halfspace in time that depends polylogarithmically on n have Ω(1/γ^2) runtime dependence on γ. ... Composing polynomials constantly many times yields a polynomial, which gives the claimed bit-length bound for u^+. The first inequality is (9.50) from [3]. ...
### Learning DNF Expressions from Fourier Spectrum
[article] · 2013 · *arXiv* pre-print
arXiv:1203.0594v3
fatcat:b6n62mueyjaf3fxgb3rflbogza

We introduce a new approach to learning (or approximating) a polynomial threshold function which is based on creating a function with range [-1,1] that approximately agrees with the unknown function on ... This improves on the ((s·(ns/ε))^(s/ε)·(1/ε), n) bound of Servedio (2001). ... Such an algorithm is necessary since, in the boosting-based approach of Jackson (1997), the weak learner needs to learn with respect to distributions which depend on previous weak hypotheses. ...
### Open Problem: The Statistical Query Complexity of Learning Sparse Halfspaces
2014 · *Annual Conference Computational Learning Theory*
dblp:conf/colt/Feldman14
fatcat:vkvvrabtujf2rdlktijiibvjcu

This definition was originally stated in the context of the online mistake-bound model (Littlestone, 1987). ... In this problem the learner is given random examples labeled by an unknown halfspace function f on R^n. Further, f is r-sparse, that is, it depends on at most r out of n variables. ... A lower bound on the query complexity of the SQ algorithm gives a lower bound on its running time. ...

*Showing results 1 — 15 out of 25,133 results*