A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is `application/pdf`.

### Exact learning of subclasses of CDNF formulas with membership queries
[chapter] · 1996 · *Lecture Notes in Computer Science*

doi:10.1007/3-540-61332-3_151
fatcat:u6a2fcl3vvc5zg72crdgw2lqjq

In particular we show the exact learnability of read-*k* *monotone* CDNF formulas, Sat_k O(log n)-CDNF, and O(√log n)-size CDNF from membership queries only. ... We show how to combine known *learning* algorithms that use membership and equivalence queries to obtain new *learning* results with membership queries only. ... Furthermore, *nearly* *monotone* *k*-*term* *DNF* formulas [CGL97] (*monotone* *k*-*term* *DNF* formulas with a constant number of non-*monotone* literals per *term*) and *monotone* *k*-*term* decision lists [GLR97] are also known ...
### Optimal Cryptographic Hardness of Learning Monotone Functions
[chapter] · 2008 · *Lecture Notes in Computer Science*

doi:10.1007/978-3-540-70575-8_4
fatcat:v6zgngwvpzb27jhxalzfytr6d4

To date, the only negative result for *learning* *monotone* functions in this model is an information-theoretic lower bound showing that certain superpolynomial-size *monotone* circuits cannot be *learned* to ... optimal in *terms* of the circuit size parameter by known positive results as well (Servedio, Information and Computation '04). ... Thus the class of 2^O(√log n)-*term* *monotone* *DNF* can be *learned* to any constant accuracy in poly(n) time, but no such result is known for 2^O(√log n)-*term* general *DNF*. ...
### P-sufficient statistics for PAC learning k-term-DNF formulas through enumeration
2000 · *Theoretical Computer Science*

doi:10.1016/s0304-3975(98)00215-1
fatcat:y3efuhnh4jhf5caohkhwhwe7rm

Working in the framework of PAC-*learning* theory, we present special statistics for accomplishing in polynomial time proper *learning* of *DNF* boolean formulas having a fixed number of monomials. ... We develop a theory of most powerful *learning* for analyzing the performance of *learning* algorithms, with particular reference to trade-offs between power and computational costs. ... P using *k*-*term*-*DNF* representation; and the class of *monotone* *k*-*term*-*DNF* formulas is polynomially poly-relaxedly SQ-learnable w.r.t. P using *monotone* *k*-*term*-*DNF* representation. ...
### Extraction of Coverings as Monotone DNF Formulas
[chapter] · 2003 · *Lecture Notes in Computer Science*

doi:10.1007/978-3-540-39644-4_15
fatcat:tae2vdufbrhtrgsq5fqtslmguq

In this paper, we extend *monotone* monomials as large itemsets in association rule mining to *monotone* *DNF* formulas. ... Next, we design the algorithm *dnf* cover to extract coverings as *monotone* *DNF* formulas satisfying both the minimum support and the maximum overlap. ... [5], we can design the algorithm to extract coverings as *monotone* *k*-*term* *DNF* formulas, where *k* is the upper bound of the number of monomials. ...
### DNF are teachable in the average case
2007 · *Machine Learning*

doi:10.1007/s10994-007-5007-9
fatcat:krwctnwy7rfephfyyfo4cpeivi

As our main result, we extend Balbach's teaching result for 2-*term* *DNF* by showing that for any 1 ≤ s ≤ 2^Θ(n), the well-studied concept classes of at-most-s-*term* *DNF* and at-most-s-*term* *monotone* *DNF* each ... The proofs use detailed analyses of the combinatorial structure of "most" *DNF* formulas and *monotone* *DNF* formulas. ... The idea is to show that almost every at-most-s-*term* *monotone* *DNF* in fact has exactly s *terms*; as we will see, these exactly-s-*term* *monotone* *DNFs* can be taught very efficiently with O(ns) examples. ...
### DNF Are Teachable in the Average Case
[chapter] · 2006 · *Lecture Notes in Computer Science*

doi:10.1007/11776420_18
fatcat:xvjgunz2irhvvhmkjlscfbtdem

As our main result, we extend Balbach's teaching result for 2-*term* *DNF* by showing that for any 1 ≤ s ≤ 2^Θ(n), the well-studied concept classes of at-most-s-*term* *DNF* and at-most-s-*term* *monotone* *DNF* each ... The proofs use detailed analyses of the combinatorial structure of "most" *DNF* formulas and *monotone* *DNF* formulas. ... The idea is to show that almost every at-most-s-*term* *monotone* *DNF* in fact has exactly s *terms*; as we will see, these exactly-s-*term* *monotone* *DNFs* can be taught very efficiently with O(ns) examples. ...
### Tight Bounds on Proper Equivalence Query Learning of DNF
[article] · 2011 · *arXiv* (pre-print)

arXiv:1111.1124v1
fatcat:3nwcachm5bervhoxgpy4bg2uqm

We also give a new result on certificates for *DNF*-size, a simple algorithm for properly PAC-*learning* *DNF*, and new results on EQ-*learning* n-*term* *DNF* and decision trees. ... Using the lemma, we give the first subexponential algorithm for proper *learning* of *DNF* in Angluin's Equivalence Query (EQ) model. ... The above bound on seed size is *nearly* tight for a *monotone* *DNF* formula on n variables having √n disjoint *terms*, each of size √n. ...
### Preference Elicitation and Query Learning
[chapter] · 2003 · *Lecture Notes in Computer Science*

In this paper we explore the relationship between "preference elicitation", a *learning*-style problem that arises in combinatorial auctions, and the problem of *learning* via queries studied in computational *learning* theory. ... classes of *monotone* functions in machine *learning* is that of *monotone* *DNF* formulas. ...
### Learning with Unreliable Boundary Queries
1998 · *Journal of Computer and System Sciences (Print)*

doi:10.1006/jcss.1997.1559
fatcat:krwyws5rbrbjdncygbaklf26si

We also describe algorithms for *learning* several subclasses of *monotone* *DNF* formulas. ... We show how to *learn* the intersection of two halfspaces when membership queries near the boundary may be answered incorrectly. ... Finally return T ∪ D(1, T). *Learning* (r+1)-Separable *k*-*Term* *Monotone* *DNF* Formulas: We now show that a subclass of *monotone* *k*-*term* *DNF* formulas are properly learnable in the false-positive-only UBQ ...
### Learning with unreliable boundary queries
1995 · *Proceedings of the eighth annual conference on Computational learning theory (COLT '95)*

doi:10.1145/225298.225310
dblp:conf/colt/BlumCGS95
fatcat:7vhgbr3vdbfv7mvmg22ir5igvm

We also describe algorithms for *learning* several subclasses of *monotone* *DNF* formulas. ... We show how to *learn* the intersection of two halfspaces when membership queries near the boundary may be answered incorrectly. ... Finally return T ∪ D(1, T). *Learning* (r+1)-Separable *k*-*Term* *Monotone* *DNF* Formulas: We now show that a subclass of *monotone* *k*-*term* *DNF* formulas are properly learnable in the false-positive-only UBQ ...
### Efficiently Approximating Weighted Sums with Exponentially Many Terms
[chapter] · 2001 · *Lecture Notes in Computer Science*

doi:10.1007/3-540-44581-1_6
fatcat:azrll3l4yrca3ownzuohcnwlvu

The applications we examine are pruning classifier ensembles using WM and *learning* general *DNF* formulas using Winnow. ... So using the 2^n possible *terms* as Winnow's inputs, it can *learn* *k*-*term* *monotone* *DNF* with only 2 + 2kn prediction mistakes. ... Our algorithm implicitly enumerates all possible *DNF* *terms* and uses Winnow to *learn* a *monotone* disjunction over these *terms*, which it can do while making O(*k* log N) prediction mistakes, where *k* is the ...
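The Winnow step this snippet relies on, learning a monotone disjunction by multiplicative weight updates, can be sketched concretely. The following is a minimal illustration only: it runs Winnow over a small explicit feature set rather than the paper's implicit enumeration of all 2^n terms, and the toy target, threshold θ = n, and update factor α = 2 are assumptions for the demo.

```python
import itertools

def winnow_pass(examples, w, theta, alpha=2.0):
    """One pass of Winnow over (x, label) pairs; returns updated weights and mistake count."""
    mistakes = 0
    for x, y in examples:
        # Predict 1 iff the total weight of active (xi = 1) inputs reaches the threshold.
        pred = 1 if sum(wi for wi, xi in zip(w, x) if xi) >= theta else 0
        if pred != y:
            mistakes += 1
            if y == 1:   # false negative: promote weights of active inputs
                w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
            else:        # false positive: demote weights of active inputs
                w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w, mistakes

# Toy target: the monotone disjunction x0 OR x2 over n = 8 variables
# (target and n are assumptions for this demo, not from the paper).
n = 8
examples = [(x, int(x[0] or x[2])) for x in itertools.product((0, 1), repeat=n)]

w, theta = [1.0] * n, float(n)
total = 0
for _ in range(50):   # total mistakes are O(k log n), so a clean pass occurs well before 50
    w, m = winnow_pass(examples, w, theta)
    total += m
    if m == 0:
        break
```

Because only the weights of relevant variables are ever promoted on false negatives, they climb to the threshold in O(log n) doublings each, which is the source of the O(k log N) mistake bound quoted in the snippet.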
### Exact Learning of Formulas in Parallel
1997 · *Machine Learning*

doi:10.1023/a:1007320031970
dblp:journals/ml/Bshouty97
fatcat:wkec6qdpkncmfkwaucxeuu22z4

We investigate the parallel complexity of *learning* formulas from membership and equivalence queries. ... We show that many restricted classes of boolean functions cannot be efficiently *learned* in parallel with a polynomial number of processors. ... The classes are *k*-*term* *DNF* formulas for *k* = O(log n) (*DNF* with at most *k* *terms*), *monotone* *DNF* formulas (*DNF* with no negated variables) and *DNF* formulas. Result 3. ...
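The class definitions in this snippet (a *k*-term DNF has at most *k* terms; a monotone DNF has no negated variables) can be made concrete with a small sketch. The set-of-index-sets representation and the function name below are illustrative assumptions, not notation from the paper.

```python
# A monotone k-term DNF can be represented as a list of at most k terms,
# each term a set of variable indices (monotone: no negations allowed).
def eval_monotone_dnf(terms, x):
    """True iff some term has all of its variables set to 1 in x."""
    return any(all(x[i] for i in t) for t in terms)

# Example: f = (x0 AND x1) OR x2, a monotone 2-term DNF on 4 variables.
f = [{0, 1}, {2}]
eval_monotone_dnf(f, (1, 1, 0, 0))  # term {0, 1} is satisfied -> True
eval_monotone_dnf(f, (0, 1, 0, 1))  # no term is satisfied -> False
```

Monotonicity is visible directly in this representation: flipping any input bit from 0 to 1 can only satisfy more terms, never fewer.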
### Tight Bounds on ℓ_1 Approximation and Learning of Self-Bounding Functions
[article] · 2019 · *arXiv* (pre-print)

arXiv:1404.4702v3
fatcat:wnmchrdiubeujertmzg3zz6u5m

We show that both the degree and junta-size are optimal up to logarithmic *terms*. ... Our main result is a *nearly* tight ℓ_1-approximation of self-bounding functions by low-degree juntas. ... In fact, even PAC *learning* of non-*monotone* a-self-bounding functions requires time n^Ω(a/ε) assuming hardness of *learning* *k*-*term* *DNF* to accuracy 1/4 in time n^Ω(k). ...
### A Model of Interactive Teaching
1997 · *Journal of Computer and System Sciences (Print)*

doi:10.1006/jcss.1997.1491
fatcat:7dp7hsez3zhaxgcujby3dqj5pa

An important concept class that is not known to be learnable is *DNF* formulas. We demonstrate the power of our approach by providing a deterministic teacher and learner for the class of *DNF* formulas. ... In this paper we present an interactive model in which the learner has the ability to ask queries as in the query *learning* model of Angluin. ... We note that in some sense this result subsumes the T/L pair using the *monotone* theory since any function can be taught in time polynomial in its *DNF* size without regard for its *monotone* dimension. ...
### Page 4583 of Mathematical Reviews Vol., Issue 98G
[page] · 1998 · *Mathematical Reviews*

Peter Auer (Graz)
98g:68139 68T05 03B05 68Q99 Castro, Jorge (E-UPB-LI; Barcelona); Guijarro, David (E-UPB-LI; Barcelona); Lavin, Victor (E-UPB-LI; Barcelona)
*Learning* *nearly* *monotone* *k*-*term* *DNF*. ... Vitanyi's result for *monotone* *k*-*term* *DNF*." {For the entire collection see MR 98f:68007.} 98g:68140 68T05 68Q55 68Q68 68Q75 Edalat, A. (4-LNDIC-C; London) Domain theory in *learning* processes. ...
*Showing results 1 — 15 out of 339 results*