361 Hits in 5.0 sec

Restriction access

Zeev Dvir, Anup Rao, Avi Wigderson, Amir Yehudayoff
2012 Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (ITCS '12)  
We introduce a PAC-learning version of restriction access, and show that one can efficiently learn both decision trees and DNF formulas in this model.  ...  We introduce a notion of non-black-box access to computational devices (such as circuits, formulas, decision trees, and so forth) that we call restriction access.  ...  Section 4 describes our PAC-learning algorithm for decision trees, and Section 5 gives the algorithm for learning DNF formulas.  ... 
doi:10.1145/2090236.2090239 dblp:conf/innovations/DvirRWY12 fatcat:rnlesawmlng3rivrz3pdjzaanu

DNF-Net: A Neural Architecture for Tabular Data [article]

Ami Abutbul, Gal Elidan, Liran Katzir, Ran El-Yaniv
2020 arXiv   pre-print
(DNF) over affine soft-threshold decision terms.  ...  In addition, DNF-Net promotes localized decisions that are taken over small subsets of the features.  ...  Specifically, we analyze the VC-dimension of Boolean DNF formulas and compare it to that of decision trees.  ... 
arXiv:2006.06465v1 fatcat:mr2qlevggveztarjimfevaoohi

Net-DNF: Effective Deep Modeling of Tabular Data

Liran Katzir, Gal Elidan, Ran El-Yaniv
2021 International Conference on Learning Representations  
(DNF) over affine soft-threshold decision terms.  ...  Net-DNFs also promote localized decisions that are taken over small subsets of the features.  ...  Specifically, we analyze the VC-dimension of Boolean DNF formulas and compare it to that of decision trees.  ... 
dblp:conf/iclr/0001EE21 fatcat:gngum5ogdfhn3ere23wehxe4pa
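The "disjunctive normal form over affine soft-threshold decision terms" idea from the two snippets above can be made concrete: each soft literal is a sigmoid of an affine function of the input, each term is a soft AND (product of its literals), and the formula output is a soft OR (max over terms). The following NumPy sketch illustrates only that composition under these assumptions; it is not the authors' Net-DNF implementation, and all names are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_dnf(x, W, b, term_groups):
    """Evaluate a soft DNF: OR over terms, each term an AND of
    affine soft-threshold literals sigmoid(w . x + b)."""
    lits = sigmoid(W @ x + b)                        # soft truth value of each literal
    terms = [np.prod(lits[g]) for g in term_groups]  # soft AND via product
    return max(terms)                                # soft OR via max
```

With sharp thresholds (large weights) the output approaches the value of the corresponding hard Boolean DNF.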

Truth Table Minimization of Computational Models [article]

Netanel Raviv
2013 arXiv   pre-print
We shall present several new hardness results and efficient algorithms, as well as new proofs and extensions for known theorems, for variants of decision trees, formulas and branching programs.  ...  Complexity theory offers a variety of concise computational models for computing Boolean functions - branching programs, circuits, decision trees and ordered binary decision diagrams to name a few.  ...  We denote by Σ_k a depth-k formula with top gate ∨ and by Π_k a depth-k formula with top gate ∧. E.g., a Σ_2 formula is a DNF.  ... 
arXiv:1306.3766v1 fatcat:ht7ra6ja7faqfjuxqejkkzlmeq
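The Σ_k/Π_k notation in the snippet above is easy to make concrete for k = 2: a Σ_2 formula is an OR of ANDs (a DNF), a Π_2 formula is an AND of ORs (a CNF), and two such formulas can represent the same function. A minimal sketch (the monotone formulas below are illustrative, not taken from the paper):

```python
from itertools import product

def sigma2(terms, x):
    """Σ_2 formula: top gate ∨ over ∧-terms, i.e. a DNF."""
    return any(all(x[i] for i in t) for t in terms)

def pi2(clauses, x):
    """Π_2 formula: top gate ∧ over ∨-clauses, i.e. a CNF."""
    return all(any(x[i] for i in c) for c in clauses)

# (x0 ∧ x1) ∨ (x1 ∧ x2) and x1 ∧ (x0 ∨ x2) compute the same function:
dnf = [(0, 1), (1, 2)]
cnf = [(1,), (0, 2)]
```

Checking equality over the full truth table (2^3 points here) is exactly the kind of brute-force comparison that truth-table minimization starts from.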

Synthesizing entity matching rules by examples

Rohit Singh, Venkata Vamsikrishna Meduri, Ahmed Elmagarmid, Samuel Madden, Paolo Papotti, Jorge-Arnulfo Quiané-Ruiz, Armando Solar-Lezama, Nan Tang
2017 Proceedings of the VLDB Endowment  
Consequently, they are more interpretable than decision trees and other machine learning algorithms that output deep trees with many branches.  ...  Extensive experiments show that we outperform other interpretable rules (e.g., decision trees with low depth) in effectiveness, and are comparable with non-interpretable tools (e.g., decision trees with  ...  Exp-2: Effectiveness vs. Interpretable Decision Trees.  ... 
doi:10.14778/3149193.3149199 fatcat:2sekwu3kzbakndo2otlaymeqem

Learning and Smoothed Analysis

Adam Tauman Kalai, Alex Samorodnitsky, Shang-Hua Teng
2009 2009 50th Annual IEEE Symposium on Foundations of Computer Science  
In this model, we analyze two new algorithms, for PAC-learning DNFs and agnostically learning decision trees, from random examples drawn from constant-bounded product distributions.  ...  We give a new model of learning motivated by smoothed analysis (Spielman and Teng, 2001).  ...  We would like to thank Ran Raz, Madhu Sudan, Ryan O'Donnell, and Prasad Tetali for very helpful comments.  ... 
doi:10.1109/focs.2009.60 dblp:conf/focs/KalaiST09 fatcat:vescwkftgra3vkpbnk5a4gpxli

A Synthesis of Pseudo-Boolean Empirical Models by Precedential Information

V.I. Donskoy
2018 Bulletin of the South Ural State University Series Mathematical Modelling Programming and Computer Software  
The paper shows how to use binary decision trees to construct a disjunctive constraint, and proposes methods to identify the monotonicity and linearity properties of the partially defined objective  ...  When extracting the model from the data, the DNF constraint is synthesized approximately but with polynomial complexity, and the number of conjunctions in the extracted DNF does not exceed the number of  ...  P(F, l) < 2^{−l + p·VCD(  ...  If binary decision trees are used to extract such regularity as a DNF constraint, then the family F is a class BDT_{n,µ} of trees with at most µ leaves and from n Boolean variables.  ... 
doi:10.14529/mmp180208 fatcat:aesf62aam5dyrmu742n4v2dm7y

On the Complexity of the Multiplication Method for Monotone CNF/DNF Dualization [chapter]

Khaled M. Elbassioni
2006 Lecture Notes in Computer Science  
Given the irredundant CNF representation φ of a monotone Boolean function f : {0, 1} n → {0, 1}, the dualization problem calls for finding the corresponding unique irredundant DNF representation ψ of f  ...  The (generalized) multiplication method works by repeatedly dividing the clauses of φ into (not necessarily disjoint) groups, multiplying-out the clauses in each group, and then reducing the result by  ...  They achieved this by presenting a quasi-polynomial time algorithm for the decision-version of the problem: given two monotone Boolean formulae φ and ψ in CNF and DNF forms respectively, is φ ≡ ψ?  ... 
doi:10.1007/11841036_32 fatcat:n52gy6g6rvhd5iq2psugmejdja
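The multiplication method described in the snippet above can be illustrated in its most naive form: multiply out all clauses in one shot (choose one variable from each clause, by distributivity), then reduce by discarding non-minimal terms. This sketch deliberately ignores the grouping strategy the paper actually analyzes; it only shows what "multiplying-out and reducing" computes for a monotone CNF.

```python
from itertools import product

def multiply_out(cnf):
    """Naive monotone CNF -> DNF dualization: pick one variable from
    each clause (distributivity), then keep only the minimal terms."""
    terms = [frozenset(choice) for choice in product(*cnf)]
    # reduction step: drop any term that strictly contains another term
    return {t for t in terms if not any(o < t for o in terms)}
```

For (x1 ∨ x2) ∧ (x2 ∨ x3) this yields the prime implicants {x2} and {x1, x3}; the exponential blow-up of the unreduced product is exactly what the grouped method tries to control.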

More Accurate Learning of k-DNF Reference Classes

Brendan Juba, Hengxuan Li
2020 Proceedings of the AAAI Conference on Artificial Intelligence  
We present new algorithms for computing k-DNF reference classes and establish much stronger approximation guarantees for their error rates.  ...  In machine learning, predictors trained on a given data distribution are usually guaranteed to perform well on further examples from the same distribution on average.  ...  ‖a*‖_2 ≤ b and k-DNFs h* such that h*(x*) = 1 and Pr[h*(x) = 1] ≥ μ.  ... 
doi:10.1609/aaai.v34i04.5864 fatcat:ifcuh7urjvgjrai3hi4kgpxcrq

Learning Simple Concepts under Simple Distributions

Ming Li, Paul M. B. Vitányi
1991 SIAM journal on computing (Print)  
Li and P.M.B. Vitanyi, An Introduction to Kolmogorov  ...  John suggested the need for Lemma 2; Peter Gács suggested turning semimeasures like m and M into measures by concentrating the surplus probability on an undefined symbol.  ...  Peter Gács, Gloria Kissin, Ray Solomonoff, and John Tromp commented on the manuscript.  ... 
doi:10.1137/0220056 fatcat:kmyydxxkfrfhjbukltykwlihje

The Fourier Entropy–Influence Conjecture for Certain Classes of Boolean Functions [chapter]

Ryan O'Donnell, John Wright, Yuan Zhou
2011 Lecture Notes in Computer Science  
We also verify the conjecture for functions computable by read-once decision trees.  ...  In 1996, Friedgut and Kalai made the Fourier Entropy-Influence Conjecture: For every Boolean function f, H[f̂²] ≤ C · I[f], where H[f̂²] is the spectral entropy of f, I[f] is the total influence of f, and C is a universal constant.  ...  The authors would like to thank Rocco Servedio, Li-Yang Tan, and Andrew Wan for sharing their insights on the Entropy-Influence Conjecture.  ... 
doi:10.1007/978-3-642-22006-7_28 fatcat:shwga3nyenas7bh7is7hvwby6q
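The two quantities in the conjecture above can be computed by brute force for tiny functions: the spectral entropy H[f̂²] of the squared Fourier coefficients, and the total influence I[f] = Σ_S |S|·f̂(S)². The illustration below (using the 3-bit majority function as a test case) is not code from the paper, just a direct computation of both quantities from the definitions.

```python
import numpy as np
from itertools import product

def fourier_coeffs(f, n):
    """Exact Fourier coefficients f̂(S) of f: {-1,1}^n -> {-1,1},
    computed by enumerating the full cube (fine for tiny n)."""
    pts = list(product([-1, 1], repeat=n))
    coeffs = {}
    for S in product([0, 1], repeat=n):  # S encoded as a 0/1 indicator tuple
        chi = [np.prod([x[i] for i in range(n) if S[i]]) for x in pts]
        coeffs[S] = sum(f(x) * c for x, c in zip(pts, chi)) / len(pts)
    return coeffs

maj3 = lambda x: 1 if sum(x) > 0 else -1   # majority of three ±1 bits

sq = {S: v * v for S, v in fourier_coeffs(maj3, 3).items()}
H = -sum(p * np.log2(p) for p in sq.values() if p > 0)  # spectral entropy
I = sum(sum(S) * p for S, p in sq.items())              # total influence
```

For MAJ3 the squared coefficients put mass 1/4 on each of the three singletons and on {1,2,3}, giving H = 2 and I = 3/2, consistent with the conjectured bound for any C ≥ 4/3.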

Learning DNF Expressions from Fourier Spectrum [article]

Vitaly Feldman
2013 arXiv   pre-print
This property is crucial for learning of DNF expressions over smoothed product distributions, a learning model introduced by Kalai et al. (2009) and inspired by the seminal smoothed analysis model of Spielman and Teng  ...  Since its introduction by Valiant in 1984, PAC learning of DNF expressions remains one of the central problems in learning theory.  ...  Acknowledgements I thank Sasha Sherstov for pointing out the connection of our W^d_1(f) measure of a PTF f to the definition of advantage by Krause and Pudlák (1997).  ... 
arXiv:1203.0594v3 fatcat:b6n62mueyjaf3fxgb3rflbogza

Learning using Local Membership Queries [article]

Pranjal Awasthi, Vitaly Feldman, Varun Kanade
2013 arXiv   pre-print
This class also includes the class of O(log(n))-depth decision trees.  ...  (iii) The class of polynomial size DNF formulas is learnable under the uniform distribution using O(log(n))-local queries in time n^{O(log(log(n)))}.  ...  [BCGS98] proposed a noisy model wherein membership queries made on points lying  ...  Acknowledgments The authors would like to thank Avrim Blum, Vitaly Feldman, Adam Kalai, Shang-Hua Teng, and Leslie Valiant  ... 
arXiv:1211.0996v2 fatcat:bgg7rrtj5zb5xfu56udu66nldq

STLnet: Signal Temporal Logic Enforced Multivariate Recurrent Neural Networks

Meiyi Ma, Ji Gao, Lu Feng, John A. Stankovic
2020 Neural Information Processing Systems  
In this paper, we develop a new temporal logic-based learning framework, STLnet, which guides the RNN learning process with auxiliary knowledge of model properties, and produces a more robust model for  ...  Our framework can be applied to general sequential deep learning models, and trained in an endto-end manner with back-propagation.  ...  Acknowledgments and Disclosure of Funding This research was partially supported by NSF grants CCF-1942836 and CNS-1739333, and the Commonwealth Cyber Initiative, an investment from the Commonwealth of  ... 
dblp:conf/nips/MaG0S20 fatcat:mt5uf5ktm5dn5bpaqariufwqei

Almost Optimal Testers for Concise Representations [article]

Nader H. Bshouty
2019 arXiv   pre-print
Classes, such as k-junta, k-linear functions, s-term DNF, s-term monotone DNF, r-DNF, decision list, r-decision list, size-s decision tree, size-s Boolean formula, size-s branching programs, s-sparse polynomials  ...  We give improved and almost optimal testers for several classes of Boolean functions on n inputs that have concise representation in the uniform and distribution-free model.  ...  ), Decision List, r-Decision List (r constant), size-s Decision Tree, size-s Branching Programs and size-s Boolean Formula.  ... 
arXiv:1904.09958v3 fatcat:3o2zksdzzrgtjljwmfxbys423a
Showing results 1 — 15 out of 361 results