17,419 Hits in 3.3 sec

Learning probabilistic read-once formulas on product distributions

Robert E. Schapire
1994 Machine Learning  
Since the class of formulas considered includes ordinary read-once Boolean formulas, our result shows that such formulas are PAC learnable (in the sense of Valiant) against any product distribution (for  ...  Further, this class of probabilistic formulas includes read-once formulas whose behavior has been corrupted by large amounts of random noise.  ...  I am also grateful to two anonymous referees for their thorough reading and thoughtful suggestions.  ... 
doi:10.1007/bf00993162 fatcat:v2iwhleiwzcpzmryjib3y7ijjy

Computational learning theory

Dana Angluin
1992 Proceedings of the twenty-fourth annual ACM symposium on Theory of computing - STOC '92  
Schapire [120] significantly generalizes these results by giving an algorithm that PAC-learns the class of probabilistic read-once formulas with respect to the class of product distributions  ...  Learning monotone kμ DNF formulas on product distributions. In Proceedings of the Fourth Annual Workshop on Computational Learning Theory, pages 179-183.  ... 
doi:10.1145/129712.129746 dblp:conf/stoc/Angluin92 fatcat:7aw3cnd745bellyhu7phywpul4

An O(n^(log log n)) learning algorithm for DNF under the uniform distribution

Yishay Mansour
1992 Proceedings of the fifth annual workshop on Computational learning theory - COLT '92  
Acknowledgements I would like to thank Eyal Kushilevitz for commenting on an early version of the paper.  ...  Learning monotone kμ DNF formulas on product distributions.  ...  Exact learning of read-twice DNF formulas. In 32nd Annual Symposium on Foundations of Computer Science, pages 170-179, October 1991. M. Furst, J. Saxe, and M. Sipser.  ... 
doi:10.1145/130385.130391 dblp:conf/colt/Mansour92 fatcat:w3qhyrnaanemvhkel2tlmkcjpm

10 Years of Probabilistic Querying – What Next? [chapter]

Martin Theobald, Luc De Raedt, Maximilian Dylla, Angelika Kimmig, Iris Miliaraki
2013 Lecture Notes in Computer Science  
While probabilistic databases have focused on describing tractable query classes based on the structure of query plans and data lineage, probabilistic programming has contributed sophisticated inference  ...  far, both areas developed almost independently of one another.  ...  of evaluating different query structures such as safe plans or read-once formulas.  ... 
doi:10.1007/978-3-642-40683-6_1 fatcat:lofuquzqgbb4hcjtjeqydyakbe

Query Processing on Probabilistic Data: A Survey

Guy Van den Broeck, Dan Suciu
2017 Foundations and Trends in Databases  
When F can be written as a read-once expression, then we call it a read-once formula. Every read-once formula F with n variables admits an OBDD with ≤ n internal nodes.  ...  Read-Once Formulas A read-once Boolean expression is an expression where each Boolean variable occurs only once.  ...  We prove the claim by induction on the sentence Q.  ... 
doi:10.1561/1900000052 fatcat:jzifdhyvsnh7thqrnuptxbpejy
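The tractability quoted above, namely that a read-once formula admits an OBDD with at most n internal nodes, can be illustrated with a minimal Python sketch (an illustrative example, not code from the survey): because each variable occurs exactly once, sibling subtrees of the AND/OR tree mention disjoint variables and are therefore independent, so AND nodes multiply probabilities and OR nodes combine complements.

# Illustrative sketch (not from the survey): exact probability of a read-once
# Boolean formula over independent variables, evaluated bottom-up on its
# AND/OR tree. Node shapes: ('var', name), ('and', children), ('or', children).

def read_once_prob(node, p):
    """Return Pr[node is true]; p maps variable names to marginal probabilities."""
    kind = node[0]
    if kind == 'var':
        return p[node[1]]
    child_probs = [read_once_prob(child, p) for child in node[1]]
    if kind == 'and':                      # disjoint subtrees, hence independent
        result = 1.0
        for q in child_probs:
            result *= q
        return result
    if kind == 'or':                       # 1 minus product of complements
        result = 1.0
        for q in child_probs:
            result *= 1.0 - q
        return 1.0 - result
    raise ValueError('unknown node kind: %s' % kind)

# Example: F = (x1 AND x2) OR x3 is read-once; each variable occurs once.
formula = ('or', [('and', [('var', 'x1'), ('var', 'x2')]), ('var', 'x3')])
print(read_once_prob(formula, {'x1': 0.5, 'x2': 0.4, 'x3': 0.1}))  # ≈ 0.28

The recursion touches each node of the tree once, consistent with the linear-size OBDD bound quoted in the snippet.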

Lineage processing over correlated probabilistic databases

Bhargav Kanagal, Amol Deshpande
2010 Proceedings of the 2010 international conference on Management of data - SIGMOD '10  
We observe that evaluating even read-once (tree structured) lineages (e.g., those generated by hierarchical conjunctive queries), polynomially computable over tuple independent probabilistic databases,  ...  We characterize the complexity of exact computation of the probability of the lineage formula on a correlated database using a parameter called lwidth (analogous to the notion of treewidth).  ...  ., if the boolean formula has a read-once form, then it will find one such representation.  ... 
doi:10.1145/1807167.1807241 dblp:conf/sigmod/KanagalD10 fatcat:aefwk6mrljdmzin24pqprivava

Statistical Abduction with Tabulation [chapter]

Taisuke Sato, Yoshitaka Kameya
2002 Lecture Notes in Computer Science  
We propose statistical abduction as a first-order logical framework for representing, inferring and learning probabilistic knowledge.  ...  algorithm (the graphical EM algorithm) for learning parameters associated with the distribution which achieves the same computational complexity as those specialized algorithms for HMMs (hidden Markov  ...  msw(rain, once, yes), msw(rain, once, no)  ...  Introduce analogously another distribution P_{F_s} parameterized by θ_s over the set F_s = {msw(sprinkler, once, on), msw(sprinkler, once, off)}.  ... 
doi:10.1007/3-540-45632-5_22 fatcat:uqfnlzj35rgxllpeagm7hpfhdu

Inference and learning in probabilistic logic programs using weighted Boolean formulas

DAAN FIERENS, GUY VAN DEN BROECK, JORIS RENKENS, DIMITAR SHTERIONOV, BERND GUTMANN, INGO THON, GERDA JANSSENS, LUC DE RAEDT
2014 Theory and Practice of Logic Programming  
It is based on the conversion of the program, the queries, and the evidence to a weighted Boolean formula.  ...  The results show that the inference algorithms improve upon the state of the art in probabilistic logic programming, and that it is indeed possible to learn the parameters of a probabilistic logic program  ...  Once we have the formula, we often need to rewrite it in CNF, which is straightforward for a completion formula.  ... 
doi:10.1017/s1471068414000076 fatcat:iqcl4yfypvebbgbstshwol4j74
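The CNF step mentioned in the last fragment can be illustrated with a standard textbook example (not taken from the paper): if a :- b, c. is the only rule with head a, Clark's completion contributes the equivalence

  a ↔ (b ∧ c),

which rewrites directly into the three clauses

  (¬a ∨ b) ∧ (¬a ∨ c) ∧ (a ∨ ¬b ∨ ¬c).

Probabilistic facts keep their weights, so the resulting weighted CNF can be handed to a weighted model counter.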

Epistemic Configurations and Holistic Meaning of Binomial Distribution

Nicolás Alonso Fernández Coronado, Jaime I. García-García, Elizabeth H. Arredondo, Ismael Andrés Araya Naveas
2022 Mathematics  
In this task, the understanding of the binomial distribution is essential as it allows the analysis of discrete data, the modeling of random situations, and the learning of other notions.  ...  one based on the understanding of the concepts and their application in daily life.  ...  The definitions and concepts that are added to this meaning are related to the theory of probability applied to the binomial distribution once it is formalized: the binomial distribution and its formula  ... 
doi:10.3390/math10101748 fatcat:ficaxfq2svcctfjvw6guvs5wka
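The formula the snippet alludes to is the standard binomial probability mass function (stated here for reference, not quoted from the article): for n independent trials with success probability p,

  P(X = k) = C(n, k) · p^k · (1 − p)^(n − k),  k = 0, 1, …, n,

where C(n, k) = n! / (k! (n − k)!) counts the ways of choosing which k of the n trials succeed.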

Artificial Intelligence for Ecological and Evolutionary Synthesis

Philippe Desjardins-Proulx, Timothée Poisot, Dominique Gravel
2019 Frontiers in Ecology and Evolution  
Mathematicians have solved this problem by using formal languages based on logic to manage theorems.  ...  ., 2019 ) is a BHOPPL built on top of PyTorch, one of the most popular frameworks for deep learning, allowing computation to be distributed on systems of GPUs.  ...  Markov logic supports algorithms to add weights to existing formulas given a dataset, learn new formulas or revise existing ones, and answer probabilistic queries (MAP or conditional).  ... 
doi:10.3389/fevo.2019.00402 fatcat:mnaucpsg2bgyze6o4vrbblskgq

A probabilistic separation logic

Gilles Barthe, Justin Hsu, Kevin Liao
2019 Proceedings of the ACM on Programming Languages (PACMPL)  
We then build a program logic based on these assertions, and prove soundness of the proof system.  ...  We propose a probabilistic separation logic PSL, where separation models probabilistic independence. We first give a new, probabilistic model of the logic of bunched implications (BI).  ...  ACKNOWLEDGMENTS We thank the anonymous reviewers and our shepherd Ohad Kammar for their close reading and useful suggestions.  ... 
doi:10.1145/3371123 fatcat:a2osslhbg5ba7bbnnutmxlc6w4

Dynamic Update with Probabilities

Johan van Benthem, Jelle Gerbrandy, Barteld Kooi
2009 Studia Logica: An International Journal for Symbolic Logic  
that has a probabilistic character itself.  ...  The formal systems we will be dealing with apply just as well to observation, experimentation, learning, or any sort of information-carrying event.  ...  of probabilistic update, including Jeffrey Update.  ... 
doi:10.1007/s11225-009-9209-y fatcat:vc6sctgyhbdb7dwml7fb25h5ze
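Jeffrey update, named in the last fragment, is the standard generalization of Bayesian conditioning to uncertain evidence (textbook formulation, not quoted from the paper): if learning shifts the probabilities of a partition E_1, …, E_n to new values q_1, …, q_n, the updated probability of an event A is

  P_new(A) = Σ_i q_i · P(A | E_i),

and ordinary conditioning on E_j is the special case q_j = 1.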

Sensitivity analysis and explanations for robust query evaluation in probabilistic databases

Bhargav Kanagal, Jian Li, Amol Deshpande
2011 Proceedings of the 2011 international conference on Management of data - SIGMOD '11  
Existing systems provide the lineage/provenance of each of the output tuples in addition to the output probabilities, which is a boolean formula indicating the dependence of the output tuple on the input  ...  Probabilistic database systems have successfully established themselves as a tool for managing uncertain data.  ...  On the other hand, (x1 ∧ x2) ∨ (x2 ∧ x3) ∨ (x3 ∧ x1) cannot be rewritten as a read-once formula. A read-once formula can be represented as an AND/OR tree as shown in Figure 2 .  ... 
doi:10.1145/1989323.1989411 dblp:conf/sigmod/KanagalLD11 fatcat:nhvbtikb5vgfpkq4nmisn5rmaa
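To make the snippet's contrast concrete (an illustrative calculation, not taken from the paper): the read-once formula (x1 ∧ x2) ∨ x3 over independent variables with probabilities p1, p2, p3 factorizes along its AND/OR tree,

  Pr[(x1 ∧ x2) ∨ x3] = 1 − (1 − p1·p2)(1 − p3),

so a single bottom-up pass suffices. In (x1 ∧ x2) ∨ (x2 ∧ x3) ∨ (x3 ∧ x1) each variable occurs twice, the disjuncts share variables and are not independent, and no such single-pass factorization exists.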

Dynamic Context-Aware Event Recognition Based on Markov Logic Networks

Fagui Liu, Dacheng Deng, Ping Li
2017 Sensors  
Then we put forward an algorithm for updating formula weights in MLNs to deal with data dynamics. Experiments on two datasets from different scenarios are conducted to evaluate the proposed approach.  ...  Markov logic networks (MLNs) which combine the expressivity of first order logic (FOL) and the uncertainty disposal of probabilistic graphical models (PGMs).  ...  Acknowledgments: The authors thank the anonymous reviewers and editors for their valuable comments on improving this paper.  ... 
doi:10.3390/s17030491 pmid:28257113 pmcid:PMC5375777 fatcat:y26d2i743jcw3hhsbzr7mzec5y

Formulaic Language and Second Language Acquisition: Zipf and the Phrasal Teddy Bear

Nick C. Ellis
2012 Annual Review of Applied Linguistics  
This article revisits earlier proposals that language learning is, in essence, the learning of formulaic sequences and their interpretations; that this occurs at all levels of granularity from large to  ...  The final section weighs the implications of the statistical distributions of formulaicity in usage for developmental sequences of language acquisition.  ...  Having said that, the same caveat must be stated clearly: "To the extent that language processing is based on frequency and probabilistic knowledge, language learning is implicit learning.  ... 
doi:10.1017/s0267190512000025 fatcat:k5zouxpbhfajtgqcelu5zvr7pa
Showing results 1 — 15 out of 17,419 results