Page 5978 of Mathematical Reviews, Issue 88k
[page]
1988
Mathematical Reviews
The second and third chapters deal with moments, cumulants, generalized cumulants and invariants. The advantages in working with cumulants rather than with moments are stated. ...
Lattice theory is introduced for possible applications in statistics and probability theory.
Chapter 4 is devoted to a discussion of sample cumulants as developed by Fisher, Tukey and others. ...
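For background (standard identities rather than material from the reviewed book): the first few cumulants are related to central moments by
\[
\kappa_1 = \mathbb{E}[X], \quad
\kappa_2 = \mathbb{E}\!\left[(X-\kappa_1)^2\right], \quad
\kappa_3 = \mathbb{E}\!\left[(X-\kappa_1)^3\right], \quad
\kappa_4 = \mathbb{E}\!\left[(X-\kappa_1)^4\right] - 3\kappa_2^2,
\]
and for independent X and Y, \(\kappa_n(X+Y) = \kappa_n(X) + \kappa_n(Y)\) for every n, whereas central moments beyond the third are not additive; this additivity is the main advantage of cumulants alluded to above.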
Page 3389 of Mathematical Reviews, Issue 92f
[page]
1992
Mathematical Reviews
Gelfond-Lifschitz semantics) is not in general cumulative: the addition to a set of assumptions of some of the derivable conclusions may lead to a loss of others. ...
The core results were initially developed as a formal theory of diagnostic inference called parsimonious covering theory, and were then extended to incorporate probability theory. ...
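For reference, the cumulativity condition at issue here is the standard one from the nonmonotonic-reasoning literature: an inference operation C is cumulative when
\[
A \subseteq B \subseteq C(A) \;\Longrightarrow\; C(A) = C(B),
\]
that is, cautious monotony (\(C(A) \subseteq C(B)\)) together with cut (\(C(B) \subseteq C(A)\)). The snippet above records that answer-set (Gelfond-Lifschitz) semantics can violate this: enlarging the assumptions with some of their own consequences may remove others.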
Page 7 of Bulletin of the American Mathematical Society Vol. 54, Issue 1
[page]
1948
Bulletin of the American Mathematical Society
An excellent expository account of the theory of non-parametric statistical inference has been given by Scheffé [60]. ...
Within the past twenty-five years, a large body of statistical inference theory has been developed for samples from populations having normal, binomial, Poisson, multinomial and other specified forms of ...
Nonmonotonic inference operations
[article]
2002
arXiv
pre-print
In this paper, we weaken the monotonicity requirement and consider more general operations, inference operations. ...
We single out a number of interesting families of inference operations. ...
Acknowledgements David Makinson was with us all along during the elaboration of this work. ...
arXiv:cs/0202031v1
fatcat:div4ti5ueza5pcwf5ltz2pgdmm
Nonmonotonic inference operations
1993
Logic Journal of the IGPL
In this paper, we weaken the monotonicity requirement and consider more general operations, inference operations. ...
We single out a number of interesting families of inference operations. ...
Acknowledgements David Makinson was with us all along during the elaboration of this work. ...
doi:10.1093/jigpal/1.1.23
fatcat:5fqmsuxcorahress7mj5b7hup4
Recovery of (non)monotonic theories
1998
Artificial Intelligence
Based on these rationality postulates our general conclusion is that for cumulative theories, expansions are not suitable, while for noncumulative theories like default logic, auto-epistemic logic and ...
For nonmonotonic theories, e.g., nonmonotonic databases, however, in general it is unclear how to restore the consistency of such a theory: indeed, several options for recovery that use (mixtures of) contractions ...
Acknowledgement The authors would like to thank the reviewers for their constructive comments on a previous version of this paper. ...
doi:10.1016/s0004-3702(98)00099-x
fatcat:5m66ieypj5e5djdicijy6iudnu
The discovery of cumulative knowledge
2018
Accounting Auditing & Accountability Journal
Strong inference identifies theories that make competing predictions and pits one theory against another. ...
[Table fragment: ... / research site using new empirical data; theories that have been applied to the phenomenon; criteria for theory choice; strong inference tests of the applicability of existing theories; Macro-Level Framing] ...
doi:10.1108/aaaj-08-2014-1808
fatcat:sxki5iukqvhlncehrkwpbaagd4
Tightening Bounds for Variational Inference by Revisiting Perturbation Theory
[article]
2019
arXiv
pre-print
Variational inference has become one of the most widely used methods in latent variable modeling. ...
Perturbation theory relies on a form of Taylor expansion of the log marginal likelihood, vaguely in terms of the log ratio of the true posterior and its variational approximation. ...
By introducing a generalized variational inference framework, we propose a similar yet alternative construction to cumulant expansions which results in a valid lower bound of the evidence. We furthermore ...
arXiv:1910.00069v1
fatcat:ttihwtkn5bhwjln4vhx6t3oc4i
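For orientation, and as a standard identity rather than the paper's own construction: writing \(V = \log p(x,z) - \log q(z)\), the log evidence is the cumulant-generating function of V under q evaluated at 1,
\[
\log p(x) \;=\; \log \mathbb{E}_{q}\!\left[e^{V}\right] \;=\; \sum_{n \ge 1} \frac{\kappa_n(V)}{n!} \quad \text{(formally)},
\]
whose first term \(\kappa_1(V) = \mathbb{E}_q[V]\) is the usual ELBO and is a guaranteed lower bound by Jensen's inequality; truncating at higher cumulants gives perturbative corrections that are in general no longer bounds, which is the gap the paper aims to close.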
JACKKNIFING: HIGHER ORDER ACCURATE CONFIDENCE INTERVALS
1996
Journal of the Japan Statistical Society
Stochastic expansion of T: the coefficients A_2 and A_3 of (2.2) (Theorem 2.2) are given by ...
While inferences based on the second order asymptotic theory can be untrustworthy, our Monte Carlo study demonstrates clear advantages for third order asymptotic inferences. ...
While the bootstrap clearly outperforms second-order-theory based inference, there is no clear winner between the bootstrap and the third order asymptotic theories. ...
doi:10.14490/jjss1995.26.69
fatcat:xu4lkglmcnfsbku7vim2dvc6du
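As background only (the basic first-order construction, not the higher-order intervals of the paper), a minimal Python sketch of the delete-one jackknife bias and standard-error estimates for a generic statistic; the function name jackknife and the exponential-sample example are illustrative assumptions:

import numpy as np

def jackknife(x, stat):
    """Delete-one jackknife: bias-corrected estimate and standard error of stat(x)."""
    x = np.asarray(x)
    n = len(x)
    theta_hat = stat(x)
    # Leave-one-out replicates of the statistic.
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    theta_bar = loo.mean()
    bias = (n - 1) * (theta_bar - theta_hat)
    se = np.sqrt((n - 1) / n * np.sum((loo - theta_bar) ** 2))
    return theta_hat - bias, se

# Example: jackknife the sample mean of 50 exponential draws.
rng = np.random.default_rng(0)
sample = rng.exponential(size=50)
estimate, stderr = jackknife(sample, np.mean)
print(estimate, stderr)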
Abductive Knowledge Induction From Raw Data
[article]
2021
arXiv
pre-print
To the best of our knowledge, Meta_Abd is the first system that can jointly learn neural networks from scratch and induce recursive first-order logic theories with predicate invention. ...
Hence, most of them assume the existence of a strong symbolic knowledge base and only learn the perception model while avoiding a crucial problem: where does the knowledge come from? ...
The x-axis denotes the average number of Prolog inferences; the number at the end of each bar is the average inference time in seconds. ...
arXiv:2010.03514v2
fatcat:6mtsp6ucvnafnme47sama6vuru
Touring the MetaCoq Project (Invited Paper)
2021
Electronic Proceedings in Theoretical Computer Science
the following artefacts: a specification of Coq's syntax and type theory, the Polymorphic Cumulative Calculus of (Co)-Inductive Constructions (PCUIC); a monad for the manipulation of raw syntax and interaction ...
One cause of this difference is the inherent complexity of dependent type theories together with their extensions with inductive types, universe polymorphism and complex sort systems, and the gap between ...
Theory or Type Theory, and a Trusted Code Base (TCB): its actual implementation in a general purpose programming language. ...
doi:10.4204/eptcs.337.2
fatcat:vkzxls6psredfkeo7xwazecbpa
Finite Algebras and AI: From Matrix Semantics to Stochastic Local Search
[chapter]
2004
Lecture Notes in Computer Science
Resolution algebras can also be used to implement some nonmonotonic inference systems. Let P = L, be an arbitrary cumulative inference system. ...
In this semantic framework, every inference operation that satisfies (c1)-(c4) can be defined by a class of generalized matrices. ...
doi:10.1007/978-3-540-30210-0_2
fatcat:vzlccfeqijg3tfh3v73lelk7du
Large-Deviation Approach to Random Recurrent Neuronal Networks : Parameter Inference and Fluctuation-Induced Transitions
2021
Physical Review Letters 127(15)
This rate function takes the form of a Kullback-Leibler divergence which enables data-driven inference of model parameters and calculation of fluctuations beyond mean-field theory. ...
We here unify the field-theoretical approach to neuronal networks with large deviations theory. ...
As in field theory, it is convenient to introduce the scaled cumulant-generating functional of the empirical measure. ...
doi:10.18154/rwth-2021-11205
fatcat:lasqlpyp6zbk5kmdxk4vjuaju4
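For context, the textbook large-deviations objects behind the abstract's wording (not the paper's network-specific functional): the scaled cumulant-generating function of an observable \(A_N\) and the Gärtner-Ellis rate function are
\[
\lambda(k) \;=\; \lim_{N\to\infty} \frac{1}{N}\,\log \mathbb{E}\!\left[e^{N k A_N}\right],
\qquad
I(a) \;=\; \sup_{k}\,\bigl\{\, k a - \lambda(k) \,\bigr\},
\]
and when \(A_N\) is the empirical measure of i.i.d. samples from p, Sanov's theorem gives the rate function as a Kullback-Leibler divergence, \(I(\mu) = D_{\mathrm{KL}}(\mu \,\|\, p)\), which is the form referred to above.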
Page 4357 of Mathematical Reviews, Issue 96g
[page]
1996
Mathematical Reviews
The subfamily generates a wider class of languages than the ones presented previously. ...
recursively generable finite sets. ...
Large Deviations Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions
[article]
2021
arXiv
pre-print
This rate function takes the form of a Kullback-Leibler divergence which enables data-driven inference of model parameters and calculation of fluctuations beyond mean-field theory. ...
We here unify the field theoretical approach to neuronal networks with large deviations theory. ...
As in field theory, it is convenient to introduce the scaled cumulant-generating functional of the empirical measure. ...
arXiv:2009.08889v3
fatcat:xu7i7pachnet5elxmqwnk3vwwu