92 Hits in 6.6 sec

Entropy, Information, and the Updating of Probabilities [article]

Ariel Caticha
2021 arXiv   pre-print
The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process.  ...  This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes the pragmatic elements in the derivation.  ...  Acknowledgments -I would like to acknowledge many valuable discussions on probability and entropy with N. Caticha, A. Giffin, K. Knuth, R. Preuss, C. Rodríguez, J. Skilling, and K. Vanslette.  ... 
arXiv:2107.04529v1 fatcat:pncnx6zj2natfhj3ufyvf7zgem

Entropy, Information, and the Updating of Probabilities

Ariel Caticha
2021 Entropy  
The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process.  ...  The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged  ...  Introduction Inductive inference is a framework for coping with uncertainty, for reasoning with incomplete information.  ... 
doi:10.3390/e23070895 fatcat:dadjm3oasrawnc2me4ahxskmhq
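The logarithmic relative entropy this abstract singles out is, up to sign, the Kullback–Leibler divergence between a candidate posterior p and the prior q. A minimal sketch (the distributions below are illustrative, not from the paper):

```python
import math

def relative_entropy(p, q):
    # S(p | q) = -sum_i p_i * log(p_i / q_i).
    # It is maximized (at zero) when p = q, so with no new
    # constraints the update leaves the prior unchanged.
    return -sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

prior = [0.25, 0.25, 0.25, 0.25]
posterior = [0.4, 0.3, 0.2, 0.1]

no_change = relative_entropy(prior, prior)      # equals 0
updated = relative_entropy(posterior, prior)    # strictly negative
```

Any posterior differing from the prior scores strictly below zero, which is why maximizing S(p|q) subject only to the new constraints "recognizes the value of prior information."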

Lectures on Probability, Entropy, and Statistical Physics [article]

Ariel Caticha
2008 arXiv   pre-print
These lectures deal with the problem of inductive inference, that is, the problem of reasoning under conditions of incomplete information. Is there a general method for handling uncertainty?  ...  Our goal is to develop the main tools for inductive inference--probability and entropy--from a thoroughly Bayesian point of view and to illustrate their use in physics with examples borrowed from the foundations  ...  One suspects that a deeply fundamental reason must exist for its wide applicability.  ... 
arXiv:0808.0012v1 fatcat:epzc4pwxbneflj3oueqlndixxi

PHILOSOPHIES OF PROBABILITY [chapter]

Jon Williamson
2009 Philosophy of Mathematics  
I discuss the ramifications of interpretations of probability and objective Bayesianism for the philosophy of mathematics in general.  ...  This chapter presents an overview of the major interpretations of probability followed by an outline of the objective Bayesian interpretation and a discussion of the key challenges it faces.  ...  I am also grateful to the Leverhulme Trust for a research fellowship supporting this research.  ... 
doi:10.1016/b978-0-444-51555-1.50016-x fatcat:lodfcxszo5ek5nhxgirobphriy

On The Relationship between Bayesian and Maximum Entropy Inference

Peter Cheeseman
2004 AIP Conference Proceedings  
This generalized MaxEnt (GME) makes MaxEnt inference applicable to a much wider range of problems, and makes direct comparison between Bayesian and MaxEnt inference possible.  ...  Also, we show that MaxEnt is a generalized principle of independence, and this property is what makes it the preferred inference method in many cases.  ...  Also, if there is reason for believing that the current constraint set is incomplete, there is no good reason for believing the current MaxEnt predictions.  ... 
doi:10.1063/1.1835243 fatcat:lkaz4aj2x5gkbaehm6qkonprhu

Modelling Conditional Knowledge Discovery and Belief Revision by Abstract State Machines [chapter]

Christoph Beierle, Gabriele Kern-Isberner
2003 Lecture Notes in Computer Science  
Moreover, this specification provides the basis for a stepwise refinement development process of the Condor system based on the ASM methodology.  ...  We develop a high-level ASM specification for the Condor system that provides powerful methods and tools for managing knowledge represented by conditionals.  ...  Acknowledgements: We thank the anonymous referees of this paper for their helpful comments.  ... 
doi:10.1007/3-540-36498-6_10 fatcat:mf5f2vkybbfszhhrskwnp4p46e

Causality and maximum entropy updating

Daniel Hunter
1989 International Journal of Approximate Reasoning  
An efficient algorithm is given for updating causal information in the form of probabilities of counterfactuals.  ...  Finally, the theory of probabilistic counterfactuals developed in this paper is applied to the interpretation of empirical results concerning the way in which people reason under uncertainty.  ...  It is reasonable that A and B turn out to be independent, since there is no information that would cause one to revise one's probability for A upon learning what B does.  ... 
doi:10.1016/0888-613x(89)90015-7 fatcat:tt6axjwbrvgdpdb6eeisz4bwhq

Is Cosmological Tuning Fine or Coarse? [article]

Daniel Andrés Díaz-Pachón and Ola Hössjer and Robert J. Marks II
2021 arXiv   pre-print
Herein, a Bayesian statistical approach is used to assign an upper bound for the probability of tuning, which is invariant with respect to change of physical units, and under certain assumptions it is  ...  Application of the MaxEnt model reveals, for example, that the ratio of the universal gravitational constant to the square of the Hubble constant is finely tuned in some cases, whereas the amplitude of  ...  Acknowledgements The authors are thankful to Luke Barnes for his suggestions, as well as to Aron Wall for his observations on the first version of this paper.  ... 
arXiv:2104.05400v1 fatcat:2ixpqlh7m5b6je3vcvaqtcncvm

Entropic Inference [article]

Ariel Caticha
2010 arXiv   pre-print
The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for  ...  In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents.  ...  cases; it unifies them into a single theory of inductive inference and allows new applications.  ... 
arXiv:1011.0723v1 fatcat:4fhecrjoynhp3jtpanberzukza

Entropic Inference: some pitfalls and paradoxes we can avoid [article]

Ariel Caticha
2012 arXiv   pre-print
This leads us to focus on four epistemically different types of constraints. I propose that the failure to recognize the distinctions between them is a prime source of errors.  ...  The central theme of this paper revolves around the different ways in which constraints are used to capture the information that is relevant to a problem.  ...  The method of maximum entropy (whether in its original MaxEnt version or its generalization ME, the method for updating probabilities) has been successful in many applications but there are cases where  ... 
arXiv:1212.6967v1 fatcat:uxnp3wrefjfa3k3d7u2jmdj2vm

Explicit Bounds for Entropy Concentration Under Linear Constraints

Kostas N. Oikonomou, Peter D. Grunwald
2016 IEEE Transactions on Information Theory  
We present, for the first time, non-asymptotic, explicit lower bounds on n for a number of variants of the concentration result to hold to any prescribed accuracies, with the constraints holding up to  ...  One of our results holds independently of the alphabet size m and is based on a novel proof technique using the multi-dimensional Berry-Esseen theorem.  ...  We are also grateful to an anonymous reviewer of an earlier version of this paper, for the suggestion to include a simple reference result on entropy concentration.  ... 
doi:10.1109/tit.2015.2458951 fatcat:udx4dv2s2rezdhwgde2pxkw7ze

A Nonequilibrium Statistical Ensemble Formalism. Maxent-Nesom: Concepts, Construction, Application, Open Questions and Criticisms [article]

R. Luzzi, A. R. Vasconcellos, J. G. Ramos
1999 arXiv   pre-print
We describe a particular approach for the construction of a nonequilibrium statistical ensemble formalism for the treatment of dissipative many-body systems.  ...  The corresponding response function theory for systems away from equilibrium allows the theory to be connected with experiments, and some examples are summarized; there follows a good agreement between theory  ...  Maxwell who stated that the true logic for this world is the Calculus of Probability which takes account of the magnitude of the probability that is, or ought to be, in a reasonable man's mind.  ... 
arXiv:cond-mat/9909160v2 fatcat:wp2tpegvijbcbdnuev4szodzoq

The Development of Subjective Bayesianism [chapter]

James M. Joyce
2011 Handbook of the History of Logic  
The Bayesian approach to inductive reasoning originated in two brilliant insights.  ...  In 1654 Blaise Pascal, while in the course of a correspondence with Fermat [1769], recognized that states of uncertainty can be quantified using probabilities and expectations.  ...  This is just to see the MaxEnt prior as a flawless inductive reasoner.  ... 
doi:10.1016/b978-0-444-52936-7.50012-4 fatcat:asge4i3znzhepiv6kuipcqdtdi

Maximum Entropy Principle in Statistical Inference: Case for Non-Shannonian Entropies

Petr Jizba, Jan Korbel
2019 Physical Review Letters  
Apart from a formal side of the proof where a one-parameter class of admissible entropies is identified, we substantiate our point by analyzing the effect of weak correlations and by discussing two pertinent  ...  In this Letter, we show that the Shore-Johnson axioms for the maximum entropy principle in statistical estimation theory account for a considerably wider class of entropic functionals than previously thought  ...  On a formal level the passage from Shannon's information theory to statistical thermodynamics is remarkably simple, namely a MaxEnt probability distribution subject to constraints on average energy, or  ... 
doi:10.1103/physrevlett.122.120601 pmid:30978043 fatcat:hpb3nzzj4bftnd3bi252mfwkom
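The "MaxEnt probability distribution subject to constraints on average energy" mentioned in this abstract is the Gibbs form p_i ∝ exp(-β E_i), with β chosen so the mean energy matches the constraint. A minimal sketch under hypothetical energy levels (bisection works because the mean energy is monotonically decreasing in β):

```python
import math

def gibbs(energies, beta):
    # Maximum-entropy distribution under a mean-energy constraint:
    # p_i proportional to exp(-beta * E_i).
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def mean_energy(energies, beta):
    return sum(p * e for p, e in zip(gibbs(energies, beta), energies))

def solve_beta(energies, target, lo=-50.0, hi=50.0):
    # Mean energy decreases monotonically in beta, so bisect.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

energies = [0.0, 1.0, 2.0, 3.0]       # illustrative levels
beta = solve_beta(energies, target=1.0)
p = gibbs(energies, beta)
```

With a target mean below the uniform average (here 1.5), the solver returns a positive β, tilting probability toward the lower energy levels.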

Inductive logic: from data analysis to experimental design

Kevin H. Knuth
2002 AIP Conference Proceedings  
In its application to the scientific method, the logic of questions, inductive inquiry, can be applied to design an experiment that most effectively addresses a scientific issue.  ...  We discuss the conjecture that the relevance or bearing, b, of a question on an issue can be expressed in terms of the probabilities, p, of the assertions that answer the question via the entropy.  ...  For this reason, I refer the interested reader to another source [6] for a detailed description of the process of data analysis using Bayesian or inductive inference.  ... 
doi:10.1063/1.1477061 fatcat:5mq3w7pcrve6fbq2v5crsdjfwm
Showing results 1 — 15 out of 92 results