
Measures of Information

Paul Walton
2015 Information  
This paper builds an integrated framework of measures of information based on the Model for Information (MfI) developed by the author.  ...  Another measure of information is the amount of information. This has played a role in two important theoretical difficulties: the Bar-Hillel-Carnap paradox and the "scandal of deduction".  ...  Conflicts of Interest The author declares no conflict of interest.  ... 
doi:10.3390/info6010023 fatcat:tpqa7hjsffhfbjyztvvxeaiyga

An Unorthodox Parametric Measure of Information and Corresponding Measure of Intuitionistic Fuzzy Information

P. Jha, Vikas Kumar Mishra
2013 Bulletin of Mathematical Sciences and Applications  
In this paper the given function is twice differentiable and is used to obtain the related measure of directed divergence, measure of intuitionistic fuzzy entropy, measure of intuitionistic fuzzy directed  ...  We also investigate the monotonic character of the proposed function.  ...  In the present paper we draw our inspiration for obtaining a new parametric measure of entropy which is the joint effect of measures of information due to Kapur [6] and Burg [2].  ... 
doi:10.18052/www.scipress.com/bmsa.3.24 fatcat:ityevtiy6fbrxld24qwpd3rauu

Generalized Measures of Information Transfer [article]

Paul L. Williams, Randall D. Beer
2011 arXiv   pre-print
Transfer entropy provides a general tool for analyzing the magnitudes and directions, but not the kinds, of information transfer in a system. We extend transfer entropy in two complementary ways.  ...  The new measures are demonstrated on several systems that extend examples from previous literature.  ...  In particular, the information-theoretic measure transfer entropy (TE) [2, 3] has become widely adopted as a standard measure of information transfer, with applications in neuroscience [4], cellular  ... 
arXiv:1102.1507v1 fatcat:at2ydwmetjbpblxvgomgp27t7e
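
As a point of reference for the quantity being generalized in this entry (and not the authors' decomposition), transfer entropy from a series Y to a series X can be estimated with a simple plug-in counter. The history length of 1 and the toy data below are illustrative assumptions.

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of transfer entropy T(source -> target)
    for discrete time series, with history length 1."""
    triples = Counter()   # (target_next, target_prev, source_prev)
    pairs_ts = Counter()  # (target_prev, source_prev)
    pairs_tt = Counter()  # (target_next, target_prev)
    singles = Counter()   # (target_prev,)
    n = len(target) - 1
    for t in range(n):
        triples[(target[t + 1], target[t], source[t])] += 1
        pairs_ts[(target[t], source[t])] += 1
        pairs_tt[(target[t + 1], target[t])] += 1
        singles[target[t]] += 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_ts[(x0, y0)]            # p(x1 | x0, y0)
        p_self = pairs_tt[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * log2(p_full / p_self)
    return te

# toy data: the target copies the source with a one-step delay
src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
tgt = [0] + src[:-1]
print(transfer_entropy(src, tgt))  # positive: the target is driven by the source
```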

Measure of Uncertainty and Information [chapter]

2005 Uncertainty and Information  
This contribution overviews the approaches, results and history of attempts at measuring uncertainty and information in the various theories of imprecise probabilities.  ...  This writing summarizes my personal view of the subject. I do not cover measuring fuzziness here. Please, see one of the above references for an introduction and relevant references.  ...  RP960351 from the National Science and Technology Board and the Ministry of Education, Singapore.  ... 
doi:10.1002/0471755575.ch6 fatcat:tjzbsloslze77b64b6cbi2wvv4

Information-based measure of nonlocality

Alberto Montina, Stefan Wolf
2016 New Journal of Physics  
Thus, a natural measure of nonlocal correlations is provided by the minimal amount of communication required for classically simulating them.  ...  This measure turns out to have an important role in communication complexity and can be used to discriminate between local and nonlocal correlations, as an alternative to the violation of Bell's inequalities  ...  Analysis of Experimental Qudit Correlations".  ... 
doi:10.1088/1367-2630/18/1/013035 fatcat:woza7x74dvfrnovs5hdmfdy6du

Statistical Measurement of Information Leakage [chapter]

Konstantinos Chatzikokolakis, Tom Chothia, Apratim Guha
2010 Lecture Notes in Computer Science  
In this paper, we show that measures of information leakage based on mutual information and capacity can be calculated, automatically, from trial runs of a system alone.  ...  Information theory provides a range of useful methods to analyse probability distributions and these techniques have been successfully applied to measure information flow and the loss of anonymity in secure  ...  The information theoretic notion of mutual information measures the amount of information that can be sent across this channel, under a particular usage pattern, and therefore measures the amount of information  ... 
doi:10.1007/978-3-642-12002-2_33 fatcat:ewawqi2ij5dotpe5stn4envluq
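
In the simplest setting, the leakage measure described in this entry is the mutual information between a secret input and an observable output of a channel. A minimal sketch of that computation follows; the channel matrix and prior are made-up illustrative values, not data from the paper.

```python
from math import log2

# channel[s][o] = P(output = o | secret = s); the numbers are made up
channel = [[0.8, 0.2],
           [0.3, 0.7]]
prior = [0.5, 0.5]  # P(secret = s)

joint = [[prior[s] * channel[s][o] for o in range(2)] for s in range(2)]
p_out = [sum(joint[s][o] for s in range(2)) for o in range(2)]

# leakage = I(S; O) = sum_{s,o} p(s,o) log2 [ p(s,o) / (p(s) p(o)) ]
leakage = sum(joint[s][o] * log2(joint[s][o] / (prior[s] * p_out[o]))
              for s in range(2) for o in range(2) if joint[s][o] > 0)
print(f"mutual-information leakage: {leakage:.4f} bits")
```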

Improved Measures of Integrated Information

Max Tegmark, Anil Seth
2016 PLoS Computational Biology  
Although there is growing interest in measuring integrated information in computational and cognitive systems, current methods for doing so in practice are computationally unfeasible.  ...  of measure for comparing probability distributions (7 options).  ...  information of a bivariate distribution is simply the KL-measure of how non-separable it is.  ... 
doi:10.1371/journal.pcbi.1005123 pmid:27870846 pmcid:PMC5117999 fatcat:afutsi4mtfgo5i4l2dg4kye2f4
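
The "KL-measure of how non-separable it is" in the snippet refers to the standard identity expressing mutual information as a Kullback-Leibler divergence from the joint distribution to the product of its marginals (stated here for orientation, not quoted from the paper):

```latex
I(X;Y) \;=\; D_{\mathrm{KL}}\bigl(p(x,y) \,\|\, p(x)\,p(y)\bigr)
       \;=\; \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}
```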

A measure of informed choice

Theresa M. Marteau, Elizabeth Dormandy, Susan Michie
2001 Health Expectations  
We are grateful for comments on an earlier draft of this paper from those attending the meeting.  ...  This paper was presented at a workshop on informed choice organised by the National Screening Committee.  ...  It also provides theory-based measures of these core constructs to form the basis for developing a measure of informed choice.  ... 
doi:10.1046/j.1369-6513.2001.00140.x pmid:11359540 pmcid:PMC5060053 fatcat:yjfdlqbedzglpap767b44mc2ni

Axiomatic Characterizations of Information Measures

Imre Csiszár
2008 Entropy  
Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures.  ...  Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed.  ...  in finding operational meanings of a previously insignificant information measure.  ... 
doi:10.3390/e10030261 fatcat:woa3ngtq5ba57jvnlc4lns5qkm
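
For readers skimming the listing, the two measures Csiszár axiomatizes, Shannon entropy and the Kullback I-divergence, can be computed directly. A minimal sketch with made-up distributions:

```python
from math import log2

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i log2 (p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
print(shannon_entropy(p))   # 1.5 bits
print(kl_divergence(p, q))  # about 0.085 bits
```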

Information content of polarization measurements

D. G. Ireland
2010 Physical Review C  
Information entropy is applied to the state of knowledge of reaction amplitudes in pseudoscalar meson photoproduction, and a scheme is developed that quantifies the information content of a measured set  ...  It is shown that this definition of information is a more practical measure of the quality of a set of measured observables than whether the combination is a mathematically complete set.  ...  MEASURING INFORMATION A.  ... 
doi:10.1103/physrevc.82.025204 fatcat:xplavzohgvedheqz5vgwyeaxfe

An informational measure of correlation

E.H. Linfoot
1957 Information and Control  
An important advantage of the informational measures of correlation r0 and r1 in physical applications is that they are independent of the particular manner in which the measure numbers x and y are assigned  ...  Thus r0 can be interpreted as an information gain which provides a measure of the correlation between x and y.  ... 
doi:10.1016/s0019-9958(57)90116-x fatcat:2oixopdhofdmdbvkuctdkejfxe
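
Linfoot's informational coefficient of correlation is commonly written as a transform of the mutual information; assuming that form, a quick check that it reduces to |rho| for a bivariate Gaussian:

```python
from math import exp, log, sqrt

def informational_correlation(mutual_info_nats):
    """Linfoot-style coefficient r0 = sqrt(1 - exp(-2 I)), with I in nats (assumed form)."""
    return sqrt(1.0 - exp(-2.0 * mutual_info_nats))

# for a bivariate Gaussian with correlation rho, I(X;Y) = -0.5 * ln(1 - rho^2)
rho = 0.6
mi = -0.5 * log(1.0 - rho ** 2)
print(informational_correlation(mi))  # 0.6, i.e. |rho|
```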

Informational power of quantum measurements

Michele Dall'Arno, Giacomo Mauro D'Ariano, Massimiliano F. Sacchi
2011 Physical Review A. Atomic, Molecular, and Optical Physics  
We introduce the informational power of a quantum measurement as the maximum amount of classical information that the measurement can extract from any ensemble of quantum states.  ...  We restate the problem of evaluating the informational power as the maximization of the accessible information of a suitable ensemble.  ...  2-dimensional symmetric informationally complete quantum measurement (i.e., the tetrahedral measurement).  ... 
doi:10.1103/physreva.83.062304 fatcat:sdun73z5y5dutnrdastc4rykhe
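
The tetrahedral (qubit SIC) measurement mentioned in the snippet is easy to experiment with numerically: the accessible information of a fixed ensemble under a POVM, the quantity that informational power maximizes over ensembles, is the classical mutual information between state labels and outcomes. The ensemble below is an illustrative choice, not one from the paper.

```python
from math import log2, sqrt

# tetrahedral qubit POVM: E_i = (1/4)(I + v_i . sigma), so a state with Bloch
# vector r yields outcome i with probability p(i | r) = (1/4)(1 + v_i . r)
s = 1.0 / sqrt(3.0)
tetra = [(s, s, s), (s, -s, -s), (-s, s, -s), (-s, -s, s)]

def outcome_probs(bloch):
    return [0.25 * (1.0 + sum(v * r for v, r in zip(vi, bloch))) for vi in tetra]

def entropy(p):
    return -sum(x * log2(x) for x in p if x > 0)

# illustrative ensemble (not from the paper): |0> and |1> with equal priors
ensemble = [((0.0, 0.0, 1.0), 0.5), ((0.0, 0.0, -1.0), 0.5)]

p_out = [sum(w * outcome_probs(r)[i] for r, w in ensemble) for i in range(4)]
acc_info = entropy(p_out) - sum(w * entropy(outcome_probs(r)) for r, w in ensemble)
print(f"accessible information of this ensemble: {acc_info:.4f} bits")
```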

Informing Measurement of Cooperative Performance [chapter]

Jason R. V. Franken, Michael L. Cook
2014 Interfirm Networks  
Prior work assessing cooperative performance focuses mostly on available financial accounting measures commonly used to evaluate investor-owned firms.  ...  The degree of correspondence among these aspects of performance varies with cooperative type.  ...  financial performance but do not necessarily inform on achievement of other objectives of serving patron needs.  ... 
doi:10.1007/978-3-319-10184-2_11 fatcat:n3fsrx565bfohm4q3dkgrtirzq

On measures of "useful" information

Bhu Dev Sharma, Jagdish Mitter, Man Mohan
1978 Information and Control  
The quantitative-qualitative measure of information as given by Belis and Guiasu is additive, the additivity being a modification of the additivity of Shannon's measure with due place for utilities of  ...  Starting from a particular type of nonadditivity relation we characterize a measure of nonadditive "useful" information, which may be considered as a quantitative-qualitative measure corresponding to the  ...  One of the authors (Man Mohan) is also thankful to Professor U. N.  ... 
doi:10.1016/s0019-9958(78)90671-x fatcat:6zgkr5tzabd4dbil73hbrwtbzm
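
The Belis-Guiasu quantitative-qualitative measure referred to in this entry is usually given as a utility-weighted entropy; the sketch below assumes that form and uses made-up probabilities and utilities.

```python
from math import log2

def useful_information(probs, utilities):
    """Utility-weighted entropy -sum_i u_i p_i log2 p_i (assumed Belis-Guiasu form)."""
    return -sum(u * p * log2(p) for p, u in zip(probs, utilities) if p > 0)

probs = [0.5, 0.3, 0.2]
utils = [1.0, 2.0, 0.5]  # made-up utilities attached to the outcomes
print(useful_information(probs, utils))
```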

New parametric measures of information

K. Ferentinos, T. Papaioannou
1981 Information and Control  
But if we impose the regularity conditions of the Fisherian theory of information, these measures become linear functions of Fisher's measure.  ...  In this paper methods are presented for obtaining parametric measures of information from the non-parametric ones and from information matrices. The properties of these measures are examined.  ...  Parametric measures of information measure the amount of information supplied by the data about an unknown parameter θ and are functions of θ.  ... 
doi:10.1016/s0019-9958(81)90263-1 fatcat:i5d3acredzdotptdhp46pjpe4m
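
Parametric measures of this kind are built around Fisher information. As a quick orientation (not tied to this paper's construction), a numerical check of the Bernoulli case, where the analytic value is 1 / (theta (1 - theta)):

```python
import numpy as np

def fisher_information_bernoulli(theta, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[(d/dtheta log f(X; theta))^2] for Bernoulli(theta)."""
    rng = np.random.default_rng(seed)
    x = rng.random(n_samples) < theta                       # Bernoulli draws
    score = np.where(x, 1.0 / theta, -1.0 / (1.0 - theta))  # d/dtheta log f(x; theta)
    return float(np.mean(score ** 2))

theta = 0.3
print(fisher_information_bernoulli(theta))  # Monte Carlo estimate
print(1.0 / (theta * (1.0 - theta)))        # analytic value, about 4.76
```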
Showing results 1-15 out of 12,199,749 results