The mutual information: Detecting and evaluating dependencies between variables

R. Steuer, J. Kurths, C. O. Daub, J. Weise, J. Selbig
2002 Bioinformatics  
One alternative, based on information theory, is the mutual information, providing a general measure of dependencies between variables.  ...  Results: Here we describe and review several approaches to estimate the mutual information from finite datasets.  ...  The authors would like to thank W.Ebeling (HU-Berlin), J.Kopka (MPIMP) and S.Kloska (Scienion AG, Berlin) for stimulating discussion.  ... 
doi:10.1093/bioinformatics/18.suppl_2.s231 pmid:12386007 fatcat:otrfcc2gjjgzlgha4avq44ic3y
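As context for this entry, here is a minimal sketch of the equidistant-binning (histogram) estimator that reviews of this kind typically start from; the bin count and the toy data are illustrative assumptions, not taken from the paper:

import numpy as np

def mutual_information_hist(x, y, bins=16):
    """Estimate I(X;Y) in nats from paired samples via a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint probability table
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.5 * rng.normal(size=5000)       # dependent pair
print(mutual_information_hist(x, y))      # clearly above zero

The finite-sample bias of exactly this plug-in construction is the kind of issue such papers quantify.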

Equitability, mutual information, and the maximal information coefficient

Justin B. Kinney, Gurinder S. Atwal
2014 Proceedings of the National Academy of Sciences of the United States of America  
Reshef et al. recently proposed a new statistical measure, the "maximal information coefficient" (MIC), for quantifying arbitrary dependencies between pairs of stochastic quantities.  ...  We then propose a self-consistent and more general definition of equitability that follows naturally from the Data Processing Inequality.  ...  We thank David Donoho, Bud Mishra, Swagatam Mukhopadhyay, and Bruce Stillman for their helpful feedback.  ... 
doi:10.1073/pnas.1309933111 pmid:24550517 pmcid:PMC3948249 fatcat:22na2pkyg5co7ddayx3pjjafwa
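A quick numeric illustration of the invariance property underlying Kinney and Atwal's argument: I(X;Y) is unchanged by an invertible transformation of X. Using scikit-learn's kNN-based estimator here is an assumption of convenience, not the authors' tooling:

import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
x = rng.uniform(0.1, 3.0, size=3000)
y = x + 0.3 * rng.normal(size=3000)

# MI estimated from the raw variable and from an invertible transform of it.
mi_raw = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
mi_log = mutual_info_regression(np.log(x).reshape(-1, 1), y, random_state=0)[0]
print(mi_raw, mi_log)   # close to each other, up to estimator noise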

A mutual information extension to the matched filter

Deniz Erdogmus, Rati Agrawal, Jose C. Principe
2005 Signal Processing  
In this paper, we introduce a nonlinear filter for signal detection based on the Cauchy-Schwarz quadratic mutual information (CS-QMI) criterion.  ...  Matched filters are the optimal linear filters for signal detection under linear channel and white noise conditions.  ...  Mutual information indicates the amount of shared information between two or more random variables.  ... 
doi:10.1016/j.sigpro.2004.11.018 fatcat:7hrsbkvihrfbxj3l5epavf72c4
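For reference, a sketch of the classical matched filter that the CS-QMI detector generalizes: correlate the received waveform with the known template and pick the peak, which is optimal for a linear channel with white noise. The pulse shape, noise level, and pulse location are illustrative:

import numpy as np

rng = np.random.default_rng(2)
template = np.sin(2 * np.pi * np.arange(32) / 8.0)    # known pulse shape
signal = np.zeros(256)
signal[100:132] += template                           # pulse embedded at offset 100
received = signal + 0.5 * rng.normal(size=signal.size)

# Matched filtering = correlation with the template.
output = np.correlate(received, template, mode="valid")
print("detected offset:", int(np.argmax(output)))     # ~100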

The mutual information principle and applications

Nicolas S. Tzannes, Joseph P. Noonan
1973 Information and Control  
A general theory of prior probability models is presented, valid for both discrete and continuous random variables, even when the prior information about them has been obtained with errors.  ...  In the first place, there is something intuitively pleasing about assigning a prior to X, so that the mutual information between X and the observed Y is minimized.  ...  Mutual information is invariant under linear transformations of the random variables, and it is also convex-∪ and thus possesses the desired minimum.  ... 
doi:10.1016/s0019-9958(73)90448-8 fatcat:yay6syc5mvbhtgjohn2sm53qtq

Evaluation of Symmetric Mutual Information of the Simplified TDMR Channel Model [article]

Tadashi Wadayama
2015 arXiv   pre-print
The proposed channel model incorporates the effects of both linear interference from adjacent bit-cells and signal-dependent noise due to irregular grain boundaries between adjacent bit-cells.  ...  The symmetric mutual information is closely related to the areal density limit for TDMR systems.  ...  Even the evaluation of the mutual information itself is not a simple problem, because we need to handle signal-dependent noise that results in a nonuniform conditional PDF.  ... 
arXiv:1503.08937v1 fatcat:7fw2gjplhzaphed6mohq4jwc6e

Mutual information and the F-theorem [article]

Horacio Casini, Marina Huerta, Robert C. Myers, Alexandre Yale
2015 arXiv   pre-print
We rederive the proof of the c-theorem for d=3 in terms of mutual information, and check our arguments with holographic entanglement entropy, a free scalar field, and an extensive mutual information model.  ...  A coefficient in the mutual information between concentric circular entangling surfaces gives a precise universal prescription for the monotonous quantity in the c-theorem for d=3.  ...  We begin regulating the entanglement entropy of Γ by evaluating the mutual information between two regions, where Γ+ corresponds to the boundary of the exterior region A+ and Γ− corresponds to the  ... 
arXiv:1506.06195v1 fatcat:jhaouzfnifhzdmeplvfvgxdqdu

Mutual information and the F-theorem

Horacio Casini, Marina Huerta, Robert C. Myers, Alexandre Yale
2015 Journal of High Energy Physics  
We rederive the proof of the c-theorem for d = 3 in terms of mutual information, and check our arguments with holographic entanglement entropy, a free scalar field, and an extensive mutual information  ...  A coefficient in the mutual information between concentric circular entangling surfaces gives a precise universal prescription for the monotonous quantity in the c-theorem for d = 3.  ...  The quantity of interest is then the mutual information between A+ and A−, i.e., I(A+, A−).  ... 
doi:10.1007/jhep10(2015)003 fatcat:txvw3ptnzzakvbrscj2ovjdcqu

The Information & Mutual Information Ratio for Counting Image Features and Their Matches [article]

Ali Khajegili Mirabadi, Stefano Rini
2020 arXiv   pre-print
Notably, the relationships of the IR and MIR with the image entropy and mutual information, classic information measures, are discussed.  ...  In this paper, two new image features are proposed: the Information Ratio (IR) and the Mutual Information Ratio (MIR).  ...  The IR and the MIR image features are fundamentally linked to the image entropy and image mutual information, which are fundamental measures of variability and dependence among images, respectively.  ... 
arXiv:2005.06739v1 fatcat:4eqyp3g2ubgpfeqixyqzzapota

Measuring Nonlinear Serial Dependencies Using the Mutual Information Coefficient

Witold Orzeszko
2010 Dynamic Econometric Models  
Moreover, the mutual information measure has been applied to the indices and the sector sub-indices of the Warsaw Stock Exchange.  ...  Keywords: nonlinearity, mutual information coefficient, mutual information, serial dependencies.  ...  Let i_k denote the estimated value of the mutual information measure between the variables X_t and X_{t−k}.  ... 
doi:10.12775/dem.2010.008 fatcat:swlc3bc53bhvdeszfyt3pgfsta
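A minimal sketch of the lagged construction this entry describes: estimate the mutual information between X_t and X_{t−k} over a range of lags k. The histogram estimator and the nonlinear moving-average series are assumptions for illustration:

import numpy as np

def lagged_mi(x, k, bins=12):
    a, b = x[k:], x[:-k]                  # pairs (X_t, X_{t-k})
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(3)
e = rng.normal(size=4001)
x = e[1:] + 0.8 * e[:-1] ** 2   # nonlinear MA(1): serially uncorrelated but dependent at lag 1
print([round(lagged_mi(x, k), 3) for k in (1, 2, 5, 10)])

The series is built so that its lag-1 dependence is purely nonlinear (the autocorrelation is zero), which is exactly the situation where an MI-based measure detects structure that a linear test misses.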

Jackknife approach to the estimation of mutual information

Xianli Zeng, Yingcun Xia, Howell Tong
2018 Proceedings of the National Academy of Sciences of the United States of America  
We are most grateful to the editor and two referees for their meticulous review, valuable comments, and constructive suggestions, which have led to a substantial improvement of this paper.  ...  Significance: As a fundamental concept in information theory, mutual information has been commonly applied to quantify the dependence between variables.  ...  Recent studies have focused on the renowned mutual information (MI).  ...  A key issue in data science is how to measure the dependence between two random variables.  ... 
doi:10.1073/pnas.1715593115 pmid:30224466 pmcid:PMC6176556 fatcat:gptof3nnevewzgymptp32mpmdm
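In the spirit of this entry, a generic jackknife bias-correction sketch applied to a plug-in histogram MI estimate; this illustrates only the leave-one-out mechanics and is not Zeng et al.'s exact construction:

import numpy as np

def mi_hist(x, y, bins=8):
    p, _, _ = np.histogram2d(x, y, bins=bins)
    p /= p.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

def jackknife_mi(x, y, bins=8):
    n = x.size
    full = mi_hist(x, y, bins)
    loo = np.mean([mi_hist(np.delete(x, i), np.delete(y, i), bins)
                   for i in range(n)])             # leave-one-out replicates
    return n * full - (n - 1) * loo                # jackknife bias correction

rng = np.random.default_rng(4)
x = rng.normal(size=300)
y = x + rng.normal(size=300)
print(mi_hist(x, y), jackknife_mi(x, y))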

Inferring the directionality of coupling with conditional mutual information

Martin Vejmelka, Milan Paluš
2008 Physical Review E  
In this paper, we discuss a nonparametric method for detecting the directionality of coupling based on the estimation of information theoretic functionals.  ...  Numerical experiments in detecting coupling directionality are performed using chaotic oscillators, where the influence of the phase extraction method and relative frequency ratio is investigated.  ...  ACKNOWLEDGMENTS This study was supported by the EC FP6 project BRAC-CIA Contract No. 517133 NEST and in part by the Institutional Research Plan No. AV0Z10300504.  ... 
doi:10.1103/physreve.77.026214 pmid:18352110 fatcat:6s3l5vxwmnd7hjfnagflm4fqve
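A hedged sketch of the conditional mutual information functional such methods estimate, via the entropy decomposition I(X;Y|Z) = H(X,Z) + H(Y,Z) − H(Z) − H(X,Y,Z); the plug-in histogram entropies and bin count are illustrative, and the paper's phase-extraction machinery is not reproduced here:

import numpy as np

def entropy(*cols, bins=8):
    """Plug-in joint entropy (nats) of one or more variables via a histogram."""
    h, _ = np.histogramdd(np.column_stack(cols), bins=bins)
    p = h[h > 0] / h.sum()
    return float(-np.sum(p * np.log(p)))

def cmi(x, y, z, bins=8):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)."""
    return (entropy(x, z, bins=bins) + entropy(y, z, bins=bins)
            - entropy(z, bins=bins) - entropy(x, y, z, bins=bins))

rng = np.random.default_rng(5)
z = rng.normal(size=5000)
x = z + 0.5 * rng.normal(size=5000)
y = z + 0.5 * rng.normal(size=5000)
print(cmi(x, y, z))   # near zero (up to plug-in bias): x, y independent given z

# Directionality idea from the entry above: compare I(x_t; y_{t+tau} | y_t)
# against I(y_t; x_{t+tau} | x_t); an asymmetry suggests the coupling direction.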

Estimation of mutual information by the fuzzy histogram

Maryam Amir Haeri, Mohammad Mehdi Ebadzadeh
2014 Fuzzy Optimization and Decision Making  
Mutual Information (MI) is an important dependency measure between random variables, due to its tight connection with information theory. It has numerous applications, both in theory and practice.  ...  Our experiments show that, in contrast to the naïve histogram MI estimator, the fuzzy-histogram MI estimator is able to reveal all dependencies between the gene-expression data.  ...  For a bivariate normal distribution, the mutual information between its variates depends only on the correlation coefficient ρ.  ... 
doi:10.1007/s10700-014-9178-0 fatcat:3y5dzqqobrdcjn5b5orsuxtwiy
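The closed form behind the last remark: for a bivariate normal pair with correlation ρ, I(X;Y) = −½ ln(1 − ρ²) nats. A quick numeric check of a crisp-histogram estimate against it (the paper's fuzzy-histogram variant is not reproduced here):

import numpy as np

rho = 0.7
exact = -0.5 * np.log(1 - rho ** 2)       # ≈ 0.337 nats

rng = np.random.default_rng(6)
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0, 0], cov, size=20000).T

p, _, _ = np.histogram2d(x, y, bins=30)   # crisp (non-fuzzy) binning
p /= p.sum()
px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
nz = p > 0
est = float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))
print(exact, est)                         # estimate close to the closed form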

Mutual Information Estimation using LSH Sampling

Ryan Spring, Anshumali Shrivastava
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
Current approaches in representation learning seek to maximize the mutual information between the learned representation and the original data.  ...  One of the most popular ways to estimate mutual information (MI) is based on Noise Contrastive Estimation (NCE).  ...  From an information-theoretic perspective, this amounts to maximizing the mutual information term I(y; S).  ... 
doi:10.24963/ijcai.2020/385 dblp:conf/ijcai/YuSAP20 fatcat:sdf77ztcpre4bfc6wdssxpcfcm
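A minimal NumPy sketch of the NCE-style lower bound (InfoNCE) that such estimators build on, scoring every in-batch pair; the exponential critic f(x, y) = exp(xy) and the Gaussian toy data are assumptions, and the LSH sampling scheme itself is not reproduced:

import numpy as np

rng = np.random.default_rng(7)
n, rho = 1024, 0.8
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho ** 2) * rng.normal(size=n)

# Critic scores for every pair (x_i, y_j); here log f(x, y) = x * y.
scores = np.outer(x, y)
log_softmax = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
info_nce = np.log(n) + np.mean(np.diag(log_softmax))   # lower bound on I(X;Y)
print(info_nce, -0.5 * np.log(1 - rho ** 2))           # bound vs. exact MI

The bound holds for any positive critic; a better-matched critic tightens it, which is why negative-sample selection (the role LSH plays in this entry) matters in practice.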

Mutual information analysis of the factors influencing port throughput

Majid Eskafi, Milad Kowsari, Ali Dastgheib, Gudmundur F. Ulfarsson, Poonam Taneja, Ragnheidur I. Thorarinsdottir
2020 Maritime Business Review  
The method gives a unique measure of dependence between two variables by quantifying the amount of information held in one variable through another variable.  ...  Design/methodology/approach: Mutual information is applied to measure the linear and nonlinear correlation among variables.  ...  Mutual information quantifies the statistical dependence between two random variables.  ... 
doi:10.1108/mabr-05-2020-0030 fatcat:qh7rp2nx5feb7evp6xlbhu46by

On the Effect of Suboptimal Estimation of Mutual Information in Feature Selection and Classification [article]

Kiran Karra, Lamine Mili
2021 arXiv   pre-print
... favorably to kNN, vME, AP, and H_MI estimators of mutual information.  ...  ... and discrete random variables need to be rank-ordered.  ... 
arXiv:1804.11021v3 fatcat:53ndba25hjd3zanjyqphpjqtr4
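A hedged sketch of the rank-ordering task this entry describes: score each feature by its estimated mutual information with the label and keep the top ones. scikit-learn's kNN-based estimator stands in for the estimators compared in the paper (kNN, vME, AP, H_MI), which are not reproduced here; the synthetic data are illustrative:

import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(8)
n = 2000
informative = rng.normal(size=(n, 2))     # columns 0-1 drive the label
noise = rng.normal(size=(n, 3))           # columns 2-4 are irrelevant
X = np.hstack([informative, noise])
y = (informative[:, 0] + informative[:, 1] ** 2 > 0.5).astype(int)

scores = mutual_info_classif(X, y, random_state=0)
ranking = np.argsort(scores)[::-1]        # features ordered by estimated MI
print(scores.round(3), ranking)           # columns 0 and 1 should rank first

Note that column 1 enters the label only through a non-monotone square, the kind of dependence where the quality of the MI estimator, the paper's subject, determines whether the ranking comes out right.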