Physics of the Shannon Limits
IEEE Transactions on Information Theory
We provide a simple physical interpretation, in the context of the second law of thermodynamics, of the information inequality (a.k.a. Gibbs' inequality, which is also equivalent to the log-sum inequality), asserting that the relative entropy between two probability distributions cannot be negative. Since this inequality stands at the basis of the data processing theorem (DPT), and the DPT in turn is at the heart of most, if not all, proofs of converse theorems in Shannon theory, it is

doi:10.1109/tit.2010.2053867
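For reference, the information inequality discussed in the abstract states that the relative entropy (Kullback-Leibler divergence) D(P||Q) between two probability distributions is nonnegative, with equality iff P = Q. A minimal numerical sketch (not from the paper; the function name and example distributions are illustrative):

```python
import math

def relative_entropy(p, q):
    """D(P||Q) = sum_x P(x) * log(P(x)/Q(x)), in nats.

    Terms with P(x) = 0 contribute 0 by the convention 0 log 0 = 0.
    By Gibbs' inequality this quantity is >= 0, and it equals 0 iff P == Q.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two example distributions on a three-letter alphabet (illustrative values).
p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]

print(relative_entropy(p, q))  # strictly positive, since p != q
print(relative_entropy(p, p))  # 0.0, since the distributions coincide
```

This nonnegativity is the property the paper ties to the second law of thermodynamics, and it underlies the data processing theorem mentioned above.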