The Data Compression Theorem for Ergodic Quantum Information Sources
[article]
2003
arXiv
pre-print
We extend the data compression theorem to the case of ergodic quantum information sources. ...
The rate of this compression scheme is equal to the von Neumann entropy rate. ...
The authors are grateful to Ruedi Seiler, Rainer Siegmund-Schultze and Tyll Krüger for helpful discussions and valuable comments on this paper. ...
arXiv:quant-ph/0301043v1
fatcat:ideltecajfdwneqijwijat7vze
The Data Compression Theorem for Ergodic Quantum Information Sources
2005
Quantum Information Processing
We extend the data compression theorem to the case of ergodic quantum information sources. ...
The rate of this compression scheme is equal to the von Neumann entropy rate. ...
The authors are grateful to Ruedi Seiler, Rainer Siegmund-Schultze and Tyll Krüger for helpful discussions and valuable comments on this paper. ...
doi:10.1007/s11128-003-3195-1
fatcat:bj4uxclth5ewrguv7h5t4wcloy
Smooth Rényi Entropy of Ergodic Quantum Information Sources
2007
2007 IEEE International Symposium on Information Theory
We will actually consider ergodic quantum information sources, of which ergodic classical information sources are a special case. ...
entropy rate for a quantum source. ...
ACKNOWLEDGMENT: Boris Skoric is gratefully acknowledged for discussions in the early stage of this work. ...
doi:10.1109/isit.2007.4557235
dblp:conf/isit/SchoenmakersTTV07
fatcat:z2he5ftj3nfovcfwdjhga7a3au
Page 3651 of Mathematical Reviews Vol. , Issue 99e
[page]
1999
Mathematical Reviews
For this extended context-tree weighting algorithm we show that with probability one the compression ratio is not larger than the source entropy for source sequence length T → ∞ for stationary and ergodic ...
A sufficient condition for information sources, ensuring decoder self-synchronisation for the T-encoded ...
Smooth Rényi Entropy of Ergodic Quantum Information Sources
[article]
2007
arXiv
pre-print
We prove that the average smooth Rényi entropy rate will approach the entropy rate of a stationary, ergodic information source, which is equal to the Shannon entropy rate for a classical information source and the von Neumann entropy rate for a quantum information source. ...
Clearly, the quantum AEP for ergodic sources implies the classical AEP for ergodic sources. ...
arXiv:0704.3504v1
fatcat:3rlqplmb3ng37ffjukhufv5wcq
Page 2258 of Mathematical Reviews Vol. , Issue 99c
[page]
1999
Mathematical Reviews
99c:94014
data compression. {For the entire collection see MR 98m:94002.} ...
The estimators are Cesaro averages of longest match-lengths, and their consistency follows from a generalized ergodic theorem due to Maker. ...
The Shannon-McMillan Theorem for Ergodic Quantum Lattice Systems
[article]
2002
arXiv
pre-print
The one-dimensional case covers quantum information sources and is basic for coding theorems. ...
The theorem demonstrates the significance of the von Neumann entropy for translation invariant ergodic quantum spin systems on n-dimensional lattices: the entropy gives the logarithm of the essential number ...
We want to express our deep thanks to Ruedi Seiler who emphasized the relevance of a quantum version of the Shannon-McMillan theorem to us. ...
arXiv:math/0207121v3
fatcat:d5cty6yu4vdzjibwifg6eu2ncq
Second-Order Asymptotics of Visible Mixed Quantum Source Coding via Universal Codes
2016
IEEE Transactions on Information Theory
Our results provide the first example of second order asymptotics for a quantum information-processing task employing a resource with memory. ...
The simplest example of a quantum information source with memory is a mixed source which emits signals entirely from one of two memoryless quantum sources with given a priori probabilities. ...
Acknowledgements We would like to thank Vincent Tan for useful discussions, and the anonymous referees for helpful feedback. ...
doi:10.1109/tit.2016.2571662
fatcat:l6jenaf4nng6ne5ulpslqd4t3u
Page 3377 of Mathematical Reviews Vol. , Issue 97E
[page]
1997
Mathematical Reviews
It is an approximate fixed-length string matching data compression combined with a block-coder based on the empirical distribution. ...
(IL-TLAVH-CME; Holon)
Optimal data compression algorithm. (English summary)
Comput. Math. Appl. 32 (1996), no. 5, 57-72. ...
Guest editorial
1998
IEEE Transactions on Information Theory
The timely tutorial survey of quantum information theory by Bennett and Shor gives an accessible introduction to compression, transmission and cryptography for quantum-mechanical models. ...
Csiszár gives an overview of the basic tools and achievements of this method. Determining the fundamental limits of sources and channels with memory often calls for ergodic-theoretic tools. ...
doi:10.1109/tit.1998.720529
fatcat:fqja5ctzhjhljlgswymiclcuvm
Fifty years of Shannon theory
1998
IEEE Transactions on Information Theory
A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication. ...
Unlike memoryless sources, for which the AEP is equivalent to the weak law of large numbers, showing that the AEP is satisfied for stationary ergodic sources requires a nontrivial use of the ergodic theorem ...
Because of its relevance to data compression, it is natural to investigate whether Theorem 3 applies to sources with memory. ...
doi:10.1109/18.720531
fatcat:gj4lp6rgtza5bd6becmv3kx2zu
Fifty Years of Shannon Theory
[chapter]
2009
Information Theory
A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication. ...
Unlike memoryless sources, for which the AEP is equivalent to the weak law of large numbers, showing that the AEP is satisfied for stationary ergodic sources requires a nontrivial use of the ergodic theorem ...
Because of its relevance to data compression, it is natural to investigate whether Theorem 3 applies to sources with memory. ...
doi:10.1109/9780470544907.ch2
fatcat:h22tqdqxsjcblhwqvrphmwzvhu
Data compression limit for an information source of interacting qubits
[article]
2002
arXiv
pre-print
We establish the limit for the compression of information from such a source and show that asymptotically it is given by the von Neumann entropy rate. ...
Our result can be viewed as a quantum analog of Shannon's noiseless coding theorem for a class of non-i.i.d. quantum information sources. ...
Hence, the data compression limit, for the class of non-i.i.d. quantum information sources considered in this paper, is given by the von Neumann entropy rate h. ...
arXiv:quant-ph/0207069v2
fatcat:hbtlh73wt5blxar7zpyh5jxawe
Page 7263 of Mathematical Reviews Vol. , Issue 2003i
[page]
2003
Mathematical Reviews
This is analogous to the Kraft inequality in lossless data compression. In the case of stationary ergodic sources our results reduce to the classical coding theorems. ...
In the present paper the authors show that for the quantum channel Φ(ρ) = (1/(d−1))(Tr(ρ)I − ρ^T) in the Hilbert space H, dim H = d ≥ 3, the multiplicativity conjecture is not valid for two copies of this channel ...
A Quantum Version of Sanov's Theorem
2005
Communications in Mathematical Physics
We present a quantum extension of a version of Sanov's theorem focussing on a hypothesis testing aspect of the theorem: There exists a sequence of typical subspaces for a given set Ψ of stationary quantum ...
Analogously to the classical case, the exponential separating rate is equal to the infimum of the quantum relative entropy with respect to the quantum reference state over the set Ψ. ...
This work was supported by the DFG via the project "Entropie, Geometrie und Kodierung großer Quanten-Informationssysteme", by the ESF via the project "Belearning" at the TU Berlin and the DFG-Forschergruppe ...
doi:10.1007/s00220-005-1426-2
fatcat:nzd6hhrvebdvbbv4ywde3cyyeq
Showing results 1 — 15 out of 970 results