On the entropy rate of word-valued sources

R. Timo, K. Blackmore, L. Hanlen
2007 Australasian Telecommunication Networks and Applications Conference
A word-valued source Y is a discrete, finite-alphabet random process created by encoding a discrete random process X with a symbol-to-word function f. The first result of this paper solves an open problem by proving an existence theorem for the entropy rate of word-valued sources: if X is ergodic, then the entropy rate of the word-valued source Y exists, and it is upper bounded by the entropy rate of X divided by the expected codeword length. More generally, if X is Asymptotically Mean Stationary (AMS), then the entropy rate of Y exists, and it is upper bounded by the expectation of the entropy rate of each stationary ergodic sub-source divided by the expected codeword length of that sub-source. The second result of this paper proves a "conservation of entropy" result for decodable word functions: if f is decodable and X is AMS, then the entropy rate of the word-valued source Y equals the expectation of the entropy rate of each stationary ergodic sub-source divided by the expected codeword length of that sub-source. This result generalizes existing entropy-conservation results for prefix-free word functions.
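The ergodic case of the bound can be illustrated numerically. The sketch below is not from the paper: it assumes an i.i.d. Bernoulli(1/2) source X and an illustrative prefix-free (hence decodable) symbol-to-word map f = {0 → "a", 1 → "bb"}, so H(X) = 1 bit and the expected codeword length is 1.5, giving the bound H(Y) ≤ 1/1.5 ≈ 0.667 bits per output symbol. A plug-in block-entropy estimate H_k(Y)/k of the encoded process decreases toward that value as k grows.

```python
import math
import random
from collections import Counter

random.seed(0)

# Illustrative symbol-to-word function f (not taken from the paper):
# prefix-free over the output alphabet {a, b}, hence decodable.
f = {0: "a", 1: "bb"}

h_X = 1.0                                        # entropy rate of i.i.d. Bernoulli(1/2), bits/symbol
E_len = sum(0.5 * len(w) for w in f.values())    # expected codeword length = 1.5
bound = h_X / E_len                              # the paper's bound: H(Y) <= H(X) / E[length]

# Simulate Y by encoding a long realisation of X with f.
n = 200_000
y = "".join(f[random.randrange(2)] for _ in range(n))

def block_entropy_rate(seq, k):
    """Plug-in estimate of H_k(Y)/k in bits per output symbol."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    H = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return H / k

for k in (1, 2, 4, 8):
    print(f"k={k}: H_k/k ~ {block_entropy_rate(y, k):.3f}  (bound = {bound:.3f})")
```

The k = 1 estimate is close to the single-letter entropy H(1/3) ≈ 0.918 bits (a's occur with long-run frequency 1/3), and the per-symbol block entropy shrinks toward the bound as k increases, consistent with equality holding for decodable f.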
doi:10.1109/atnac.2007.4665292