Scanning and prediction in multi-dimensional data arrays

N. Merhav, T. Weissman
Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2002.
The problem of sequentially scanning and predicting data arranged in a multidimensional array is considered. We introduce the notion of a scandictor, which is any scheme for the sequential scanning and prediction of such multidimensional data. The scandictability of any finite (probabilistic) data array is defined as the best achievable expected "scandiction" performance on that array. The scandictability of any (spatially) stationary random field on Z^m is defined as the limit of its scandictability on finite "boxes" (subsets of Z^m), as their edges become large. The limit is shown to exist for any stationary field and to be essentially independent of the ratios between the box dimensions. Fundamental limitations on scandiction performance in both the probabilistic and the deterministic settings are characterized for the family of difference loss functions. We find that any stochastic process or random field that can be generated autoregressively with a maximum-entropy innovation process is optimally "scandicted" the way it was generated. These results are specialized to cases of particular interest. The scandictability of any stationary Gaussian field under the squared-error loss function is given a single-letter expression in terms of its spectral measure and is shown to be attained by the raster scan. For a family of binary Markov random fields (MRFs), the scandictability under the Hamming distortion measure is fully characterized.
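As a concrete illustration of the scandiction setup described in the abstract, the sketch below (not from the paper) runs one particular scandictor on a single realization of a 2D array: the scanner visits sites in raster order and the predictor guesses each value from the already-observed past, accumulating squared-error loss. The `mean_predictor` used here is a hypothetical placeholder; scandictability itself is defined via an expectation over the field and an optimization over all scandictors (scan order and predictor jointly), which this sketch does not attempt to compute.

```python
import numpy as np

def raster_scandict(field, predictor):
    """Scan a 2D array in raster order, predicting each entry from the
    already-observed past; return the normalized cumulative squared-error loss.

    field     : 2D numpy array (one realization of the data array).
    predictor : callable mapping (observations so far, next site) to a
                prediction for that site's value.
    """
    n_rows, n_cols = field.shape
    observed = []          # sequence of ((i, j), value) pairs seen so far
    total_loss = 0.0
    for i in range(n_rows):          # raster scan: row by row,
        for j in range(n_cols):      # left to right within each row
            prediction = predictor(observed, (i, j))
            total_loss += (field[i, j] - prediction) ** 2
            observed.append(((i, j), field[i, j]))
    return total_loss / field.size   # per-site squared-error loss


def mean_predictor(observed, site):
    """Hypothetical placeholder predictor: empirical mean of past values."""
    if not observed:
        return 0.0
    return float(np.mean([v for _, v in observed]))


# Example: per-site loss of the raster scandictor on an i.i.d. Gaussian
# array (a crude stand-in for a stationary Gaussian field).
rng = np.random.default_rng(0)
sample = rng.normal(size=(64, 64))
print(raster_scandict(sample, mean_predictor))
```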
doi:10.1109/isit.2002.1023589