Vector quantization of contextual information for lossless image compression

X. Ginesta, S.P. Kim
Proceedings of IEEE Data Compression Conference (DCC'94)  
In the companion paper [12], we presented a context-tree based lossless image compression approach. The most important step in the implementation of the proposed algorithm is the construction of a context-tree from a given (set of) training image(s). It has been shown that the design process of a context-tree parallels the idea of non-binary Pruned Tree Structured Vector Quantization (PTSVQ). Vectors are represented by the conditional probability tables of the contexts identified by the tree, and their respective distances are measured in terms of the overall entropy increase after two tables (vectors) are merged. The PTSVQ requires a fully grown initial context-tree, which is impossible to achieve in practice due to the enormous associated memory requirements. Moreover, even if a full context tree could be grown, the huge number of contexts that could potentially appear would make the optimal pruning procedure computationally impractical. Therefore, the conventional approach of optimally pruning the worst branches from the initial tree until some performance constraint is satisfied is not applicable. In this paper we present a new TSVQ algorithm, called Incremental Tree Growing (ITG), which incrementally grows and quantizes the context tree locally on a level-by-level basis, thus drastically reducing both the memory requirements and the computational complexity. After the Incremental Tree Growing is completed, terminal branches are globally vector quantized again, which is possible due to the significant reduction in the number of initial branches. Using a technique similar in spirit to the Mean Removed VQ (MRVQ) [11], a significant reduction in the number of probability tables is achieved. In one of our simulations, for example (see [12]), we reduced the number of probability tables from 262,144 (= 2^18) to 108 using the ITG algorithm, and from 108 to 10 using the modified MRVQ, at a conditional entropy increase of only 1.26%. In summary, the proposed ITG algorithm combined with a modified MRVQ provides an efficient framework for the design of context-trees with reduced memory and computational requirements.
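The merge distance described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes each context's conditional probability table is kept as a vector of symbol counts, and defines the distance between two tables as the total code-length increase (in bits) incurred by merging them, which is the "overall entropy increase" criterion the abstract describes.

```python
import math

def table_entropy(counts):
    """Entropy in bits/symbol of the distribution given by symbol counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

def merge_cost(c1, c2):
    """Hypothetical merge distance: total code-length increase (bits) when
    two context tables (count vectors c1, c2) are replaced by their union.

    Cost = (n1 + n2) * H(merged) - (n1 * H(p1) + n2 * H(p2)), which is
    always >= 0 and is 0 iff the two tables have identical distributions.
    """
    n1, n2 = sum(c1), sum(c2)
    merged = [a + b for a, b in zip(c1, c2)]
    before = n1 * table_entropy(c1) + n2 * table_entropy(c2)
    after = (n1 + n2) * table_entropy(merged)
    return after - before
```

A greedy quantizer in this spirit would repeatedly merge the pair of tables with the smallest `merge_cost` until the table budget or an entropy-increase threshold (e.g. the 1.26% figure quoted above) is reached.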
doi:10.1109/dcc.1994.305947 dblp:conf/dcc/GinestaK94