
1995 Plenary Lecture

Note: This material was originally presented at the 1995 Information Theory Symposium and has since been published in a special issue of the Russian journal Problems of Information Transmission dedicated to Mark Pinsker on his 70th birthday. Many thanks to Jacob Ziv and to Plenum Publishing Corp. for granting us permission to reprint this article.

Inequalities and Algorithms for Universal Data Compression

Jacob Ziv
Department of Electrical Engineering
Technion, Haifa, Israel

Abstract:

We describe non-asymptotic uniform bounds on the performance of data-compression algorithms in cases where the length $N$ of the training sequence ("history") that is available to the encoder is not large enough to yield the ultimate compression, namely the entropy of the source. Two characteristic ultimate goals are considered: the $\ell$-th order entropy $H_\ell$, and the associated conditional entropy $H(X_\ell \mid X_1^{\ell-1})$. The bounds are based on classical information-theoretic convexity arguments. Nevertheless, it is demonstrated that convexity arguments that work for some cases are totally useless for others. Furthermore, these classical convexity arguments, when properly used, lead to efficient universal data-compression algorithms for each of the considered cases.

We limit the discussion to Fixed-to-Variable, uniquely decipherable codes.

We consider Fixed-to-Variable universal encoders with limited training data that noiselessly compress blocks of some fixed length $\ell$, and we derive bounds on the rate at which the compression approaches the $\ell$-th order entropy, and the smaller conditional entropy, as a function of $\ell$ and of the length $N$ of the training sequence.
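
To make the setting concrete, the following is a minimal Python sketch (with hypothetical names; it is not the encoder analyzed in this lecture) of a plug-in scheme that estimates the $\ell$-th order statistics from a training sequence of length $N$ and assigns the ideal code lengths $-\log_2 \hat{P}$. When $N$ is small the estimates are crude, so the achieved per-letter rate stays above the $\ell$-th order entropy.

```python
import math
from collections import Counter

def plug_in_code_lengths(training, ell):
    """Estimate ell-block probabilities from the overlapping blocks of a
    training sequence and return the ideal (Shannon) code lengths -log2(p_hat).
    Blocks never seen in training would need a separate escape mechanism."""
    blocks = [tuple(training[i:i + ell]) for i in range(len(training) - ell + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    return {b: -math.log2(c / total) for b, c in counts.items()}

# Toy usage: a short binary training sequence (N = 12), blocks of length ell = 2.
training = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1]
for block, bits in sorted(plug_in_code_lengths(training, ell=2).items()):
    print(block, round(bits, 3))
```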

Let

$$X_i^j \equiv X_i, X_{i+1}, \ldots, X_j; \qquad P(x_1^\ell) \equiv \Pr\{X_1^\ell = x_1^\ell\}.$$

For stationary sources, the $\ell$-th order per-letter entropy is:

$$H_\ell \equiv \frac{1}{\ell}\,H(X_1^\ell) = -\frac{1}{\ell}\sum_{x_1^\ell} P(x_1^\ell)\,\log P(x_1^\ell)$$

and

$$H(X_\ell \mid X_1^{\ell-1}) = H(X_1^\ell) - H(X_1^{\ell-1}) \;\le\; H_\ell . \qquad (1)$$
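
As a small numerical illustration (added here; it is not part of the original text), the Python sketch below assumes a hypothetical two-state Markov source, computes $H_\ell$ and $H(X_\ell \mid X_1^{\ell-1})$ from the block probabilities, and confirms that the conditional entropy is the smaller of the two quantities.

```python
import math
from itertools import product

# Hypothetical binary Markov source: P(1|0) = 0.3, P(1|1) = 0.8.
P = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
pi = {0: 0.4, 1: 0.6}  # stationary distribution of this chain

def block_prob(block):
    """Probability of the block x_1^l under the stationary Markov source."""
    p = pi[block[0]]
    for a, b in zip(block, block[1:]):
        p *= P[a][b]
    return p

def block_entropy(ell):
    """H(X_1^ell) in bits."""
    probs = [block_prob(x) for x in product((0, 1), repeat=ell)]
    return -sum(p * math.log2(p) for p in probs if p > 0)

ell = 4
H_ell = block_entropy(ell) / ell                       # per-letter ell-th order entropy
H_cond = block_entropy(ell) - block_entropy(ell - 1)   # H(X_ell | X_1^{ell-1})
print(f"H_{ell} = {H_ell:.4f}  >=  H(X_{ell} | X_1^{ell-1}) = {H_cond:.4f}")
```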

Now, let $L(X_1^\ell)$ be the length function of $X_1^\ell$ (namely, the number of bits that represent $X_1^\ell$). Then [1]:

$$\frac{1}{\ell}\,E\,L(X_1^\ell) \;\ge\; \frac{1}{\ell}\,H(X_1^\ell) = H_\ell . \qquad (2)$$
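
As a sanity check on this bound (again an added illustration, not the original argument), the Python sketch below builds a binary Huffman code (one example of a uniquely decipherable Fixed-to-Variable code) for a hypothetical block distribution and verifies that the expected codeword length is at least the block entropy, as (2) requires.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for {symbol: probability}.
    Each merge of the two least likely groups adds one bit to every member."""
    heap = [(p, i, (sym,)) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    while len(heap) > 1:
        p1, _, group1 = heapq.heappop(heap)
        p2, tie, group2 = heapq.heappop(heap)
        for sym in group1 + group2:
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, tie, group1 + group2))
    return lengths

# Hypothetical distribution over 2-blocks of a binary source.
probs = {'00': 0.5, '01': 0.2, '10': 0.2, '11': 0.1}
lengths = huffman_lengths(probs)
expected_len = sum(probs[s] * lengths[s] for s in probs)
entropy = -sum(p * math.log2(p) for p in probs.values())
print(f"E L = {expected_len:.3f} bits  >=  H = {entropy:.3f} bits")
```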

Assume that only the $\ell$-th order statistics of the information source are known to the encoder. Then, equipped with this information, we can establish the following Coding Theorem.



