Entropy/connections between different meanings of entropy/entropy example 5


    Consider the infinite string obtained by concatenating, one after another, all binary words of length 1, then of length 2, and so on, each length group ordered lexicographically:

    \[ B=0,1,\,00,01,10,11,\,000,001,010,011,100,101,110,111,\,0000,0001\dots \]
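
    To make the construction concrete, here is a minimal Python sketch (the helper name prefix_of_B is ours, not from the article) that builds a finite prefix of \(B\):

        from itertools import product

        def prefix_of_B(max_len):
            # Hypothetical helper: concatenate all binary words of length
            # 1, 2, ..., max_len, each length group in lexicographic order,
            # exactly as in the definition of B above.
            return "".join(
                "".join(word)
                for n in range(1, max_len + 1)
                for word in product("01", repeat=n)  # lexicographic order
            )

        print(prefix_of_B(3))  # -> 0100011011000001010011100101110111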

    It is not hard to see that the block frequencies along this string converge to those of the uniform (Bernoulli(1/2, 1/2)) measure, whose Kolmogorov-Sinai entropy equals 1. So, according to the assertion of the Compression Theorem, \(B\) should not allow any compression. On the other hand, since \(B\) is completely determined by a finite set of instructions (as given above), its Kolmogorov complexity rate is zero: the complexity of its initial \(n\)-block grows only like \(\log n\) (a fixed program plus a description of \(n\)), so the complexity per symbol tends to 0. This phenomenon reflects the fact that the Compression Theorem takes into account only frequency-based data compression codes, while Kolmogorov complexity may exploit any algorithmic regularity in the structure of the message. Nevertheless, the Compression Theorem guarantees that the collection of messages with regularities undetected by frequency-based algorithms has measure zero.
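
    The frequency claim can be checked empirically. The following sketch (again ours, not part of the article) estimates the empirical entropy per symbol of overlapping \(k\)-blocks in a long prefix of \(B\); if the block frequencies indeed approach the uniform measure, these values should be close to 1 bit per symbol:

        from collections import Counter
        from itertools import product
        from math import log2

        def prefix_of_B(max_len):
            # Same hypothetical construction as above.
            return "".join("".join(w)
                           for n in range(1, max_len + 1)
                           for w in product("01", repeat=n))

        s = prefix_of_B(14)  # roughly 4.3 * 10**5 bits
        for k in (1, 2, 4, 8):
            # Empirical distribution of overlapping k-blocks.
            counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
            total = sum(counts.values())
            h = -sum(c / total * log2(c / total) for c in counts.values()) / k
            print(k, round(h, 4))  # expected: values near 1 bit per symbol

    Note that such a computation detects only frequency regularities, which is exactly the limitation of frequency-based codes that this example illustrates.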
