No joke, I'm completely out of my depth on this subject.
I do have an academic interest in "compression", yes, and after poking around the Internet on the pow&poc subject I understood that really digging into it takes time. It's a whole field: ..., distributed computing, game theory... It's not my area, so I think you need a specialized forum to keep digging, etc.
Lately my own problems are more prosaic: how to embed one thing into another by a standard method when there is no documentation, etc. That is a real problem; there seem to be people who know, but they don't explain, since everyone has absorbed the idea that information decides everything.
From the Internet:
Vyugin, "Kolmogorov Complexity and Algorithmic Randomness", 2012.
Something on the subject of statistical tests (nrjetix_com/fileadmin/doc/publications/Lectures_security/Lecture2_pdf).
Something on comparing archivers (ict_nsc_ru/ws/YM2007/12817/paper_html).
There's also a book here at home, Mir, 1988, roughly "Information, Uncertainty, Complexity" (well, the cover says Traub). It's tedious, though; that's just how they write over there; it didn't inspire me at the time.
Each of them has a bibliography at the end.
Kolmogorov's definition of the amount of information is more adequate than Shannon entropy, and it somewhat resembles conditional probability: with it you can no longer factor the centralized volume out of the brackets. But it is inconvenient to implement.
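The resemblance to conditional probability can be made precise. The Kolmogorov–Levin chain rule (covered in Vyugin's book) parallels Shannon's chain rule for entropy, only up to a logarithmic error term; a sketch of the two side by side:

```latex
% Shannon's chain rule for entropy (exact):
H(X, Y) = H(Y) + H(X \mid Y)

% Kolmogorov-Levin chain rule for plain complexity (up to log terms):
K(x, y) = K(y) + K(x \mid y) + O\bigl(\log K(x, y)\bigr)
```

The conditional complexity K(x|y) plays the role of conditional entropy, which is why the analogy with conditional probability keeps surfacing.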
Coming back once more to the sigma (item 2 in the first post). Typical methods are "entropy" methods: they reduce the average entropy, and such a method obviously has a limit. That is the "Shannon" approach. So it seems unlikely that an unconditional leader across all tests can be found. And, for example, moving the centralized part out of the compressor is already ..., but for pow that does not seem suitable.
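To make the "entropy methods have a limit" point concrete, here is a small sketch of my own (not from the post): it computes an order-0 Shannon bound for a byte string and compares it with what zlib actually achieves. On repetitive data zlib lands far below the order-0 bound, which shows the "limit" is relative to the statistical model, not absolute:

```python
import math
import zlib

def shannon_entropy_bits(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte (order-0, i.i.d. byte model)."""
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

data = b"abracadabra " * 400          # highly repetitive sample input
h = shannon_entropy_bits(data)        # bits per byte under the order-0 model
order0_bound = h * len(data) / 8      # Shannon lower bound (bytes) for that model
compressed = len(zlib.compress(data, 9))

# zlib (LZ77 + Huffman) exploits repetition the order-0 model cannot see,
# so it compresses well below the order-0 "entropy limit" here.
print(f"entropy {h:.3f} bits/byte, order-0 bound ~{order0_bound:.0f} B, zlib {compressed} B")
```

The takeaway matches the post: each "entropy" method has a ceiling, but the ceiling depends on the model the method assumes.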
There may also be a feature of taking the file in whole (storing it uncompressed) if it is foreseen that the metadata would cancel out the gain from compression.
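That store-instead-of-compress fallback is real: ZIP, for one, has a "stored" method for exactly this case. It is easy to see why with a minimal sketch, since for tiny inputs the container overhead (header, checksum, block framing) outweighs any gain:

```python
import zlib

small = b"hello"
packed = zlib.compress(small, 9)

# zlib wraps deflate output in a 2-byte header plus a 4-byte Adler-32
# checksum, and deflate itself has per-block overhead, so a tiny input
# comes out *longer* than it went in.
print(len(small), len(packed))
```

Here the "compressed" stream is longer than the original, which is precisely when an archiver should just store the bytes as-is.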