Hello, Kodt, you wrote:

> Can it be done better?

The previous message already describes arithmetic coding, with an example of how not to lose entropy in either branch (the unsuccessful one, where no answer can be given yet, and the successful one, where an answer can be returned). Of course, it does not describe full arithmetic coding: that is hard to write and, generally speaking, requires arithmetic of ever-growing precision. In practice nobody actually does this: real compressors simply decide periodically (at points agreed upon in advance) to round the numbers off. For compression this is not critical: only the compression ratio suffers slightly, and the ability to decompress is preserved. For converting probabilities this approach is bad in that it gives a non-uniform distribution (although, clearly, in practice the non-uniformity can easily be brought down to around 2^-60 even without any clever long arithmetic, which is all but imperceptible). Alternatively, whenever such a situation is detected, one can throw everything away and start over from the beginning: then uniformity holds, but entropy utilization gets worse (we make more calls to rand than necessary).

In short, if many numbers are needed, such a coder turns out to be very easy to write; it changes literally a couple of lines in the previous source code:

    using ui = uint64_t; // s = 5, d = 7

    template <ui s, ui d>
    ui gen() {
        static ui g = 1, v = 0;  // g: size of the current range, v: uniform value in [0, g)
        constexpr ui max = std::numeric_limits<ui>::max() / s;
        do {
            // top up the entropy pool with base-s digits while the range still fits
            while (g <= max) { g *= s; v = v * s + random<s>(); }
            ui f = g / d * d;    // largest multiple of d not exceeding g
            if (v < f) {
                ui r = v % d;    // uniform digit in [0, d)
                g /= d;
                v /= d;          // keep the leftover entropy in (g, v)
                return r;
            }
            g -= f;
            v -= f;              // rare tail: shrink the range and retry
        } while (1);
    }

The price is lag, i.e. latency: true arithmetic coding (like all the other ideas in this thread) tries to answer as soon as possible, that is, as soon as one answer can be distinguished from another. This code, by contrast, always makes several "idle" calls to random first, and only then starts producing answers.
Such entropy accumulation also makes the situation where the loop has to be restarted very improbable. Moreover, this entropy is not lost: if desired, it can be extracted back at the end, at the cost of complicating the code. That is, if a million random numbers are needed under the 5 -> 7 scheme, there will first be about 24 calls to random; then, for the following almost-million numbers, an average of log_5 7 calls to random per number; and for the last several numbers no calls to random at all, since the entropy already stored in the counters suffices. In other words, the 24 calls made at the very beginning do not have to be wasted.

I was too lazy to work out the exact values (they come out simply only for the first two algorithms), so I just ran a few experiments generating 10^8 random numbers. The five columns are:

1. "transfer" (author: Sinix, 08.04 21:35)
2. storing failure (09.04 03:23)
3. storing success (author: watchmaker, 09.04 03:59)
4. entropy accumulation (this message)
5. arithmetic coding (theoretical minimum)

Here "calls per answer" is the average number of calls to random per generated number; "excess calls" is the overhead relative to the minimum log_s d; "latency" is the number of "idle" calls to random made before the first answer.

    conversion          (1)        (2)        (3)        (4)        (5)
    8 -> 2
     calls per answer   1.00000    1.00000    0.33333    0.33333    0.33333
     excess calls       +200.000%  +200.000%  +0.000%    +0.000%    +0.000%
     latency            0          0          0          20         n/a
    5 -> 7
     calls per answer   2.38086    2.21235    1.54579    1.20906    1.20906
     excess calls       +96.918%   +82.981%   +27.850%   +0.000%    +0.000%
     latency            0          0          0          24         n/a
    7 -> 5
     calls per answer   1.39989    1.37666    1.15922    0.82709    0.82709
     excess calls       +69.255%   +66.446%   +40.157%   +0.000%    +0.000%
     latency            0          0          1          21         n/a
    3 -> 14
     calls per answer   5.78528    3.64286    3.23388    2.40217    2.40217
     excess calls       +140.835%  +51.648%   +34.623%   +0.000%    +0.000%
     latency            0          0          3          33         n/a
    14 -> 3
     calls per answer   1.16664    1.14884    0.59734    0.41629    0.41629
     excess calls       +180.247%  +175.970%  +43.490%   +0.000%    +0.000%
     latency            0          0          0          15         n/a
    3 -> 2
     calls per answer   1.50010    1.49976    1.50003    0.63093    0.63093
     excess calls       +137.759%  +137.706%  +137.749%  +0.000%    +0.000%
     latency            0          0          0          39         n/a
    2 -> 3
     calls per answer   2.66673    2.66664    2.66658    1.58496    1.58496
     excess calls       +68.252%   +68.246%   +68.242%   +0.000%    +0.000%
     latency            0          0          0          59         n/a
    14 -> 15
     calls per answer   2.01024    2.01026    1.14870    1.02614    1.02614
     excess calls       +95.903%   +95.904%   +11.943%   +0.000%    +0.000%
     latency            0          0          0          13         n/a
    15 -> 14
     calls per answer   1.07142    1.07142    1.07140    0.97452    0.97452
     excess calls       +9.944%    +9.943%    +9.941%    +0.000%    +0.000%
     latency            1          0          0          14         n/a

Overall it is clear that for the 5 -> 7 conversion, the strategy "draw twice; if it comes out greater than 21, redraw" gives 97% excess calls compared to the optimal strategy, which takes log_5 7 calls. Adding Kodt's optimization reduces the excess to 83%. Also remembering the values of the variables between calls brings the losses down to 28%. With entropy accumulated in advance there were no excess calls at all: the do-while loop was never executed twice, and the losses from rounding off before return never showed up either. That does not always happen, and in total the losses are nonzero; but, if I estimated correctly, the probability that we will need more calls to random than arithmetic coding does is, over 10^8 calls, approximately 7*10^-11.