The entropy of a Huffman tree can be defined as the average of the codeword lengths, weighted by the coefficients p_w = 2^(-|w|) for each codeword w. It turns out [1] that the entropy of a Huffman tree is a measure of the optimal compression of a sequence of random objects drawn according to the probabilities p_w. The average number of bits per leaf is at its smallest when each leaf is represented by its codeword.
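The definition above can be computed directly from the list of codeword lengths. The following sketch is illustrative only (the function name and the example lengths are my own, not from the text); it checks that the lengths satisfy the Kraft equality, as they must for a full Huffman tree, and then evaluates H = sum over w of p_w * |w| with p_w = 2^(-|w|):

```python
def tree_entropy(lengths):
    """Entropy of a Huffman tree given its codeword lengths |w|.

    Each leaf carries probability p_w = 2**(-|w|); the entropy is the
    p_w-weighted average codeword length.
    """
    # For a full binary (Huffman) tree the leaf probabilities sum to 1.
    assert abs(sum(2.0 ** -l for l in lengths) - 1.0) < 1e-9, \
        "lengths must satisfy the Kraft equality"
    return sum(l * 2.0 ** -l for l in lengths)

# Example: the code {0, 10, 110, 111} has lengths 1, 2, 3, 3.
print(tree_entropy([1, 2, 3, 3]))  # 1.75 bits per leaf
```

For the example, H = 1(1/2) + 2(1/4) + 3(1/8) + 3(1/8) = 1.75, in agreement with the weighted-average definition.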
The entropy H of our tree is the average distance from root to leaf, and a complete binary tree of depth H has 2^H leaves. Thus it is intuitively appealing to assume that the number of leaves swept out by our optimal guessing strategy is an appreciable fraction of the total, and since we are picking the 2^H most probable leaves, it is tempting to assume that we would have at least even odds of winning the game.
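That intuition can be checked numerically on small trees: take the floor(2^H) most likely leaves and add up their probability. The helper below is a hypothetical illustration (the name and example are mine); for a well-balanced code the covered mass is indeed large, which is exactly the instinct the rest of this document undermines:

```python
def mass_of_top_leaves(lengths):
    """Total probability of the floor(2^H) most likely leaves,
    where H is the tree entropy and each leaf has p_w = 2**(-|w|)."""
    H = sum(l * 2.0 ** -l for l in lengths)
    probs = sorted((2.0 ** -l for l in lengths), reverse=True)
    k = int(2.0 ** H)  # how many guesses the 2^H intuition allows
    return sum(probs[:k])

# Lengths 1, 2, 3, 3 give H = 1.75, so we may guess floor(2^1.75) = 3
# leaves, covering 1/2 + 1/4 + 1/8 = 87.5% of the probability mass.
print(mass_of_top_leaves([1, 2, 3, 3]))  # 0.875
```

On such small, balanced examples the top 2^H leaves comfortably exceed even odds; the point of the trees constructed below is that this fails badly in general.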
This document identifies Huffman trees in which the chance of beating the carny not only falls below even odds, but in fact goes to zero in the limiting case.
Notice that there are subtrees of t_{2,3} which are isomorphic to t_{2,2} and t_{2,1}. Furthermore, notice the zig-zag pattern formed in the left halftree by starting from the root and alternating between the proper prefixes of the short codewords and the roots of the subtrees isomorphic to t_{j,i}, for i < k. These observations characterize the self-similarity of these trees. In fact, for a fixed j, the family of trees t_{j,k} can be thought of as truncations of an infinite tree T_j to depth jk. More of T_2 is shown below.
Part of T_3 is shown below.
Theorem. As k approaches infinity,
Notice that this probability is slightly more than the chance 2^(-j) of guessing correctly on the very first try. Clearly, as both j and k become large, your chance of winning this shell game approaches zero.
If we trust our instincts that the total probability of the 2^H most likely events is about 50%, the carny can trick us into playing a game in which we literally have one chance in a billion of winning.
[2] John O. Pliam, Ciphers and their Products: Group Theory in Private Key Cryptography, PhD thesis, University of Minnesota, to appear 1999. See also [3].
[3] John O. Pliam, "The Disparity between Work and Entropy in Cryptology," 1998.