Exact Probability Distribution versus Entropy
Abstract
The problem addressed concerns the determination of the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy is to guess words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary to estimate the number of guesses. Several kinds of approximations are discussed, with moderate requirements on both memory and central processing unit (CPU) time. For realistic sizes of alphabets and words (around 100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. In those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average, compared to the total number of words, decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
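As a concrete illustration of the quantity the abstract discusses, the following is a minimal Python sketch that computes the exact average number of guesses by brute-force enumeration, assuming i.i.d. letters (the first-order approximation). The function name average_guesses and the example alphabet are illustrative, not from the paper; the printed lower bound is Massey's (1994) analytical bound, given here as one known example of the lower bounds the abstract refers to, not necessarily the one used by the author.

import itertools
import math

def average_guesses(letter_probs, word_length):
    # Probability of each candidate word: the product of its letter
    # probabilities (letters assumed independent, i.e. first-order model).
    word_probs = [math.prod(w)
                  for w in itertools.product(letter_probs, repeat=word_length)]
    # The strategy from the abstract: guess words in decreasing
    # order of probability.
    word_probs.sort(reverse=True)
    # Expected number of guesses: sum over ranks i of i * P(word at rank i).
    return sum(i * p for i, p in enumerate(word_probs, start=1))

# Illustrative numbers (not from the paper): a skewed 3-letter alphabet,
# words of length 4, i.e. 3**4 = 81 candidate words.
letter_probs = [0.5, 0.3, 0.2]
n = 4
g = average_guesses(letter_probs, n)

# Per-letter Shannon entropy, and Massey's (1994) lower bound
# E[G] >= 2**(H - 2) + 1, valid when the word entropy H = n*h >= 2 bits.
h = -sum(p * math.log2(p) for p in letter_probs)
massey = 2 ** (n * h - 2) + 1

print(f"average guesses: {g:.2f} out of {len(letter_probs) ** n} words")
print(f"Massey lower bound: {massey:.2f} (word entropy {n * h:.2f} bits)")

Note that this enumeration grows as the alphabet size raised to the word length, so it is feasible only for small examples; that explosion is precisely why the paper develops approximations for realistic alphabet and word sizes.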
Share & Cite This Article
MDPI and ACS Style
Andersson, K. Exact Probability Distribution versus Entropy. Entropy 2014, 16, 5198-5210.

AMA Style
Andersson K. Exact Probability Distribution versus Entropy. Entropy. 2014; 16(10):5198-5210.

Chicago/Turabian Style
Andersson, Kerstin. 2014. "Exact Probability Distribution versus Entropy." Entropy 16, no. 10: 5198-5210.