Open Access Article
Entropy 2014, 16(10), 5198-5210; doi:10.3390/e16105198

Exact Probability Distribution versus Entropy

K. Andersson
Department of Mathematics and Computer Science, Karlstad University, SE-651 88 Karlstad, Sweden
Received: 26 May 2014 / Revised: 20 September 2014 / Accepted: 26 September 2014 / Published: 7 October 2014
(This article belongs to the Section Information Theory)

Abstract

The problem addressed is determining the average number of successive attempts needed to guess a word of a given length whose letters occur with known probabilities. Both first- and second-order approximations to a natural language are considered. The guessing strategy is to try words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary to estimate the number of guesses. Several kinds of approximations are discussed; they demonstrate moderate requirements on both memory and central processing unit (CPU) time. For realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes to reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. In those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average, compared to the total number of words, decreases almost exponentially with word length, and the leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
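
To make the setting concrete, the following is a minimal sketch (in Python, not taken from the paper) of the brute-force baseline that the paper's approximations are meant to avoid: for a toy three-letter alphabet with assumed probabilities and a short word length, it enumerates every word under the first-order (independent-letter) model, sorts the words in decreasing order of probability, and computes the exact average number of guesses as the rank-weighted sum of probabilities. The alphabet, probabilities, and word length here are illustrative assumptions; at realistic sizes (around 100) the number of words grows as |A|^n and this enumeration becomes infeasible, which is why approximations are needed.

    # Minimal illustrative sketch (not the paper's code): exact expected
    # number of guesses E[G] = sum_i i * p_(i), where p_(1) >= p_(2) >= ...
    # are the word probabilities sorted in decreasing order. Letters are
    # drawn independently (first-order approximation); the alphabet and
    # probabilities below are assumed toy values.
    from itertools import product
    from math import prod

    letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}  # assumed toy distribution
    n = 4                                          # word length (small, so |A|**n stays tractable)

    # Probability of each possible word as the product of its letter probabilities.
    word_probs = [prod(letter_probs[ch] for ch in word)
                  for word in product(letter_probs, repeat=n)]

    # Optimal guesser: try words in decreasing order of probability.
    word_probs.sort(reverse=True)

    expected_guesses = sum(rank * p for rank, p in enumerate(word_probs, start=1))
    print(f"{len(word_probs)} possible words; average number of guesses: {expected_guesses:.4f}")
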
Keywords: information entropy; security; guessing
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Andersson, K. Exact Probability Distribution versus Entropy. Entropy 2014, 16, 5198-5210.

