Special Issue "Maximum Entropy and Bayes Theorem"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: closed (30 June 2013)
The probabilistic version of entropy originated with Boltzmann. Much later, Shannon (1948) constructed his information “entropy” from an axiomatic approach. Finally, Jaynes (1957) put the two together, showing that maximizing Shannon’s entropy subject to specific information from physics, in the form of expectation-value constraints, yields Boltzmann’s and, more importantly, Gibbs’ statistical mechanics.
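As a minimal sketch of Jaynes’ argument (the symbols $p_i$, $E_i$, $\lambda$ and $Z$ are introduced here for illustration and do not appear in the text above): maximizing

$$ S = -\sum_i p_i \ln p_i $$

subject to normalization $\sum_i p_i = 1$ and an expectation-value constraint $\sum_i p_i E_i = \langle E \rangle$ yields, via Lagrange multipliers, the Gibbs distribution

$$ p_i = \frac{e^{-\lambda E_i}}{Z}, \qquad Z = \sum_i e^{-\lambda E_i}, $$

where the multiplier $\lambda$ is fixed by the constraint (in thermodynamic terms, $\lambda = 1/k_B T$).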
Before Shannon, Jeffreys (1939), among others, revitalized the Bayesian (Laplacian) view of probabilities and examined the ratio of a posterior to a prior, specifically the log of this ratio. He used it as a measure for comparing distributions, with a focus on finding an invariant prior. This concept was expanded by others, such as Kullback and Leibler (1951), Shore and Johnson (1980), Skilling (1988) and Caticha and Giffin (2006), who used the mean of this log-ratio as a measure of the “distance” between distributions. This mean has a form similar to Shannon’s entropy and is sometimes called the cross entropy or relative entropy.
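For reference, the measure being described, the mean of the log-ratio of a posterior $p$ to a prior $q$, is what is now commonly written as the relative entropy (Kullback–Leibler divergence); the notation here is standard but not taken from the text above:

$$ D(p \,\|\, q) = \sum_i p_i \ln \frac{p_i}{q_i}. $$

It vanishes if and only if $p = q$ and is otherwise positive, which is what justifies reading it as a “distance” between distributions.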
For the purposes of inference, the goal of both Bayes Theorem and Maximum Entropy is to determine a probability distribution from given information. However, even though the two are indubitably linked, the use of each is controversial, particularly Maximum Entropy. In this volume we seek contributions that will shed light on their use, their connections and their controversies. The scope of contributions is very broad, encompassing philosophical discussions, theoretical derivations and explicit, practical solutions to current problems. Such problems may come from, but are by no means limited to, fields such as data mining, dynamical systems, ecology, economics, image analysis and computer vision, machine learning, signal analysis, physics and other areas of scientific inquiry. Papers demonstrating that Bayesian or Maximum Entropy methods yield unique solutions that cannot be produced otherwise are especially welcome.
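For concreteness, the Bayesian half of this pairing is the familiar update rule (with $\theta$ denoting hypotheses and $D$ data, notation introduced here for illustration):

$$ p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}, $$

which, like Maximum Entropy, takes a prior state of knowledge together with new information and returns an updated distribution.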
Dr. Adom Giffin
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- cross entropy
- relative entropy
- information theory
- maximum entropy
- maximum relative entropy
- Bayes theorem
- Bayes rule
- machine learning
- data mining
- image analysis
- computer vision
- signal analysis
- data fusion
- multi-scale dynamical systems
- distributed computing
- probability theory