Special Issue "Maximum Entropy and Bayes Theorem"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 June 2013)

Special Issue Editor

Guest Editor
Dr. Adom Giffin
Mathematics & Computer Science, Clarkson University, Potsdam, NY 13699, USA
E-Mail: agiffin@clarkson.edu
Interests: Bayesian inference and maximum entropy methods; data fusion; multi-scale dynamical systems; machine learning; probability theory

Special Issue Information

Dear Colleagues,

The probabilistic version of entropy originated with Boltzmann. Much later, Shannon (1948) created his information “entropy” using an axiomatic approach. Finally, it was Jaynes (1957) who put the two together and showed that by maximizing Shannon’s entropy subject to specific physical information in the form of expectation values, one recovers Boltzmann’s and, more importantly, Gibbs’s statistical mechanics.
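
In symbols (the standard textbook statement, not specific to any contribution in this issue), Jaynes’s prescription is to maximize

    S[p] = -\sum_i p_i \ln p_i

subject to \sum_i p_i = 1 and \sum_i p_i E_i = \langle E \rangle, which yields the Gibbs distribution

    p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},

with the Lagrange multiplier \beta fixed by the energy constraint.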

Before Shannon, Jeffreys (1939), among others, revitalized the Bayesian (Laplacian) view of probability and examined the ratio of a posterior to a prior, specifically the log of this ratio. He used it as a measure for comparing distributions, with a focus on finding an invariant prior. The concept was expanded by others, such as Kullback and Leibler (1951), Shore and Johnson (1980), Skilling (1988) and Caticha and Giffin (2006), who used the mean of this log ratio as a measure of the “distance” between distributions. This mean has a form similar to Shannon’s entropy and is sometimes called the cross entropy or relative entropy.
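
In modern notation, for a distribution p and a prior q this mean is the relative entropy (Kullback–Leibler divergence)

    D(p \| q) = \sum_i p_i \ln \frac{p_i}{q_i},

which vanishes exactly when p = q and, when q is uniform over N outcomes, reduces to \ln N minus Shannon’s entropy, so minimizing D generalizes maximizing S.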

For the purposes of inference, the goal of both Bayes Theorem and Maximum Entropy is to determine a probability distribution from given information. However, even though the two are indubitably linked, the use of each is controversial, particularly Maximum Entropy. In this volume we seek contributions that will shed light on their use, their connections and their controversies. The scope of contributions is broad, including philosophical discussions, theoretical derivations and explicit, practical solutions to current problems. Such problems may come from, but are by no means limited to, fields such as data mining, dynamical systems, ecology, economics, image analysis and computer vision, machine learning, signal analysis, physics and other scientific areas of inquiry. Papers demonstrating that Bayesian or Maximum Entropy methods yield unique solutions that cannot be produced otherwise are especially welcome.
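
To make the shared goal concrete, the following minimal Python sketch (our own illustration with made-up numbers; it reflects no particular contribution) determines a distribution from information in both ways: Bayes Theorem combines a prior with a data likelihood, while Maximum Entropy selects the least-biased distribution consistent with an expectation value (Jaynes’s dice example).

    import numpy as np
    from scipy.optimize import brentq

    # --- Bayes Theorem: prior + likelihood -> posterior ---
    # Two hypotheses with a 50/50 prior; the P(data | H) values are made up.
    prior = np.array([0.5, 0.5])
    likelihood = np.array([0.2, 0.8])
    posterior = prior * likelihood
    posterior /= posterior.sum()       # P(H|D) proportional to P(D|H) P(H)

    # --- Maximum Entropy: expectation constraint -> distribution ---
    # Die faces x = 1..6 with observed mean 4.5 (a fair die gives 3.5).
    # Maximizing -sum(p log p) subject to sum(p) = 1 and sum(p*x) = 4.5
    # gives p_i proportional to exp(-beta*x_i); solve for the multiplier beta.
    x = np.arange(1, 7)

    def mean_for(beta):
        w = np.exp(-beta * x)
        return (w * x).sum() / w.sum()

    beta = brentq(lambda b: mean_for(b) - 4.5, -5.0, 5.0)
    p = np.exp(-beta * x)
    p /= p.sum()

    print("posterior:", posterior)     # [0.2 0.8]
    print("maxent p :", np.round(p, 4))  # tilted toward the high faces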

Dr. Adom Giffin
Guest Editor

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Keywords

  • cross entropy
  • relative entropy
  • information theory
  • inference
  • maximum entropy
  • maximum relative entropy
  • Bayes theorem
  • Bayes rule
  • Bayesian
  • machine learning
  • data mining
  • image analysis
  • computer vision
  • signal analysis
  • data fusion
  • multi-scale dynamical systems
  • distributed computing
  • probability theory

Published Papers (7 papers)

Entropy 2014, 16(2), 1047-1069; doi:10.3390/e16021047
Received: 12 September 2013; in revised form: 5 February 2014 / Accepted: 7 February 2014 / Published: 19 February 2014

Entropy 2013, 15(12), 5439-5463; doi:10.3390/e15125439
Received: 9 July 2013; in revised form: 13 November 2013 / Accepted: 3 December 2013 / Published: 9 December 2013

Entropy 2013, 15(9), 3528-3591; doi:10.3390/e15093528
Received: 28 June 2013; in revised form: 21 August 2013 / Accepted: 21 August 2013 / Published: 4 September 2013

Entropy 2013, 15(7), 2861-2873; doi:10.3390/e15072861
Received: 24 May 2013; in revised form: 12 July 2013 / Accepted: 16 July 2013 / Published: 23 July 2013

Entropy 2013, 15(4), 1375-1387; doi:10.3390/e15041375
Received: 4 February 2013; in revised form: 8 April 2013 / Accepted: 15 April 2013 / Published: 17 April 2013

Entropy 2013, 15(3), 988-998; doi:10.3390/e15030988
Received: 24 December 2012; in revised form: 18 February 2013 / Accepted: 1 March 2013 / Published: 6 March 2013

Entropy 2013, 15(3), 789-925; doi:10.3390/e15030789
Received: 21 December 2012; in revised form: 11 February 2013 / Accepted: 17 February 2013 / Published: 27 February 2013

Planned Papers

The list below contains only planned manuscripts; some of them have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Type of Paper: Philosophical Foundations
Title: Objective Bayesianism and the Maximum Entropy Principle
Authors: Juergen Landes and Jon Williamson
Affiliation: Philosophy Department, University of Kent, Canterbury, CT2 7NF, UK; E-Mail: J.Williamson@kent.ac.uk
Abstract: The standard argument for the Bayesian view that rational degrees of belief are probabilities is that they have to be in order to avoid the possibility of sure loss (the Dutch Book Argument). But this can be put another way: in order to minimise worst-case expected loss, rational degrees of belief must be probabilities. Now the goal of minimising worst-case expected loss can be used to argue for other norms of rational belief: that rational degrees of belief must be directly calibrated to known physical probabilities (explicated via the Principal Principle), and that rational degrees of belief must otherwise equivocate as far as possible between the basic possibilities under consideration (explicated via the Maximum Entropy Principle). So the standard motivation for Bayesianism motivates a strong, ‘objective’ Bayesianism that invokes the Maximum Entropy Principle. Interestingly, however, conditional probabilities play a less central part in this account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes Theorem.

Last update: 7 January 2013

Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.