Special Issue "Selected Papers from MaxEnt 2016"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 November 2016).

Special Issue Editor

Prof. Dr. Geert Verdoolaege
Guest Editor
1. Research Unit Nuclear Fusion, Department of Applied Physics, Ghent University, Sint-Pietersnieuwstraat 41, B-9000 Ghent, Belgium
2. Laboratory for Plasma Physics, Royal Military Academy (ERM/KMS), Renaissancelaan 30 Avenue de la Renaissance, B-1000 Brussels, Belgium
Interests: probability theory; Bayesian inference; machine learning; information geometry; differential geometry; nuclear fusion; plasma physics; plasma turbulence; continuum mechanics; statistical mechanics

Special Issue Information

Dear Colleagues,

This Special Issue collects a limited number of selected papers presented at the 36th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2016), held at Ghent University, Belgium, in July 2016. Website: http://www.maxent2016.org/.

Prof. Dr. Geert Verdoolaege
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (5 papers)

Research

Open Access Article
Deriving Proper Uniform Priors for Regression Coefficients, Parts I, II, and III
Entropy 2017, 19(6), 250; https://doi.org/10.3390/e19060250 - 30 May 2017
Cited by 1
Abstract
It is a relatively well-known fact that in problems of Bayesian model selection, improper priors should, in general, be avoided. In this paper we derive and discuss a collection of four proper uniform priors which lie on an ascending scale of informativeness. It turns out that these priors lead to evidences that are closely associated with the implied evidence of the Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC). All the discussed evidences are then used in two small Monte Carlo studies, wherein, for different sample sizes and noise levels, the evidences are used to select between competing C-spline regression models. For illustrative purposes, an outline is also given of how to construct simple trivariate C-spline regression models. As for the length of this paper, one half consists of theory and derivations, while the other half consists of graphs and outputs of the two Monte Carlo studies.
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)
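
For readers unfamiliar with the two criteria named in the abstract, the standard textbook definitions (given here as background; the paper derives its own evidence expressions) are, in LaTeX notation,

    \mathrm{BIC} = k \ln n - 2 \ln \hat{L}, \qquad \mathrm{AIC} = 2k - 2 \ln \hat{L},

where k is the number of free model parameters, n the sample size and \hat{L} the maximized likelihood; in both cases the model with the smallest criterion value is preferred.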

Open Access Article
Paradigms of Cognition
Entropy 2017, 19(4), 143; https://doi.org/10.3390/e19040143 - 27 Mar 2017
Abstract
An abstract, quantitative theory which connects elements of information, key ingredients in the cognitive process, is developed. Seemingly unrelated results are thereby unified. As an indication of this, consider results in classical probabilistic information theory involving information projections and so-called Pythagorean inequalities. These bear a certain resemblance to classical results in geometry carrying Pythagoras' name. By appealing to the abstract theory presented here, one has a common point of reference for these results. In fact, the new theory provides a general framework for the treatment of a multitude of global optimization problems across a range of disciplines such as geometry, statistics and statistical physics. Several applications are given; among them, an "explanation" of Tsallis entropy is suggested. For this, as well as for the general development of the underlying abstract theory, emphasis is placed on interpretations and associated philosophical considerations. Technically, game theory is the key tool.
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)
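
As background to the "explanation" mentioned in the abstract (this is the standard definition, not a formula quoted from the paper), the Tsallis entropy of a discrete distribution p = (p_1, ..., p_n) with parameter q \neq 1 is

    S_q(p) = \frac{1}{q - 1} \left( 1 - \sum_{i=1}^{n} p_i^{q} \right),

which recovers the Boltzmann-Gibbs-Shannon entropy -\sum_i p_i \ln p_i in the limit q \to 1.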

Open Access Article
Systematic Analysis of the Non-Extensive Statistical Approach in High Energy Particle Collisions—Experiment vs. Theory
Entropy 2017, 19(3), 88; https://doi.org/10.3390/e19030088 - 24 Feb 2017
Cited by 17
Abstract
The analysis of high-energy particle collisions is an excellent testbed for the non-extensive statistical approach. In these reactions we are far from the thermodynamical limit. In small colliding systems, such as electron-positron or nuclear collisions, the number of particles is several orders of magnitude smaller than the Avogadro number; therefore, finite-size and fluctuation effects strongly influence the final-state one-particle energy distributions. Because of its overly simple characterization, the Boltzmann–Gibbs thermodynamical description of the identified hadron spectra is insufficient. These spectra can instead be described very well with Tsallis–Pareto distributions, derived from non-extensive thermodynamics. Using the q-entropy formula, we interpret the microscopic physics in terms of the Tsallis q and T parameters. In this paper we survey these parameters, analyzing identified hadron spectra from recent years across a wide center-of-mass energy range. We demonstrate that the fitted Tsallis parameters depend on the center-of-mass energy and on the particle species (mass). Our findings are described well by a QCD (Quantum Chromodynamics) inspired parton evolution ansatz. Based on this comprehensive study, and apart from the evolution, both the mesonic and baryonic components are found to be non-extensive (q > 1), alongside the mass-ordered hierarchy observed in the parameter T. We also study and compare in detail the parameters obtained from the PYTHIA8 Monte Carlo generator, perturbative QCD, and quark coalescence models.
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)
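
For reference, a commonly used parametrization of the Tsallis–Pareto energy distribution (conventions for the exponent vary in the literature; this form is given as background rather than quoted from the paper) is

    f(E) \propto \left[ 1 + \frac{(q - 1) E}{T} \right]^{-\frac{1}{q - 1}},

which reduces to the Boltzmann–Gibbs exponential e^{-E/T} in the extensive limit q \to 1, so that fitted values q > 1 quantify the departure from standard thermodynamics.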

Open Access Article
Sequential Batch Design for Gaussian Processes Employing Marginalization
Entropy 2017, 19(2), 84; https://doi.org/10.3390/e19020084 - 21 Feb 2017
Cited by 1
Abstract
Within the Bayesian framework, we utilize Gaussian processes for parametric studies of long-running computer codes. Since the simulations are expensive, it is necessary to exploit the computational budget in the best possible manner. Employing the sum over variances, which indicates the quality of the fit, as the utility function, we establish an optimized and automated sequential parameter selection procedure. However, it is often also desirable to utilize the parallel running capabilities of present computer technology and abandon sequential parameter selection in favor of a faster overall turn-around time (wall-clock time). This paper proposes to achieve this by marginalizing over the expected outcomes at optimized test points in order to set up a pool of starting values for batch execution. For a one-dimensional test case, the numerical results are validated against the analytical solution. Finally, a systematic convergence study demonstrates the advantage of the optimized approach over randomly chosen parameter settings.
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)
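
The core of the sequential strategy described in the abstract (pick the next simulation input where the Gaussian-process predictive variance is largest) can be sketched in a few lines of Python. The sketch below uses scikit-learn and a hypothetical stand-in expensive_code for the simulation; it illustrates only the sequential, variance-driven selection, not the authors' batch marginalization step.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_code(x):
        # Hypothetical stand-in for a long-running simulation.
        return np.sin(3.0 * x) + 0.1 * np.random.randn()

    # Initial design and a dense candidate pool on [0, 1].
    X = np.array([[0.1], [0.5], [0.9]])
    y = np.array([expensive_code(x[0]) for x in X])
    candidates = np.linspace(0.0, 1.0, 200).reshape(-1, 1)

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)

    for _ in range(10):                      # sequential design loop
        gp.fit(X, y)
        _, std = gp.predict(candidates, return_std=True)
        x_next = candidates[np.argmax(std)]  # point of largest predictive variance
        X = np.vstack([X, x_next])
        y = np.append(y, expensive_code(x_next[0]))

In a batch setting, one would instead propose several points per iteration, which is where the marginalization over expected outcomes described in the abstract comes in.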

Open Access Article
Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem
Entropy 2017, 19(2), 48; https://doi.org/10.3390/e19020048 - 24 Jan 2017
Cited by 15
Abstract
We start with a clear distinction between Shannon's Measure of Information (SMI) and the thermodynamic entropy. The first is defined for any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function as defined by Boltzmann is an SMI but not an entropy. Therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)
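
The distinction drawn in the abstract rests on the definition of the SMI, which for any discrete probability distribution p = (p_1, ..., p_N) reads (standard definition, consistent with the paper's usage)

    \mathrm{SMI}(p) = -\sum_{i=1}^{N} p_i \log_2 p_i ,

and is meaningful for any distribution whatsoever; thermodynamic entropy arises (up to a constant factor) only when this measure is applied to the specific equilibrium distribution of particle locations and momenta, with the two corrections the abstract mentions.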
