Special Issue "MaxEnt 2017 - The 37th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 November 2017).

Special Issue Editors

Dr. Adriano Polpo
Guest Editor
Federal University of Sao Carlos, São Carlos, São Paulo, Brazil
Interests: Bayesian inference; foundations of statistics; significance tests; reliability and survival analysis; model selection; biostatistics
Prof. Carlos Alberto De Bragança Pereira
Guest Editor
Federal University of Mato Grosso do Sul, Campo Grande, MS, Brazil and University of Sao Paulo, Sao Paulo, SP, Brazil
Interests: Bayesian statistics; controversies and paradoxes in probability and statistics; Bayesian reliability; Bayesian analysis of discrete data (BADD); applied statistics
Dr. Marcio Alves Diniz

Assistant Guest Editor
Federal University of Sao Carlos, São Carlos, São Paulo, Brazil
Dr. Rafael Bassi Stern

Assistant Guest Editor
Federal University of Sao Carlos, São Carlos, São Paulo, Brazil

Special Issue Information

Dear Colleagues,

The MaxEnt workshops have explored the use of Bayesian and Maximum Entropy methods in scientific and engineering applications. This Special Issue invites contributions on all aspects of probabilistic inference, including novel techniques and applications, and work that sheds new light on the foundations of inference. In previous workshops, areas of application have included astronomy and astrophysics, chemistry, communications theory, cosmology, climate studies, earth science, fluid mechanics, genetics, geophysics, machine learning, material science, medical imaging, nanoscience, source separation, thermodynamics (equilibrium and non-equilibrium), particle physics, plasma physics, quantum mechanics, robotics, and the social sciences. Bayesian computational techniques such as Markov chain Monte Carlo sampling have been regular topics, as have approximate inference methods. Foundational issues involving probability theory and information theory, and the novel application of inference to illuminate the foundations of physical theories, have also been of keen interest.

The Special Issue is in honor of Prof. Julio Stern. He has long worked with Maximum Entropy and Bayesian methods and has encouraged many people to work on this subject. His contributions are important to the MaxEnt scientific group, as well as to the Brazilian scientific community.

Papers from the MaxEnt 2017 Workshop are welcome, but the issue is not limited to them; papers on Maximum Entropy and Bayesian methods in general are also welcome.

Prof. Dr. Adriano Polpo
Prof. Dr. Carlos Alberto de Bragança Pereira
Guest Editors

Dr. Marcio Alves Diniz
Dr. Rafael Bassi Stern
Assistant Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (10 papers)


Research

Open Access Article
Global Optimization Employing Gaussian Process-Based Bayesian Surrogates
Entropy 2018, 20(3), 201; https://doi.org/10.3390/e20030201 - 16 Mar 2018
Cited by 4
Abstract
The simulation of complex physics models may lead to enormous computer running times. Since the simulations are expensive it is necessary to exploit the computational budget in the best possible manner. If for a few input parameter settings an output data set has been acquired, one could be interested in taking these data as a basis for finding an extremum and possibly an input parameter set for further computer simulations to determine it—a task which belongs to the realm of global optimization. Within the Bayesian framework we utilize Gaussian processes for the creation of a surrogate model function adjusted self-consistently via hyperparameters to represent the data. Although the probability distribution of the hyperparameters may be widely spread over phase space, we make the assumption that only the use of their expectation values is sufficient. While this shortcut facilitates a quickly accessible surrogate, it is somewhat justified by the fact that we are not interested in a full representation of the model by the surrogate but to reveal its maximum. To accomplish this the surrogate is fed to a utility function whose extremum determines the new parameter set for the next data point to obtain. Moreover, we propose to alternate between two utility functions—expected improvement and maximum variance—in order to avoid the drawbacks of each. Subsequent data points are drawn from the model function until the procedure either remains in the points found or the surrogate model does not change with the iteration. The procedure is applied to mock data in one and two dimensions in order to demonstrate proof of principle of the proposed approach.
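
As a rough illustration of the approach described in the abstract, here is a minimal NumPy/SciPy sketch (not the authors' code): a Gaussian-process surrogate with a fixed squared-exponential kernel, alternating between the expected-improvement and maximum-variance utilities on a mock 1-D objective. The kernel hyperparameters are fixed by hand rather than adjusted self-consistently, and the objective function and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length=0.3, var=1.0):
    # Squared-exponential covariance between two sets of 1-D points.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    # Standard GP regression: posterior mean and variance on a test grid.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.diag(Kss) - np.sum(Ks * v, axis=0)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, y_best):
    # EI utility for maximization.
    s = np.sqrt(var)
    z = (mu - y_best) / s
    return (mu - y_best) * norm.cdf(z) + s * norm.pdf(z)

# Mock expensive objective; its maximum is near x = -0.36.
f = lambda x: -np.sin(3 * x) - x ** 2 + 0.7 * x

rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 2, 4)
y_train = f(x_train)
grid = np.linspace(-1, 2, 400)

for it in range(15):
    mu, var = gp_posterior(x_train, y_train, grid)
    # Alternate utilities: EI on even iterations, maximum variance on odd ones.
    util = expected_improvement(mu, var, y_train.max()) if it % 2 == 0 else var
    x_next = grid[np.argmax(util)]
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, f(x_next))

print(x_train[np.argmax(y_train)], y_train.max())
```

The alternation explores globally (maximum variance) while refining promising regions (EI), which is the drawback-avoiding idea the abstract describes.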

Open Access Article
Estimating Multivariate Discrete Distributions Using Bernstein Copulas
Entropy 2018, 20(3), 194; https://doi.org/10.3390/e20030194 - 14 Mar 2018
Cited by 1
Abstract
Measuring the dependence between random variables is one of the most fundamental problems in statistics, and therefore, determining the joint distribution of the relevant variables is crucial. Copulas have recently become an important tool for properly inferring the joint distribution of the variables of interest. Although many studies have addressed the case of continuous variables, few studies have focused on treating discrete variables. This paper presents a nonparametric approach to the estimation of joint discrete distributions with bounded support using copulas and Bernstein polynomials. We present an application to real obsessive-compulsive disorder data.
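
The univariate building block behind a Bernstein copula estimate can be sketched as follows (an illustrative NumPy/SciPy snippet, not the paper's estimator): an empirical CDF on [0, 1] is smoothed by the Bernstein operator F_m(u) = sum_k F(k/m) C(m,k) u^k (1-u)^(m-k). The data and the degree m are made up; the paper applies this idea to the copula of a multivariate discrete distribution.

```python
import numpy as np
from scipy.stats import binom

def bernstein_cdf(u, ecdf_vals, m):
    # Bernstein smoothing of a CDF on [0, 1]:
    # F_m(u) = sum_k F(k/m) * C(m, k) * u^k * (1 - u)^(m - k).
    k = np.arange(m + 1)
    weights = binom.pmf(k, m, u)      # Bernstein basis evaluated at u
    return np.sum(ecdf_vals * weights)

# Empirical CDF of a discrete variable rescaled to [0, 1].
rng = np.random.default_rng(1)
data = rng.integers(0, 5, 200) / 4.0  # support {0, 0.25, 0.5, 0.75, 1}
m = 20
grid = np.arange(m + 1) / m
ecdf_vals = np.array([(data <= g).mean() for g in grid])

print(bernstein_cdf(0.5, ecdf_vals, m))
```

Because the Bernstein operator preserves monotonicity, the smoothed function is again a valid CDF, which is what makes it attractive for copula estimation with discrete margins.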

Open Access Article
Categorical Data Analysis Using a Skewed Weibull Regression Model
Entropy 2018, 20(3), 176; https://doi.org/10.3390/e20030176 - 07 Mar 2018
Cited by 2
Abstract
In this paper, we present a Weibull link (skewed) model for categorical response data arising from binomial as well as multinomial models. We show that, for such types of categorical data, the most commonly used models (logit, probit and complementary log–log) can be obtained as limiting cases. We further compare the proposed model with some other asymmetrical models. The Bayesian as well as frequentist estimation procedures for binomial and multinomial data responses are presented in detail. Two datasets are analyzed to show the efficiency of the proposed model.

Open Access Article
Feature Selection based on the Local Lift Dependence Scale
Entropy 2018, 20(2), 97; https://doi.org/10.3390/e20020097 - 30 Jan 2018
Cited by 1
Abstract
This paper uses a classical approach to feature selection: minimization of a cost function applied on estimated joint distributions. However, in this new formulation, the optimization search space is extended. The original search space is the Boolean lattice of features sets (BLFS), while the extended one is a collection of Boolean lattices of ordered pairs (CBLOP), that is (features, associated value), indexed by the elements of the BLFS. In this approach, we may not only select the features that are most related to a variable Y, but also select the values of the features that most influence the variable or that are most prone to have a specific value of Y. A local formulation of Shannon’s mutual information, which generalizes Shannon’s original definition, is applied on a CBLOP to generate a multiple resolution scale for characterizing variable dependence, the Local Lift Dependence Scale (LLDS). The main contribution of this paper is to define and apply the LLDS to analyse local properties of joint distributions that are neglected by Shannon’s classical global measure in order to select features. This approach is applied to select features based on the dependence between: i—the performance of students on university entrance exams and on courses of their first semester in the university; ii—a congressional representative’s party and their votes on different matters; iii—the cover type of terrains and several terrain properties.
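
The idea of a pointwise (local) version of mutual information can be illustrated with a small sketch (the joint table is made up, and this is generic pointwise-lift code, not the paper's LLDS implementation). The lift P(x, y) / (P(x) P(y)) measures dependence cell by cell, and the P-weighted sum of its logarithm recovers Shannon's global mutual information.

```python
import numpy as np

def local_lift(joint):
    # Pointwise lift P(x, y) / (P(x) P(y)); values above 1 mark locally
    # positively associated cells, values below 1 negatively associated ones.
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return joint / (px * py)

# Toy joint distribution of a feature X (rows) and a target Y (columns).
joint = np.array([[0.30, 0.05],
                  [0.05, 0.30],
                  [0.15, 0.15]])
lift = local_lift(joint)
# Weighted log-lift summed over cells = Shannon mutual information (nats).
mi = float((joint * np.log(lift)).sum())
print(lift.round(2))
print(mi)
```

Note how the third row has lift exactly 1 in both cells: that feature value carries no information about Y even though the feature as a whole does, which is the kind of local structure a global measure averages away.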

Open Access Article
A Sequential Algorithm for Signal Segmentation
Entropy 2018, 20(1), 55; https://doi.org/10.3390/e20010055 - 12 Jan 2018
Cited by 4
Abstract
The problem of event detection in general noisy signals arises in many applications; usually, either a functional form of the event is available, or a previous annotated sample with instances of the event that can be used to train a classification algorithm. There are situations, however, where neither functional forms nor annotated samples are available; then, it is necessary to apply other strategies to separate and characterize events. In this work, we analyze 15-min samples of an acoustic signal, and are interested in separating sections, or segments, of the signal which are likely to contain significant events. For that, we apply a sequential algorithm with the only assumption that an event alters the energy of the signal. The algorithm is entirely based on Bayesian methods.
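
The energy-change assumption can be illustrated with a deliberately simplified sketch (the published algorithm is fully Bayesian and sequential; this toy version merely thresholds window energies, and the window size, threshold, and synthetic signal are all made-up assumptions):

```python
import numpy as np

def segment_by_energy(signal, win=100, z_thresh=3.0):
    # Split the signal into fixed windows, compute each window's energy,
    # and flag windows whose energy deviates strongly from the baseline.
    n_win = len(signal) // win
    energy = np.array([np.sum(signal[i * win:(i + 1) * win] ** 2)
                       for i in range(n_win)])
    baseline, spread = np.median(energy), energy.std()
    return np.where(energy > baseline + z_thresh * spread)[0]

# Synthetic noise with one louder "event" in the middle.
rng = np.random.default_rng(2)
sig = rng.normal(0, 1.0, 5000)
sig[2000:2300] += rng.normal(0, 4.0, 300)  # the event raises local energy

events = segment_by_energy(sig)
print(events)
```

A Bayesian version would replace the ad hoc threshold with a posterior over change points, but the detected windows land on the high-energy stretch either way.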

Open Access Article
Classical-Equivalent Bayesian Portfolio Optimization for Electricity Generation Planning
Entropy 2018, 20(1), 42; https://doi.org/10.3390/e20010042 - 10 Jan 2018
Cited by 3
Abstract
There are several electricity generation technologies based on different sources such as wind, biomass, gas, coal, and so on. The consideration of the uncertainties associated with the future costs of such technologies is crucial for planning purposes. In the literature, the allocation of resources in the available technologies has been solved as a mean-variance optimization problem assuming knowledge of the expected values and the covariance matrix of the costs. However, in practice, they are not exactly known parameters. Consequently, the obtained optimal allocations from the mean-variance optimization are not robust to possible estimation errors of such parameters. Additionally, it is usual to have electricity generation technology specialists participating in the planning processes and, obviously, the consideration of useful prior information based on their previous experience is of utmost importance. The Bayesian models consider not only the uncertainty in the parameters, but also the prior information from the specialists. In this paper, we introduce the classical-equivalent Bayesian mean-variance optimization to solve the electricity generation planning problem using both improper and proper prior distributions for the parameters. In order to illustrate our approach, we present an application comparing the classical-equivalent Bayesian with the naive mean-variance optimal portfolios.
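
The naive mean-variance step that the paper takes as its baseline has a closed form, sketched below with illustrative NumPy code (the cost vector, covariance matrix, and risk aversion are made-up numbers; the paper's contribution is to replace these plug-in parameters with classical-equivalent Bayesian estimates):

```python
import numpy as np

def mean_variance_weights(mu, sigma, gamma=5.0):
    # Maximize mu'w - (gamma/2) w' Sigma w  subject to  sum(w) = 1.
    # Closed form: w = Sigma^{-1}(mu - eta*1) / gamma, with the Lagrange
    # multiplier eta fixed by the budget constraint.
    ones = np.ones(len(mu))
    s_inv_mu = np.linalg.solve(sigma, mu)
    s_inv_1 = np.linalg.solve(sigma, ones)
    eta = (ones @ s_inv_mu - gamma) / (ones @ s_inv_1)
    return (s_inv_mu - eta * s_inv_1) / gamma

# Illustrative expected generation costs (negated, so lower cost means
# higher "return") and cost covariance for three hypothetical technologies.
mu = np.array([-0.08, -0.05, -0.06])
sigma = np.array([[0.040, 0.006, 0.004],
                  [0.006, 0.010, 0.002],
                  [0.004, 0.002, 0.023]])
w = mean_variance_weights(mu, sigma)
print(w, w.sum())
```

The sensitivity of w to small changes in mu and sigma is exactly the robustness problem the abstract describes: point estimates plugged into this formula can produce very different allocations under estimation error.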

Open Access Editor’s Choice Article
Exact Renormalization Groups As a Form of Entropic Dynamics
Entropy 2018, 20(1), 25; https://doi.org/10.3390/e20010025 - 04 Jan 2018
Cited by 4
Abstract
The Renormalization Group (RG) is a set of methods that have been instrumental in tackling problems involving an infinite number of degrees of freedom, such as, for example, in quantum field theory and critical phenomena. What all these methods have in common—which is what explains their success—is that they allow a systematic search for those degrees of freedom that happen to be relevant to the phenomena in question. In the standard approaches the RG transformations are implemented by either coarse graining or through a change of variables. When these transformations are infinitesimal, the formalism can be described as a continuous dynamical flow in a fictitious time parameter. It is generally the case that these exact RG equations are functional diffusion equations. In this paper we show that the exact RG equations can be derived using entropic methods. The RG flow is then described as a form of entropic dynamics of field configurations. Although equivalent to other versions of the RG, in this approach the RG transformations receive a purely inferential interpretation that establishes a clear link to information theory.
Open Access Article
L1-Minimization Algorithm for Bayesian Online Compressed Sensing
Entropy 2017, 19(12), 667; https://doi.org/10.3390/e19120667 - 05 Dec 2017
Cited by 3
Abstract
In this work, we propose a Bayesian online reconstruction algorithm for sparse signals based on Compressed Sensing and inspired by L1-regularization schemes. A previous work has introduced a mean-field approximation for the Bayesian online algorithm and has shown that it is possible to saturate the offline performance in the presence of Gaussian measurement noise when the signal generating distribution is known. Here, we build on these results and show that reconstruction is possible even if prior knowledge about the generation of the signal is limited, by introduction of a Laplace prior and of an extra Kullback–Leibler divergence minimization step for hyper-parameter learning.
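
The offline L1 baseline that inspires the paper's online algorithm can be sketched with plain ISTA (iterative soft thresholding) on a toy compressed-sensing problem. This is generic Lasso code under made-up dimensions and noise levels, not the authors' mean-field online method; the Laplace prior corresponds to the L1 penalty in the MAP sense.

```python
import numpy as np

def ista(A, y, lam=0.02, n_iter=500):
    # Iterative soft-thresholding for min_x 0.5*||y - Ax||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)               # gradient of the quadratic term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(3)
n, m, k = 200, 80, 5                        # signal dim, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random Gaussian sensing matrix
y = A @ x_true + rng.normal(0, 0.01, m)     # noisy compressed measurements

x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

With m = 80 measurements of an n = 200 signal with only k = 5 nonzeros, the L1 estimate recovers the signal to small relative error; the paper's online algorithm aims to approach this offline performance while processing measurements one at a time.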

Open Access Article
EXONEST: The Bayesian Exoplanetary Explorer
Entropy 2017, 19(10), 559; https://doi.org/10.3390/e19100559 - 20 Oct 2017
Cited by 6
Abstract
The fields of astronomy and astrophysics are currently engaged in an unprecedented era of discovery as recent missions have revealed thousands of exoplanets orbiting other stars. While the Kepler Space Telescope mission has enabled most of these exoplanets to be detected by identifying transiting events, exoplanets often exhibit additional photometric effects that can be used to improve the characterization of exoplanets. The EXONEST Exoplanetary Explorer is a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires) exoplanet mission data. We discuss the EXONEST software package and describe how it accommodates plug-and-play models of exoplanet-associated photometric effects for the purpose of exoplanet detection, characterization and scientific hypothesis testing. The current suite of models allows for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. We discuss our new efforts to expand the capabilities of the software to include more subtle photometric effects involving reflected and refracted light. We discuss the EXONEST inference engine design and introduce our plans to port the current MATLAB-based EXONEST software package over to the next generation Exoplanetary Explorer, which will be a Python-based open source project with the capability to employ third-party plug-and-play models of exoplanet-related photometric effects.

Other

Open Access Meeting Report
Overview of the 37th MaxEnt
Entropy 2018, 20(9), 694; https://doi.org/10.3390/e20090694 - 11 Sep 2018
Abstract
The 37th edition of MaxEnt was held in Brazil, hosting several distinguished researchers and students. The workshop offered four tutorials, nine invited talks, twenty-four oral presentations and twenty-seven poster presentations. All submissions received their first choice between oral and poster presentations. The event held a celebration of Julio Stern’s 60th birthday and awarded two prizes to young researchers. As customary, the workshop had one free afternoon, in which participants visited the city’s surroundings and experienced Brazilian food and traditions.