
MaxEnt 2022—the 41st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (31 December 2022) | Viewed by 10193

Special Issue Editors


Prof. Dr. Ali Mohammad-Djafari
Guest Editor
Laboratoire des Signaux et Systèmes, CNRS, CentraleSupélec, Université Paris-Saclay, 3, Rue Joliot-Curie, 91192 Gif-sur-Yvette, France
Interests: inference; inverse problems; Bayesian computation; information and maximum entropy; knowledge extraction

Prof. Dr. Frank Nielsen
Guest Editor
Sony Computer Science Laboratories, Takanawa Muse Bldg., 3-14-13, Higashigotanda, Shinagawa-ku, Tokyo 141-0022, Japan
Interests: information geometry; machine learning; imaging

Dr. Martino Trassinelli
Guest Editor
Institut des NanoSciences de Paris, CNRS, Sorbonne Université, Campus Pierre et Marie Curie, 75005 Paris, France
Interests: atomic physics; surface science; conditional probability

Special Issue Information

Dear Colleagues,

This Special Issue invites contributions that use Bayesian inference and maximum entropy methods in data analysis, information processing and inverse problems across a broad range of disciplines, including astronomy and astrophysics, geophysics, medical imaging, molecular imaging and genomics, non-destructive evaluation, particle and quantum physics, physical and chemical measurement techniques, and economics and econometrics.

The specific areas of interest include, but are not limited to, the following:

  • Foundations of probability, inference, information, and entropy;
  • Bayesian physics-informed and thermodynamics-informed machine learning;
  • Machine learning tools for inverse problems;
  • Bayesian and maximum entropy methods in real-world applications;
  • Geometric statistical mechanics/physics, Lie group thermodynamics and maximum entropy densities;
  • Quantum: theory, computation, tomography and applications.

We welcome the submission of extended versions of contributions presented at MaxEnt 2022. Papers on the subject of maximum entropy and Bayesian methods are also welcome.

Prof. Dr. Ali Mohammad-Djafari
Prof. Dr. Frank Nielsen
Dr. Frédéric Barbaresco
Dr. Martino Trassinelli
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (8 papers)


Research


31 pages, 2351 KiB  
Article
The Interplay between Error, Total Variation, Alpha-Entropy and Guessing: Fano and Pinsker Direct and Reverse Inequalities
by Olivier Rioul
Entropy 2023, 25(7), 978; https://doi.org/10.3390/e25070978 - 25 Jun 2023
Cited by 1 | Viewed by 954
Abstract
Using majorization theory via “Robin Hood” elementary operations, optimal lower and upper bounds are derived on Rényi and guessing entropies with respect to either error probability (yielding reverse-Fano and Fano inequalities) or total variation distance to the uniform (yielding reverse-Pinsker and Pinsker inequalities). This gives a general picture of how the notion of randomness can be measured in many areas of computer science.
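
For readers less familiar with the quantities involved, the two entropy measures bounded in this work can be stated in their standard textbook forms (given here for orientation only, not quoted from the paper):

```latex
% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1) of p = (p_1, \dots, p_M):
H_\alpha(p) = \frac{1}{1-\alpha}\,\log \sum_{i=1}^{M} p_i^{\alpha},
\qquad H_\alpha(p) \to H(p) \ \text{(Shannon entropy) as } \alpha \to 1.
% Guessing entropy: expected number of guesses when the symbols are queried
% in non-increasing probability order p_{(1)} \ge p_{(2)} \ge \dots \ge p_{(M)}:
G(p) = \sum_{i=1}^{M} i\, p_{(i)}.
```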

11 pages, 647 KiB  
Article
Outlier-Robust Surrogate Modeling of Ion–Solid Interaction Simulations
by Roland Preuss and Udo von Toussaint
Entropy 2023, 25(4), 685; https://doi.org/10.3390/e25040685 - 19 Apr 2023
Viewed by 706
Abstract
Data for complex plasma–wall interactions require long-running and expensive computer simulations. Furthermore, the number of input parameters is large, which results in low coverage of the (physical) parameter space. The unpredictable occurrence of outliers creates a need to explore this multi-dimensional space using robust analysis tools. We restate the Gaussian process (GP) method as a Bayesian adaptive exploration method for establishing surrogate surfaces in the variables of interest. On this basis, we extend the analysis to the Student-t process (TP) method in order to improve the robustness of the result with respect to outliers. The most obvious difference between both methods shows up in the marginal likelihood for the hyperparameters of the covariance function, where the TP method features a broader marginal probability distribution in the presence of outliers. Finally, we provide first investigations with a mixture likelihood of two Gaussians within a Gaussian process ansatz for describing either outlier or non-outlier behavior. The parameters of the two Gaussians are set such that the mixture likelihood resembles the shape of a Student-t likelihood.
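
As a rough illustration of the GP/TP contrast described above (not the authors' implementation — the kernel, toy data, degrees of freedom and all names below are assumptions), one can compare the Gaussian and multivariate Student-t marginal likelihoods of the same covariance hyperparameters on data containing a single injected outlier:

```python
# Sketch: GP vs. Student-t process evidence under an outlier (illustrative only).
import numpy as np
from scipy.special import gammaln

def sq_exp_kernel(x, length_scale=1.0, sigma_f=1.0, sigma_n=1e-2):
    """Squared-exponential covariance matrix with a small noise term."""
    d = x[:, None] - x[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / length_scale)**2) + sigma_n**2 * np.eye(len(x))

def gp_log_marginal(y, K):
    """log N(y | 0, K): Gaussian-process evidence for the hyperparameters behind K."""
    n = len(y)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)

def tp_log_marginal(y, K, nu=3.0):
    """log multivariate Student-t(y | 0, K, nu): heavier tails than the Gaussian."""
    n = len(y)
    L = np.linalg.cholesky(K)
    maha = y @ np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (gammaln((nu + n) / 2) - gammaln(nu / 2)
            - 0.5 * n * np.log(nu * np.pi) - np.log(np.diag(L)).sum()
            - 0.5 * (nu + n) * np.log1p(maha / nu))

x = np.linspace(0.0, 5.0, 20)
y = np.sin(x)
y[7] += 3.0  # inject a single outlier

for ell in (0.5, 1.0, 2.0):
    K = sq_exp_kernel(x, length_scale=ell)
    print(f"l={ell:3.1f}  GP: {gp_log_marginal(y, K):8.2f}  TP: {tp_log_marginal(y, K):8.2f}")
```

The heavier tails of the Student-t evidence penalize the outlying point less severely, which is consistent with the broader hyperparameter marginal reported in the abstract.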

18 pages, 1313 KiB  
Article
Butterfly Transforms for Efficient Representation of Spatially Variant Point Spread Functions in Bayesian Imaging
by Vincent Eberle, Philipp Frank, Julia Stadler, Silvan Streit and Torsten Enßlin
Entropy 2023, 25(4), 652; https://doi.org/10.3390/e25040652 - 13 Apr 2023
Cited by 1 | Viewed by 1077
Abstract
Bayesian imaging algorithms are becoming increasingly important in, e.g., astronomy, medicine and biology. Given that many of these algorithms compute iterative solutions to high-dimensional inverse problems, the efficiency and accuracy of the instrument response representation are of high importance for the imaging process. For efficiency reasons, point spread functions, which make up a large fraction of the response functions of telescopes and microscopes, are usually assumed to be spatially invariant in a given field of view and can thus be represented by a convolution. For many instruments, this assumption does not hold and degrades the accuracy of the instrument representation. Here, we discuss the application of butterfly transforms, which are linear neural network structures whose sizes scale sub-quadratically with the number of data points. Butterfly transforms are efficient by design, since they are inspired by the structure of the Cooley–Tukey fast Fourier transform. In this work, we combine them in several ways into butterfly networks, compare the different architectures with respect to their performance and identify a representation that is suitable for the efficient representation of a synthetic spatially variant point spread function up to a 1% error. Furthermore, we show its application in a short synthetic example.
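
A minimal sketch of the butterfly structure referred to above (not the paper's code; the factor parameterization, training and all names are assumptions): for input size n = 2^m, log2(n) stages each mix pairs of entries a fixed stride apart with their own 2x2 weights, mirroring the Cooley–Tukey FFT pattern, so the parameter count grows as O(n log n) instead of O(n^2).

```python
# Sketch: applying a randomly parameterized butterfly linear operator (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def random_butterfly_factors(n):
    """One 2x2 weight matrix per (stage, pair); n must be a power of two."""
    m = int(np.log2(n))
    assert 2**m == n, "butterfly structure assumes n is a power of two"
    return [rng.standard_normal((n // 2, 2, 2)) for _ in range(m)]

def apply_butterfly(factors, x):
    """At stage s, entries i and i + 2**s (within blocks of size 2**(s+1))
    are mixed by their own 2x2 matrix, exactly as in an FFT butterfly."""
    y = x.copy()
    n = len(y)
    for s, W in enumerate(factors):
        stride = 2**s
        pair = 0
        for block in range(0, n, 2 * stride):
            for i in range(block, block + stride):
                a, b = y[i], y[i + stride]
                y[i] = W[pair, 0, 0] * a + W[pair, 0, 1] * b
                y[i + stride] = W[pair, 1, 0] * a + W[pair, 1, 1] * b
                pair += 1
    return y

n = 16
factors = random_butterfly_factors(n)
x = rng.standard_normal(n)
print(apply_butterfly(factors, x))
print("parameters:", sum(W.size for W in factors), "vs dense:", n * n)
```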

2988 KiB  
Article
Assessing Search and Unsupervised Clustering Algorithms in Nested Sampling
by Lune Maillard, Fabio Finocchi and Martino Trassinelli
Entropy 2023, 25(2), 347; https://doi.org/10.3390/e25020347 - 14 Feb 2023
Cited by 2 | Viewed by 1344 | Correction
Abstract
Nested sampling is an efficient method for calculating Bayesian evidence in data analysis and partition functions of potential energies. It is based on an exploration using a dynamical set of sampling points that evolves to higher values of the sampled function. When several maxima are present, this exploration can be a very difficult task, and different codes implement different strategies. Local maxima are generally treated separately, applying cluster recognition of the sampling points based on machine learning methods. We present here the development and implementation of different search and clustering methods in the nested_fit code. Slice sampling and a uniform search method are added to the random walk already implemented. Three new cluster recognition methods are also developed. The efficiency of the different strategies, in terms of accuracy and number of likelihood calls, is compared on a series of benchmark tests, including model comparison and a harmonic energy potential. Slice sampling proves to be the most stable and accurate search strategy. The different clustering methods present similar results but with very different computing times and scaling. Different choices of the stopping criterion of the algorithm, another critical issue of nested sampling, are also investigated with the harmonic energy potential.
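
For orientation, the basic nested sampling loop the paper builds on can be sketched as follows. This is the textbook algorithm on an assumed toy problem, not the nested_fit code: a uniform prior on [-5, 5], a standard normal likelihood, and plain rejection sampling from the prior to draw a new point above the current likelihood threshold — the very step that production codes replace with random walks, slice sampling or clustering-aware search.

```python
# Sketch: textbook nested sampling for a 1D toy problem (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
LO, HI = -5.0, 5.0  # assumed toy prior support

def log_likelihood(theta):
    # standard normal log-density
    return -0.5 * theta**2 - 0.5 * np.log(2.0 * np.pi)

def nested_sampling(n_live=100, n_iter=700):
    live = rng.uniform(LO, HI, n_live)
    live_logl = log_likelihood(live)
    log_z, log_x_prev = -np.inf, 0.0          # log-evidence, log prior volume
    for i in range(1, n_iter + 1):
        worst = int(np.argmin(live_logl))
        log_x = -i / n_live                   # X_i ~ exp(-i / n_live)
        log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))
        log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
        # replace the worst live point with a prior draw above its likelihood
        while True:
            candidate = rng.uniform(LO, HI)
            if log_likelihood(candidate) > live_logl[worst]:
                break
        live[worst], live_logl[worst] = candidate, log_likelihood(candidate)
        log_x_prev = log_x
    # contribution of the remaining live points (crude fixed stopping criterion)
    log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(live_logl))) + log_x)
    return log_z

# analytic reference: Z is about 1 / (HI - LO), since the Gaussian mass lies inside the prior
print("estimated log Z:", nested_sampling(), "expected:", -np.log(HI - LO))
```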

21 pages, 1550 KiB  
Article
Information and Agreement in the Reputation Game Simulation
by Viktoria Kainz, Céline Bœhm, Sonja Utz and Torsten Enßlin
Entropy 2022, 24(12), 1768; https://doi.org/10.3390/e24121768 - 3 Dec 2022
Cited by 1 | Viewed by 1386
Abstract
Modern communication habits are largely shaped by the extensive use of social media and other online communication platforms. The enormous amount of available data and speed with which new information arises, however, often suffices to cause misunderstandings, false conclusions, or otherwise disturbed opinion formation processes. To investigate some of these effects we use an agent-based model on gossip and reputation dynamics with 50 agents, including Bayesian knowledge updates under bounded rationality and up to the second-order theory of mind effects. Thereby, we observe the occurrence of reputation boosts from fake images, as well as the advantage of hiding one’s opinion in order to become a strong information trader. In addition, the simulations show fundamentally different mechanisms for reaching high agreement with others and becoming well-informed. Additionally, we investigate the robustness of our results with respect to different knowledge-update mechanisms and argue why it makes sense to especially emphasize the margins of distribution when judging a bounded quantity such as honesty in a reputation game simulation.

12 pages, 332 KiB  
Article
Extended Divergence on a Foliation by Deformed Probability Simplexes
by Keiko Uohashi
Entropy 2022, 24(12), 1736; https://doi.org/10.3390/e24121736 - 28 Nov 2022
Viewed by 1021
Abstract
This study considers a new decomposition of an extended divergence on a foliation by deformed probability simplexes from the information geometry perspective. In particular, we treat the case where each deformed probability simplex corresponds to a set of q-escort distributions. For the foliation, different q-parameters and the corresponding α-parameters of dualistic structures are defined on each of the various leaves. We propose the divergence decomposition theorem that guides the proximity of q-escort distributions with different q-parameters and compare the new theorem to the previous theorem of the standard divergence on a Hessian manifold with a fixed α-parameter.
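
For orientation, the q-escort distribution mentioned here is the standard escort construction of nonextensive statistics (a textbook definition, not a formula quoted from the paper): given a probability distribution p = (p_1, …, p_n) and a parameter q > 0,

```latex
P^{(q)}_i \;=\; \frac{p_i^{\,q}}{\sum_{j=1}^{n} p_j^{\,q}}, \qquad i = 1, \dots, n,
```

which reduces to p itself when q = 1.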

40 pages, 813 KiB  
Article
A Hierarchy of Probability, Fluid and Generalized Densities for the Eulerian Velocivolumetric Description of Fluid Flow, for New Families of Conservation Laws
by Robert K. Niven
Entropy 2022, 24(10), 1493; https://doi.org/10.3390/e24101493 - 19 Oct 2022
Cited by 2 | Viewed by 1438
Abstract
The Reynolds transport theorem occupies a central place in continuum mechanics, providing a generalized integral conservation equation for the transport of any conserved quantity within a fluid or material volume, which can be connected to its corresponding differential equation. Recently, a more generalized framework was presented for this theorem, enabling parametric transformations between positions on a manifold or in any generalized coordinate space, exploiting the underlying continuous multivariate (Lie) symmetries of a vector or tensor field associated with a conserved quantity. We explore the implications of this framework for fluid flow systems, based on an Eulerian velocivolumetric (position-velocity) description of fluid flow. The analysis invokes a hierarchy of five probability density functions, which by convolution are used to define five fluid densities and generalized densities relevant to this description. We derive 11 formulations of the generalized Reynolds transport theorem for different choices of the coordinate space, parameter space and density, only the first of which is commonly known. These are used to generate a table of integral and differential conservation laws applicable to each formulation, for eight important conserved quantities (fluid mass, species mass, linear momentum, angular momentum, energy, charge, entropy and probability). The findings substantially expand the set of conservation laws for the analysis of fluid flow and dynamical systems. Full article
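
For context, the classical form of the Reynolds transport theorem that the paper generalizes can be written as follows (the standard statement, not one of the 11 formulations derived in the paper): for a quantity with volumetric density β(x, t) carried by a fluid volume V(t) whose boundary ∂V(t) moves with the fluid velocity u,

```latex
\frac{\mathrm{d}}{\mathrm{d}t}\int_{V(t)} \beta \,\mathrm{d}V
  \;=\; \int_{V(t)} \frac{\partial \beta}{\partial t}\,\mathrm{d}V
  \;+\; \oint_{\partial V(t)} \beta\,(\mathbf{u}\cdot\mathbf{n})\,\mathrm{d}A .
```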

Other


2 pages, 185 KiB  
Correction
Correction: Maillard et al. Assessing Search and Unsupervised Clustering Algorithms in Nested Sampling. Entropy 2023, 25, 347
by Lune Maillard, Fabio Finocchi and Martino Trassinelli
Entropy 2024, 26(1), 55; https://doi.org/10.3390/e26010055 - 9 Jan 2024
Viewed by 519
Abstract
There was an error in the original publication [...]
