Phys. Sci. Forum, Volume 3 (2021): MaxEnt 2021

The 40th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering

Online | 4–9 July 2021

Volume Editors:
Wolfgang von der Linden, Graz University of Technology, Austria
Sascha Ranftl, Graz University of Technology, Austria

ISBN 978-3-0365-3200-4 (Hbk); ISBN 978-3-0365-3201-1 (PDF)

Number of Papers: 13

Printed Edition Available!

Cover Story: This volume collects the ideas presented and discussed at MaxEnt 2021. Skilling and Knuth seek to rebuild the foundations of quantum mechanics from probability theory, Caticha competes in [...]

8 pages, 355 KiB  
Proceeding Paper
Legendre Transformation and Information Geometry for the Maximum Entropy Theory of Ecology
by Pedro Pessoa
Phys. Sci. Forum 2021, 3(1), 1; https://doi.org/10.3390/psf2021003001 - 3 Nov 2021
Viewed by 1861
Abstract
Here I investigate some mathematical aspects of the maximum entropy theory of ecology (METE). In particular, I address the geometric structure with which information geometry endows METE. As novel results, the macrostate entropy is calculated analytically as the Legendre transformation of the log-normalizer in METE; this in turn allows for the calculation of the metric terms of the information geometry arising from METE and, consequently, of the covariance matrix between METE variables.
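For orientation, the standard exponential-family relations behind this statement can be written compactly (using a generic MaxEnt convention p ∝ exp(−Σ_k λ_k f_k); the specific METE variables and constraints are not reproduced here):

\[
S(\langle f\rangle) \;=\; \log Z(\lambda) + \sum_k \lambda_k \langle f_k\rangle,
\qquad
\langle f_k\rangle \;=\; -\frac{\partial \log Z}{\partial \lambda_k},
\qquad
g_{ij} \;=\; \frac{\partial^2 \log Z}{\partial \lambda_i\,\partial \lambda_j} \;=\; \mathrm{Cov}(f_i, f_j),
\]

so the macrostate entropy S is the Legendre transform of the log-normalizer log Z, and the information-geometric metric g coincides with the covariance matrix of the constrained quantities.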

10 pages, 474 KiB  
Proceeding Paper
A Weakly Informative Prior for Resonance Frequencies
by Marnix Van Soom and Bart de Boer
Phys. Sci. Forum 2021, 3(1), 2; https://doi.org/10.3390/psf2021003002 - 4 Nov 2021
Viewed by 1458
Abstract
We derive a weakly informative prior for a set of ordered resonance frequencies from Jaynes’ principle of maximum entropy. The prior facilitates model selection problems in which both the number and the values of the resonance frequencies are unknown. It encodes a weakly inductive bias, provides a reasonable density everywhere, is easily parametrizable, and is easy to sample. We hope that this prior can enable the use of robust evidence-based methods for a new class of problems, even in the presence of multiplets of arbitrary order.

9 pages, 507 KiB  
Proceeding Paper
Global Variance as a Utility Function in Bayesian Optimization
by Roland Preuss and Udo von Toussaint
Phys. Sci. Forum 2021, 3(1), 3; https://doi.org/10.3390/psf2021003003 - 5 Nov 2021
Cited by 2 | Viewed by 1674
Abstract
A Gaussian-process surrogate model based on already acquired data is employed to approximate an unknown target surface. In order to optimally locate the next function evaluations in parameter space, a whole variety of utility functions are at one’s disposal. However, a good choice of a specific utility, or of a combination of utilities, provides the fastest way to determine the best surrogate surface or its extremum with the lowest possible amount of additional data. In this paper, we propose to consider the global (integrated) variance as a utility function, i.e., to integrate the variance of the surrogate over a finite volume in parameter space. It turns out that this utility not only complements the tool set for fine-tuning investigations in a region of interest but also expedites the optimization procedure as a whole.
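A minimal sketch of this acquisition strategy, assuming a one-dimensional parameter space, a unit-variance RBF kernel, and a plain grid average in place of the integral (illustrative choices only, not taken from the paper):

import numpy as np

def rbf(a, b, length=0.2):
    # squared-exponential kernel with unit prior variance
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def integrated_variance(x_train, x_cand, x_grid, noise=1e-6):
    # average GP posterior variance over x_grid after adding candidate x_cand
    X = np.append(x_train, x_cand)
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(x_grid, X)
    var = 1.0 - np.sum(Ks @ np.linalg.inv(K) * Ks, axis=1)
    return var.mean()

x_train = np.array([0.1, 0.5, 0.9])            # already evaluated points
x_grid = np.linspace(0.0, 1.0, 200)            # region of interest (integration grid)
candidates = np.linspace(0.0, 1.0, 50)
scores = [integrated_variance(x_train, c, x_grid) for c in candidates]
x_next = candidates[int(np.argmin(scores))]    # next evaluation: lowest global variance

Because the Gaussian-process posterior variance does not depend on the observed function values, the candidates can be ranked before any expensive evaluation is actually performed.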

9 pages, 469 KiB  
Proceeding Paper
Survey Optimization via the Haphazard Intentional Sampling Method
by Miguel Miguel, Rafael Waissman, Marcelo Lauretto and Julio Stern
Phys. Sci. Forum 2021, 3(1), 4; https://doi.org/10.3390/psf2021003004 - 5 Nov 2021
Viewed by 1944
Abstract
In previously published articles, our research group has developed the Haphazard Intentional Sampling method and compared it to the rerandomization method proposed by K. Morgan and D. Rubin. In this article, we compare both methods to the pure randomization method used for the Epicovid19 survey, conducted to estimate SARS-CoV-2 prevalence in 133 Brazilian municipalities. We show that Haphazard Intentional Sampling can either substantially reduce operating costs while achieving the same estimation errors or, conversely, substantially improve estimation precision using the same sample sizes.

10 pages, 2919 KiB  
Proceeding Paper
Orbit Classification and Sensitivity Analysis in Dynamical Systems Using Surrogate Models
by Katharina Rath, Christopher G. Albert, Bernd Bischl and Udo von Toussaint
Phys. Sci. Forum 2021, 3(1), 5; https://doi.org/10.3390/psf2021003005 - 5 Nov 2021
Cited by 1 | Viewed by 1595
Abstract
The dynamics of many classical physics systems are described in terms of Hamilton’s equations. Commonly, initial conditions are only imperfectly known. The associated volume in phase space is preserved over time due to the symplecticity of the Hamiltonian flow. Here we study the propagation of uncertain initial conditions through dynamical systems using symplectic surrogate models of Hamiltonian flow maps. This allows fast sensitivity analysis with respect to the distribution of initial conditions and an estimation of local Lyapunov exponents (LLEs) that give insight into the local predictability of a dynamical system. In Hamiltonian systems, LLEs permit a distinction between regular and chaotic orbits. Combined with Bayesian methods, we provide a statistical analysis of local stability and sensitivity in phase space for Hamiltonian systems. The intended application is the early classification of regular and chaotic orbits of fusion alpha particles in stellarator reactors. The degree of stochastization during a given time period is used as an estimate for the probability that orbits of a specific region in phase space are lost at the plasma boundary. The approach thus offers a promising way to accelerate the computation of fusion alpha particle losses.
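As a rough illustration of a finite-time local Lyapunov exponent as a regular/chaotic discriminator, here is a sketch using the Chirikov standard map as a stand-in Hamiltonian flow map (the map, the initial conditions and the divergence-of-nearby-orbits estimator are illustrative assumptions, not the surrogate-based method of the paper):

import numpy as np

def standard_map(p, q, K=1.5):
    # a simple symplectic (area-preserving) map used here as a stand-in flow map
    p_new = (p + K * np.sin(q)) % (2 * np.pi)
    q_new = (q + p_new) % (2 * np.pi)
    return p_new, q_new

def ang_diff(a, b):
    # signed difference on the torus
    return (a - b + np.pi) % (2 * np.pi) - np.pi

def local_lyapunov(p0, q0, steps=200, eps=1e-8):
    p, q = p0, q0
    pd, qd = p0 + eps, q0                       # slightly perturbed companion orbit
    log_growth = 0.0
    for _ in range(steps):
        p, q = standard_map(p, q)
        pd, qd = standard_map(pd, qd)
        dp, dq = ang_diff(pd, p), ang_diff(qd, q)
        d = np.hypot(dp, dq)
        log_growth += np.log(d / eps)
        # rescale the separation to stay in the linearized regime
        pd, qd = p + eps * dp / d, q + eps * dq / d
    return log_growth / steps

# regular orbits yield exponents near zero; chaotic orbits yield clearly positive ones
print(local_lyapunov(0.0, 1.0), local_lyapunov(np.pi, np.pi))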

13 pages, 382 KiB  
Proceeding Paper
Bayesian Surrogate Analysis and Uncertainty Propagation
by Sascha Ranftl and Wolfgang von der Linden
Phys. Sci. Forum 2021, 3(1), 6; https://doi.org/10.3390/psf2021003006 - 13 Nov 2021
Cited by 9 | Viewed by 5288
Abstract
The quantification of uncertainties of computer simulations due to input parameter uncertainties is paramount to assess a model’s credibility. For computationally expensive simulations, this is often feasible only via surrogate models that are learned from a small set of simulation samples. The surrogate models are commonly chosen and deemed trustworthy based on heuristic measures, and substituted for the simulation in order to approximately propagate the simulation input uncertainties to the simulation output. In the process, the contribution of the uncertainties of the surrogate itself to the simulation output uncertainties is usually neglected. In this work, we specifically address the case of doubtful surrogate trustworthiness, i.e., non-negligible surrogate uncertainties. We find that Bayesian probability theory yields a natural measure of surrogate trustworthiness, and that surrogate uncertainties can easily be included in simulation output uncertainties. For a Gaussian likelihood for the simulation data, with unknown surrogate variance and given a generalized linear surrogate model, the resulting formulas reduce to simple matrix multiplications. The framework contains Polynomial Chaos Expansions as a special case, and is easily extended to Gaussian Process Regression. Additionally, we show a simple way to implicitly include spatio-temporal correlations. Lastly, we demonstrate a numerical example where surrogate uncertainties are in part negligible and in part non-negligible.
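A toy sketch of uncertainty propagation through a generalized linear surrogate (here an assumed polynomial basis with a fixed, assumed noise precision; the paper instead treats the surrogate variance as unknown, which this sketch does not attempt):

import numpy as np

rng = np.random.default_rng(0)

def features(x):
    # assumed polynomial basis, standing in for a generalized linear surrogate
    return np.vander(x, 4, increasing=True)

# a small set of (expensive) simulation runs
x_sim = np.linspace(-1.0, 1.0, 8)
y_sim = np.sin(np.pi * x_sim) + 0.05 * rng.standard_normal(8)

alpha, beta = 1.0, 1.0 / 0.05**2                # prior precision and assumed noise precision
Phi = features(x_sim)
S = np.linalg.inv(alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi)
w_mean = beta * S @ Phi.T @ y_sim               # posterior mean of the surrogate weights

# propagate uncertain simulation inputs through the surrogate
x_in = rng.normal(0.3, 0.1, size=5000)          # uncertain input parameter
Phi_in = features(x_in)
pred_mean = Phi_in @ w_mean
pred_var = 1.0 / beta + np.einsum('ij,jk,ik->i', Phi_in, S, Phi_in)  # surrogate uncertainty
# total output uncertainty: spread induced by the inputs plus the surrogate's own uncertainty
print(pred_mean.mean(), pred_mean.var() + pred_var.mean())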

8 pages, 2005 KiB  
Proceeding Paper
Regularization of the Gravity Field Inversion Process with High-Dimensional Vector Autoregressive Models
by Andreas Kvas and Torsten Mayer-Gürr
Phys. Sci. Forum 2021, 3(1), 7; https://doi.org/10.3390/psf2021003007 - 7 Dec 2021
Viewed by 1935
Abstract
Earth’s gravitational field provides invaluable insights into the changing nature of our planet. It reflects mass change caused by geophysical processes like continental hydrology, changes in the cryosphere, or mass flux in the ocean. Satellite missions such as the NASA/DLR-operated Gravity Recovery and Climate Experiment (GRACE) and its successor GRACE Follow-On (GRACE-FO) continuously monitor these temporal variations of the gravitational attraction. In contrast to other satellite remote sensing datasets, gravity field recovery is based on geophysical inversion, which requires global, homogeneous data coverage. GRACE and GRACE-FO typically reach this global coverage after about 30 days, so short-lived events such as floods, which occur on time frames from hours to weeks, require additional information to be properly resolved. In this contribution, we treat Earth’s gravitational field as a stationary random process and model its spatio-temporal correlations in the form of a vector autoregressive (VAR) model. The satellite measurements are combined with this prior information in a Kalman smoother framework to regularize the inversion process, which allows us to estimate daily, global gravity field snapshots. To derive the prior, we analyze geophysical model output which reflects the expected signal content and temporal evolution of the estimated gravity field solutions. The main challenges here are the high dimensionality of the process, with a state vector size on the order of 10³ to 10⁴, and the limited amount of model output from which to estimate such a high-dimensional VAR model. We introduce geophysically motivated constraints into the VAR model estimation process to ensure a positive-definite covariance function.
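A toy sketch of the combination step described above, with made-up matrices and toy dimensions (the actual state vector has on the order of 10³ to 10⁴ entries, and the VAR model is estimated from geophysical model output):

import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 2                              # toy state and observation sizes
Phi = 0.9 * np.eye(n)                    # assumed VAR(1) transition matrix
Q = 0.1 * np.eye(n)                      # process-noise covariance of the VAR model
H = rng.standard_normal((m, n))          # incomplete observation geometry
R = 0.01 * np.eye(m)

x_true = np.zeros(n)                     # simulated "true" state
x, P = np.zeros(n), np.eye(n)            # filter state and covariance
for _ in range(30):
    x_true = Phi @ x_true + rng.multivariate_normal(np.zeros(n), Q)
    y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)
    x, P = Phi @ x, Phi @ P @ Phi.T + Q            # predict with the VAR(1) prior
    S = H @ P @ H.T + R                            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x = x + K @ (y - H @ x)                        # update with the daily observations
    P = (np.eye(n) - K @ H) @ P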

8 pages, 291 KiB  
Proceeding Paper
Statistical Mechanics of Unconfined Systems: Challenges and Lessons
by Bruno Arderucio Costa and Pedro Pessoa
Phys. Sci. Forum 2021, 3(1), 8; https://doi.org/10.3390/psf2021003008 - 9 Dec 2021
Viewed by 1336
Abstract
Motivated by applications of statistical mechanics in which the system of interest is spatially unconfined, we present an exact solution to the maximum entropy problem for assigning a stationary probability distribution on the phase space of an unconfined ideal gas in an anti-de Sitter background. Notwithstanding the gas’s freedom to move in an infinite volume, we establish necessary conditions for the stationary probability distribution solving a general maximum entropy problem to be normalizable and obtain the resulting probability for a particular choice of constraints. As part of our analysis, we develop a novel method for identifying dynamical constraints based on local measurements. Because it makes no appeal to a priori information about globally defined conserved quantities, it is applicable to a much wider range of problems.
10 pages, 421 KiB  
Proceeding Paper
The ABC of Physics
by John Skilling and Kevin Knuth
Phys. Sci. Forum 2021, 3(1), 9; https://doi.org/10.3390/psf2021003009 - 10 Dec 2021
Cited by 1 | Viewed by 2414
Abstract
Why quantum? Why spacetime? We find that the key idea underlying both is uncertainty. In a world lacking probes of unlimited delicacy, our knowledge of quantities is necessarily accompanied by uncertainty. Consequently, physics requires a calculus of number pairs and not only scalars for quantity alone. Basic symmetries of shuffling and sequencing dictate that pairs obey ordinary component-wise addition, but they can have three different multiplication rules. We call those rules A, B and C. “A” shows that pairs behave as complex numbers, which is why quantum theory is complex. However, consistency with the ordinary scalar rules of probability shows that the fundamental object is not a particle on its Hilbert sphere but a stream represented by a Gaussian distribution. “B” is then applied to pairs of complex numbers (qubits) and produces the Pauli matrices, whose operation defines the space of four-vectors. “C” then allows integration of what can then be recognised as energy-momentum into time and space. The picture is entirely consistent. Spacetime is a construct of the quantum and not a container for it.
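For concreteness, rule “A” as characterized in the abstract corresponds to the familiar complex-number arithmetic on pairs (writing a pair as (a₁, a₂); the identification with the paper’s notation is an assumption):

\[
(a_1, a_2) + (b_1, b_2) = (a_1 + b_1,\; a_2 + b_2),
\qquad
(a_1, a_2)\cdot(b_1, b_2) = (a_1 b_1 - a_2 b_2,\; a_1 b_2 + a_2 b_1),
\]

i.e., under component-wise addition and rule A, pairs combine exactly as the complex numbers a₁ + i a₂.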

10 pages, 557 KiB  
Proceeding Paper
On Two Measure-Theoretic Aspects of the Full Bayesian Significance Test for Precise Bayesian Hypothesis Testing
by Riko Kelter
Phys. Sci. Forum 2021, 3(1), 10; https://doi.org/10.3390/psf2021003010 - 17 Dec 2021
Viewed by 1372
Abstract
The Full Bayesian Significance Test (FBST) has been proposed as a convenient method to replace frequentist p-values for testing a precise hypothesis. Although the FBST enjoys various appealing properties, the purpose of this paper is to investigate two aspects of the FBST which are sometimes perceived as measure-theoretic inconsistencies of the procedure and which have not been discussed rigorously in the literature. First, the FBST uses the posterior density as a reference for judging the Bayesian statistical evidence against a precise hypothesis. However, under absolutely continuous prior distributions, the posterior density is defined only up to Lebesgue null sets, which renders the reference criterion arbitrary. Second, the FBST statistical evidence seems to have no valid prior probability. It is shown that the former aspect can be circumvented by fixing a version of the posterior density before applying the FBST, and that the latter aspect is a consequence of the procedure’s measure-theoretic premises. An illustrative example demonstrates the two aspects and their resolution. Together, the results in this paper show that neither of the two aspects sometimes perceived as measure-theoretic inconsistencies of the FBST is, in fact, tenable. The FBST thus provides a measure-theoretically coherent Bayesian alternative for testing a precise hypothesis.
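A short sketch of the standard FBST e-value computed from posterior samples, with the density fixed via a kernel density estimate — one way to realize the “fixing a version of the posterior density” step discussed above (the posterior, hypothesis and estimator are illustrative, not taken from the paper):

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
posterior_samples = rng.normal(0.3, 0.1, size=5000)   # assumed posterior draws
theta0 = 0.0                                           # precise hypothesis H0: theta = theta0

kde = gaussian_kde(posterior_samples)                  # fixes one version of the posterior density
dens_h0 = kde([theta0])[0]
dens_samples = kde(posterior_samples)
# posterior mass of the tangential set {theta : p(theta | x) > p(theta0 | x)}
ev_against = np.mean(dens_samples > dens_h0)
e_value = 1.0 - ev_against                             # FBST evidence in favor of H0
print(e_value)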

9 pages, 662 KiB  
Proceeding Paper
Surrogate-Enhanced Parameter Inference for Function-Valued Models
by Christopher G. Albert, Ulrich Callies and Udo von Toussaint
Phys. Sci. Forum 2021, 3(1), 11; https://doi.org/10.3390/psf2021003011 - 21 Dec 2021
Viewed by 1205
Abstract
We present an approach to enhance the performance and flexibility of Bayesian inference of model parameters based on observations of measured data. Going beyond the usual surrogate-enhanced Monte-Carlo or optimization methods that focus on a scalar loss, we place emphasis on a function-valued output of formally infinite dimension. For this purpose, the surrogate models are built on a combination of linear dimensionality reduction in an adaptive basis of principal components and Gaussian process regression for the map between the reduced feature spaces. Since the decoded surrogate provides the full model output rather than only the loss, it is re-usable for multiple calibration measurements as well as different loss metrics and, consequently, allows for flexible marginalization over such quantities and for applications to Bayesian hierarchical models. We evaluate the method’s performance based on a case study of a toy model and a simple riverine diatom model for the Elbe river. As input data, this model uses six tunable scalar parameters as well as silica concentrations in the upper reach of the river, together with continuous time series of temperature, radiation, and river discharge over a specific year. The output consists of continuous time-series data that are calibrated against corresponding measurements from the Geesthacht Weir station on the Elbe river. For this study, only two scalar inputs were considered together with a function-valued output, and the results were compared to an existing model calibration based on direct simulation runs without a surrogate.
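A compact sketch of the surrogate construction described above (PCA compression of the function-valued output followed by per-component Gaussian process regression), using made-up toy data rather than the Elbe diatom model:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

# training "simulations": scalar parameters theta -> a function-valued output y(t)
theta = rng.uniform(0.0, 1.0, size=(30, 2))
t = np.linspace(0.0, 1.0, 200)
Y = np.sin(2 * np.pi * (t[None, :] + theta[:, :1])) * (1.0 + theta[:, 1:])

pca = PCA(n_components=3).fit(Y)       # adaptive basis of principal components
Z = pca.transform(Y)                   # reduced features

# one Gaussian process per reduced feature, mapping parameters to PCA coefficients
gps = [GaussianProcessRegressor(kernel=RBF(0.3)).fit(theta, Z[:, k])
       for k in range(Z.shape[1])]

def surrogate(theta_new):
    # decoded surrogate: predict PCA coefficients, then reconstruct the full output
    z = np.column_stack([gp.predict(np.atleast_2d(theta_new)) for gp in gps])
    return pca.inverse_transform(z)[0]

y_pred = surrogate([0.4, 0.7])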

12 pages, 308 KiB  
Proceeding Paper
Quantum Mechanics as Hamilton–Killing Flows on a Statistical Manifold
by Ariel Caticha
Phys. Sci. Forum 2021, 3(1), 12; https://doi.org/10.3390/psf2021003012 - 21 Dec 2021
Cited by 1 | Viewed by 1410
Abstract
The mathematical formalism of quantum mechanics is derived, or “reconstructed”, from more basic considerations of probability theory and information geometry. The starting point is the recognition that probabilities are central to QM; the formalism of QM is derived as a particular kind of flow on a finite-dimensional statistical manifold, a simplex. The cotangent bundle associated with the simplex has a natural symplectic structure, and it inherits its own natural metric structure from the information geometry of the underlying simplex. We seek flows that preserve (in the sense of vanishing Lie derivatives) both the symplectic structure (a Hamilton flow) and the metric structure (a Killing flow). The result is a formalism in which the Fubini–Study metric, the linearity of the Schrödinger equation, the emergence of complex numbers, Hilbert spaces and the Born rule are derived rather than postulated.
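The two preservation requirements mentioned in the abstract can be stated compactly (notation assumed here, not taken from the paper): for a flow generated by a vector field V on the cotangent bundle of the simplex,

\[
\mathcal{L}_V\,\Omega = 0 \quad \text{(Hamilton flow: the symplectic form is preserved)},
\qquad
\mathcal{L}_V\, g = 0 \quad \text{(Killing flow: the metric is preserved)},
\]

where Ω is the natural symplectic form and g is the metric inherited from the information geometry of the underlying simplex.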
10 pages, 286 KiB  
Proceeding Paper
An Entropic Approach to Classical Density Functional Theory
by Ahmad Yousefi and Ariel Caticha
Phys. Sci. Forum 2021, 3(1), 13; https://doi.org/10.3390/psf2021003013 - 24 Dec 2021
Viewed by 1555
Abstract
The classical Density Functional Theory (DFT) is introduced as an application of entropic inference for inhomogeneous fluids in thermal equilibrium. It is shown that entropic inference reproduces the variational principle of DFT when information about the expected density of particles is imposed. This process introduces a family of trial density-parametrized probability distributions and, consequently, a trial entropy from which the preferred one is found using the method of Maximum Entropy (MaxEnt). As an application, the DFT model for slowly varying density is provided, and its approximation scheme is discussed.
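Schematically, the constrained maximization described in the abstract has the following generic form (a generic MaxEnt-with-density-constraint statement; the specific entropy functional and constraints used in the paper may differ):

\[
\max_{p}\; S[p]
\quad\text{subject to}\quad
\int p\, d\Gamma = 1,
\qquad
\big\langle \hat n(\mathbf{x}) \big\rangle_p = n(\mathbf{x}) \;\; \text{for all } \mathbf{x},
\]

and, as stated in the abstract, imposing this family of expected-density constraints and selecting the preferred trial entropy via MaxEnt recovers the variational principle of classical DFT.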