Overview of the 37th MaxEnt

The 37th edition of MaxEnt was held in Brazil, hosting several distinguished researchers and students. The workshop offered four tutorials, nine invited talks, twenty-four oral presentations and twenty-seven poster presentations. All submissions received their first choice between oral and poster presentation. The event included a celebration of Julio Stern's 60th anniversary and awarded two prizes to young researchers. As is customary, the workshop had one free afternoon, in which participants visited the city's surroundings and experienced Brazilian food and traditions.

We emphasize that the invited talks, tutorials and oral sessions were filmed and are available at https://www.youtube.com/channel/UCvBinMGiKRNRo8g8F0fvjMg. These recordings give the work of our community the visibility we all wish it to have.

Tutorials and Talks
This edition of MaxEnt offered four tutorials, presented here in chronological order. Methods for Portfolio Allocation, by Hellinton Takada, presented the main methodologies for portfolio allocation, focusing on computational optimization and statistical modeling. Agatha Rodrigues, Adriano Polpo and Carlos Pereira, in Reliability Estimation in Coherent Systems, discussed the parametric and nonparametric estimation of the reliability function of coherent systems (series-parallel and parallel-series); systems with masked data were also discussed. A Model of Evolution for Languages, by Rafael B. Stern, presented a probability model for the evolution of languages and discussed advances in Sequential Monte Carlo for obtaining the posterior distribution of this model. Ali Mohammad-Djafari, in Approximate Bayesian Computation Tools for Large Scale Inverse Problems and Hierarchical Models for Big Data, discussed applications in which observations are modeled by means of a hierarchical model with high-dimensional hidden variables. Since computing the likelihood function is intractable in such models, the tutorial discussed how to obtain the posterior distribution using Approximate Bayesian Computation methods.
The workshop also held nine invited talks, which are presented here in chronological order. Ariel Caticha discussed the derivation of dynamical laws, and in particular of quantum theory, as an application of entropic methods of inference. Flavio Goncalves proposed a hidden Markov mixture model for the analysis of gene expression measurements mapped to chromosome locations. Udo Toussaint described the use of Gaussian processes and developed (parallelizable) batch-processing approaches for the propagation of uncertainty in computational models. Karim Anaya-Isquierdo introduced a survival analysis methodology based on the cumulative hazard function, which generalizes models such as the proportional hazards and accelerated failure time models. Estevam Hruschka presented key concepts from NELL (Never-Ending Language Learner), which automatically retrieves semantic relations between words from the Internet. John Skilling gave the talk "Quantum Theory is not weird". Thais Fonseca presented a new class of spatio-temporal models that allows kurtosis to vary spatially. Rubens Sampaio constructed a parametric model to analyze data from a dry-friction oscillator. Kevin Knuth presented the EXONEST Exoplanetary Explorer, a Bayesian exoplanet inference engine based on nested sampling, which was used to analyze Kepler and CoRoT data.

Entropy Special Issue and Conference Proceedings
This special issue of Entropy (http://www.mdpi.com/journal/entropy/special_issues/maxent17) was focused on all aspects of probabilistic inference, including applications, methodology and foundations. The conference proceedings [1] comprised twenty-eight published articles. The special issue is composed of nine articles, which are briefly discussed below in order of publication.
EXONEST: The Bayesian Exoplanetary Explorer [2] presented a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires) exoplanet mission data. The EXONEST package currently accommodates plug-and-play models which allow for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. The paper also discussed new efforts to include more subtle photometric effects involving reflected and refracted light.
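As a minimal illustration of the nested sampling machinery underlying EXONEST (this is a toy sketch, not the authors' implementation; the one-dimensional Gaussian likelihood, the uniform prior and all settings are our own choices), the following Python code estimates the log-evidence of a simple model:

```python
import math
import random

def logaddexp(a, b):
    # Numerically stable log(exp(a) + exp(b))
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def log_likelihood(x):
    # Toy Gaussian likelihood centred at 0.5 with width 0.05
    return -0.5 * ((x - 0.5) / 0.05) ** 2 - math.log(0.05 * math.sqrt(2 * math.pi))

def nested_sampling(n_live=100, n_iter=600, seed=0):
    rng = random.Random(seed)
    # Live points drawn from the Uniform(0, 1) prior
    live = [rng.random() for _ in range(n_live)]
    logl = [log_likelihood(x) for x in live]
    log_z = -math.inf
    log_width = math.log(1.0 - math.exp(-1.0 / n_live))
    for _ in range(n_iter):
        worst = min(range(n_live), key=lambda j: logl[j])
        log_z = logaddexp(log_z, log_width + logl[worst])
        # Replace the worst point by a prior sample with higher likelihood
        threshold = logl[worst]
        x = rng.random()
        while log_likelihood(x) <= threshold:
            x = rng.random()
        live[worst], logl[worst] = x, log_likelihood(x)
        log_width -= 1.0 / n_live  # prior volume shrinks by a factor e^{-1/n_live}
    return log_z

# The likelihood integrates to roughly 1 over the unit interval,
# so the estimated log-evidence is close to zero
log_evidence = nested_sampling()
```

In a real exoplanet analysis the one-dimensional prior sample would be replaced by a constrained exploration of the full orbital and photometric parameter space.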
L1-minimization Algorithm for Bayesian Online Compressed Sensing [3] proposed a Bayesian online reconstruction algorithm for sparse signals based on Compressed Sensing and inspired by L1-regularization schemes. It showed that, even when prior knowledge about the signal's generation is limited, reconstruction is possible by introducing a Laplace prior and an extra Kullback-Leibler divergence minimization step for hyper-parameter learning.
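The connection between a Laplace prior and L1 regularization can be illustrated with a standard offline sketch (not the paper's online algorithm): the MAP estimate under a Laplace prior is the L1-penalized least-squares solution, computable by iterative soft thresholding (ISTA). The data, the regularization weight and the iteration count below are illustrative choices of ours:

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(z, t):
    # Proximal operator of the L1 norm (MAP estimate under a Laplace prior)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, lam=0.02, n_iter=1000):
    # Iterative soft thresholding: minimise 0.5*||Ax - y||^2 + lam*||x||_1
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x

# Sparse ground truth: 5 non-zero coefficients out of 100
n, m, k = 100, 40, 5
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=k)

A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))  # random sensing matrix
y = A @ x_true                                      # m << n noiseless measurements
x_hat = ista(A, y)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

The online setting of the paper processes measurements one at a time and additionally learns the hyper-parameters, which this batch sketch does not attempt.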
Exact Renormalization Groups as a Form of Entropic Dynamics [4] discussed Renormalization Groups (RG), a set of methods that have been instrumental in tackling problems involving an infinite number of degrees of freedom, such as quantum field theory and critical phenomena. The paper showed that exact RG equations are functional diffusion equations, which can be derived using entropic methods. The RG flow was then described as a form of entropic dynamics of field configurations. In this approach the RG transformations receive a purely inferential interpretation that establishes a clear link to information theory.
Classical-Equivalent Bayesian Portfolio Optimization for Electricity Generation Planning [5] discussed a method for allocating electricity generation among several available technologies, which can be solved as a mean-variance optimization problem. While traditional approaches assume that the means and covariances of the costs are known, the paper proposed a Bayesian approach that assigns a prior to these unknown quantities. Both improper and proper prior distributions were used and shown to compare well with the traditional method.
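For reference, the classical known-moments baseline that the paper generalizes can be sketched as follows; the Bayesian version would replace the fixed covariance matrix below by posterior quantities. The three-technology covariance matrix is a made-up illustration:

```python
import numpy as np

def min_variance_weights(cov):
    """Global minimum-variance allocation under a full-investment
    constraint (weights sum to one): w = C^{-1} 1 / (1' C^{-1} 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Illustrative covariance of generation costs for three technologies
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
variance = w @ cov @ w  # lower than any single-technology variance
```

The closed form follows from minimizing w'Cw subject to the weights summing to one via a single Lagrange multiplier.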
A Sequential Algorithm for Signal Segmentation [6] presented an application of event detection in noisy signals. The paper analyzed 15-minute samples of acoustic signals and built an unsupervised method for finding sections or segments of the signal in which events of interest are likely to have occurred. The Bayesian method relies on a sequential algorithm based solely on the assumption that each event alters the energy of the signal.
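The core assumption, that events alter the signal's energy, can be illustrated with a simple windowed-energy detector (a stand-in of ours, not the paper's Bayesian sequential algorithm; window size, threshold and synthetic data are arbitrary choices):

```python
import numpy as np

def energy_segments(signal, win=100, k=5.0):
    """Flag windows whose energy deviates from the baseline by more
    than k median absolute deviations."""
    n = len(signal) // win
    energy = np.array([np.sum(signal[i * win:(i + 1) * win] ** 2)
                       for i in range(n)])
    med = np.median(energy)
    mad = np.median(np.abs(energy - med)) + 1e-12
    flagged = np.flatnonzero(np.abs(energy - med) > k * mad)
    # Return (start, end) sample indices of the flagged windows
    return [(i * win, (i + 1) * win) for i in flagged]

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 5000)
x[2000:2300] += rng.normal(0.0, 4.0, 300)  # an event raises the local energy
segments = energy_segments(x)
```

The paper's method goes further by treating the segmentation sequentially and quantifying the posterior uncertainty of each change, rather than applying a fixed threshold.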
Feature Selection Based on the Local Lift Dependence Scale [7] proposed a method for feature selection that is based on minimizing a cost function over an estimate of the observables' joint distribution. While previous approaches perform a search over a Boolean lattice of feature sets (BLFS), the proposed method searched over a collection of Boolean lattices of ordered pairs (CBLOP). As a result, the method found not only the features that are most related to a given variable, but also explained which values of these features are most related to each value of the variable.
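A minimal sketch of searching feature sets against a distribution-based cost (here empirical conditional entropy, restricted to one layer of the Boolean lattice; the toy data and cost choice are ours, not the paper's CBLOP algorithm) can look like this:

```python
import itertools
import math
import random
from collections import Counter

def conditional_entropy(rows, features, target):
    """Empirical H(Y | X_features) from joint frequencies."""
    n = len(rows)
    joint = Counter(tuple(r[f] for f in features) + (r[target],) for r in rows)
    marg = Counter(tuple(r[f] for f in features) for r in rows)
    h = 0.0
    for key, c in joint.items():
        p_xy = c / n
        p_x = marg[key[:-1]] / n
        h -= p_xy * math.log(p_xy / p_x)
    return h

def best_subset(rows, candidates, target, k):
    # Exhaustive search over all size-k feature sets
    return min(itertools.combinations(candidates, k),
               key=lambda s: conditional_entropy(rows, s, target))

# Toy data: feature 0 is a copy of the target; features 1 and 2 are noise
random.seed(0)
rows = []
for _ in range(500):
    y = random.randint(0, 1)
    rows.append({0: y, 1: random.randint(0, 1), 2: random.randint(0, 1), 'y': y})

chosen = best_subset(rows, [0, 1, 2], 'y', 1)
```

The paper's contribution lies in organizing this search over lattices of ordered pairs, so that the selected values of each feature, not just the features themselves, are reported.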
Categorical Data Analysis Using a Skewed Weibull Regression Model [8] showed a partition scheme to derive a multinomial regression model under binary link functions. Based on the distribution function of a Weibull random variable, the authors proposed a new skewed link function that has the most commonly used models (logit, probit and complementary log-log) as limiting cases. The estimation procedure was presented under both frequentist and Bayesian paradigms. The proposed model was compared to other skewed link functions on two data sets.
Estimating Multivariate Discrete Distributions Using Bernstein Copulas [9] presented a procedure for the estimation of joint discrete distributions. Under a nonparametric approach, the authors proposed a method based on Bernstein polynomial copulas to estimate bivariate discrete distributions, both the marginals and the joint distribution. They also compared it with an alternative method on simulated samples and presented an application.
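The Bernstein copula itself is a polynomial smoothing of the empirical copula. The following sketch (continuous toy data and all settings are our own; the paper works with discrete marginals) evaluates it at one point:

```python
import math
import numpy as np

def bernstein_copula(u, v, pseudo_obs, m=10):
    """Bernstein smoothing of the empirical copula:
    C_m(u, v) = sum_{j,k} C_n(j/m, k/m) B_{j,m}(u) B_{k,m}(v)."""
    n = len(pseudo_obs)

    def emp_copula(a, b):
        return sum(1 for (s, t) in pseudo_obs if s <= a and t <= b) / n

    def bern(j, x):
        return math.comb(m, j) * x ** j * (1 - x) ** (m - j)

    return sum(emp_copula(j / m, k / m) * bern(j, u) * bern(k, v)
               for j in range(m + 1) for k in range(m + 1))

rng = np.random.default_rng(0)
z = rng.normal(size=500)
x = z + 0.5 * rng.normal(size=500)  # positively dependent pair
y = z + 0.5 * rng.normal(size=500)

# Pseudo-observations: normalised ranks in (0, 1)
def ranks(a):
    return (np.argsort(np.argsort(a)) + 1) / (len(a) + 1)

pseudo = list(zip(ranks(x), ranks(y)))
c = bernstein_copula(0.5, 0.5, pseudo)  # above 0.25, the independence value
```

For positively dependent data the smoothed copula at (0.5, 0.5) exceeds the independence-copula value of 0.25, which is a quick sanity check on the estimator.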
Global Optimization Employing Gaussian Process-Based Bayesian Surrogates [10] described a method for global optimization based on Gaussian processes, motivated by settings in which each evaluation of the objective requires a time-consuming simulation. Under the Bayesian paradigm, the authors proposed a surrogate model function, based on a Gaussian process whose hyperparameters are adjusted self-consistently, to represent the data, focusing on the expectation values of the process. They advocated this procedure as a shortcut for finding the maximum, thereby solving the global optimization problem.
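The general surrogate-based loop can be sketched as follows (a generic Gaussian-process optimizer with an upper-confidence-bound rule; the kernel, its fixed lengthscale and all settings are our illustrative choices, not the paper's self-consistent hyperparameter scheme):

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel with fixed lengthscale
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mean, np.maximum(var, 1e-12)

def bayes_opt(f, n_init=4, n_iter=20, beta=2.0):
    """Maximise an expensive function on [0, 1] with a GP surrogate;
    the next evaluation maximises mean + beta * std."""
    grid = np.linspace(0.0, 1.0, 201)
    rng = np.random.default_rng(0)
    xs = list(rng.uniform(0.0, 1.0, n_init))
    ys = [f(x) for x in xs]
    for _ in range(n_iter):
        mean, var = gp_posterior(np.array(xs), np.array(ys), grid)
        x_next = grid[np.argmax(mean + beta * np.sqrt(var))]
        xs.append(float(x_next))
        ys.append(f(x_next))
    return xs[int(np.argmax(ys))]

# Expensive black box with its maximum at x = 0.7
f = lambda x: -(x - 0.7) ** 2
x_best = bayes_opt(f)
```

Each surrogate update costs only a small linear solve, so the expensive simulator is queried a handful of times instead of being swept over a dense grid.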

Celebration and Awards
The 37th MaxEnt held a special celebration for Julio Stern's 60th anniversary. Julio has encouraged and mentored many students in the development of original research, having supervised twenty-two graduate students to date. His scientific interests encompass a variety of topics, such as statistical theory, optimization, logic, epistemology, philosophy of science and operational research. During the celebration there were speeches from several of Julio's colleagues and friends, such as Marcelo Lauretto, Celma Ribeiro, Carlos Humes, Marco Gubitoso and Stephen Vavasis.
The workshop also granted the "Young Author Best Paper Entropy Award" to two young researchers. The MaxEnt 2017 scientific committee selected Steven Waldrip [11], and the local award committee, chaired by Carlos Pereira, selected Roberta Lima [12].

Final Comments
The main objective of this workshop was the discussion of Bayesian computational techniques, such as Markov Chain Monte Carlo, as well as approximate inference methods and questions about the foundations of probability and information theory. The workshop also presented and discussed new applications of inference to the foundations of physical theories. The event featured excellent talks given by internationally recognized researchers whose work is part of the current landscape of applied mathematics and statistics.
The workshop had a total of seventy participants. It is worth noting the significant participation of students (twenty-nine), most of whom presented their work orally or in posters. This is an important indicator of a promising future for the areas covered by the event. An evaluation was carried out with the participants; in view of their responses, the event was considered excellent and fully reached its proposed objectives.