On Maximum Entropy and Inference
Abstract
Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate from data a model that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is set not by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method by showing its ability to recover the correct model in a few prototype cases, and discuss its application to a real dataset.
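The objects the abstract refers to can be made concrete with a small sketch. For a system of n binary spins, the candidate sufficient statistics are the empirical averages of the spin operators φ_μ(s) = ∏_{i∈μ} s_i, one for each nonempty subset μ of spins (interactions of arbitrary order), while the frequency distribution counts how often each configuration occurs in the sample. The dataset and variable names below are hypothetical illustrations, not taken from the paper:

```python
from itertools import combinations
from collections import Counter

def operator_average(data, subset):
    """Empirical average of the spin operator phi_mu(s) = prod_{i in mu} s_i."""
    total = 0
    for s in data:
        p = 1
        for i in subset:
            p *= s[i]
        total += p
    return total / len(data)

# Hypothetical toy sample: n = 3 spins, each s_i in {-1, +1}.
data = [(1, 1, 1), (1, 1, 1), (-1, -1, 1), (1, -1, -1), (-1, 1, -1)]
n = 3

# Empirical averages of all 2^n - 1 spin operators: the candidate
# sufficient statistics among which relevant interactions are sought.
stats = {mu: operator_average(data, mu)
         for k in range(1, n + 1)
         for mu in combinations(range(n), k)}

# Frequency distribution of observed configurations, which, per the
# abstract, sets the dimensionality of the inference problem.
freqs = Counter(data)
```

In this toy sample the third-order operator s_0 s_1 s_2 has empirical average 1, hinting that a single high-order interaction, rather than many pairwise ones, may be the relevant statistic; Bayesian model selection, as described in the abstract, is what formally arbitrates such choices.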
Share & Cite This Article
Gresele, L.; Marsili, M. On Maximum Entropy and Inference. Entropy 2017, 19, 642.