On the Use of Entropy to Improve Model Selection Criteria
Abstract

The most widely used model selection criteria, the Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC), are expressed in terms of synthetic indicators of the residual distribution: the variance and the mean-squared error of the residuals, respectively. In many scientific applications, the noise affecting the data can be expected to have a Gaussian distribution. Therefore, at the same level of variance and mean-squared error, models whose residuals are more uniformly distributed should be favoured. The degree of uniformity of the residuals can be quantified by the Shannon entropy. Including the Shannon entropy in the BIC and AIC expressions significantly improves these criteria. The improved performance has been demonstrated empirically with a series of simulations for various classes of functions and for different levels and statistics of the noise. In the presence of outliers, a better treatment of the errors, using the Geodesic Distance, has proved essential.
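The abstract's core idea can be sketched in a few lines of Python: estimate the Shannon entropy of the residuals from a normalised histogram, compute the textbook variance/MSE forms of BIC and AIC, and combine them so that, at equal variance, the model with more uniformly spread residuals scores better. Note that the paper's exact modified expressions are not given in the abstract, so the `entropy_weighted_bic` combination below is purely illustrative, and the bin count is an arbitrary assumption.

```python
import numpy as np

def shannon_entropy(residuals, bins=20):
    """Shannon entropy of the residual distribution, estimated from a
    normalised histogram; higher values mean more uniformly spread residuals."""
    counts, _ = np.histogram(residuals, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return float(-np.sum(p * np.log(p)))

def bic(residuals, k):
    """Textbook BIC for a Gaussian likelihood, expressed via the residual variance
    (k = number of model parameters)."""
    n = len(residuals)
    return n * np.log(np.var(residuals)) + k * np.log(n)

def aic(residuals, k):
    """Textbook AIC expressed via the mean-squared error of the residuals."""
    n = len(residuals)
    return n * np.log(np.mean(np.asarray(residuals) ** 2)) + 2 * k

def entropy_weighted_bic(residuals, k):
    """Hypothetical entropy-augmented criterion (NOT the paper's exact form):
    among models with equal residual variance, lower the score of the one
    whose residuals have higher entropy, i.e. are more uniformly distributed."""
    return bic(residuals, k) - shannon_entropy(residuals)
```

As a sanity check, uniformly distributed residuals yield a higher histogram entropy than Gaussian ones of comparable size, so a candidate model with the same variance but more uniform residuals receives a lower (better) entropy-weighted score.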
- Supplementary File 1: PDF document (6827 KB)
Cite This Article
Murari, A.; Peluso, E.; Cianfrani, F.; Gaudio, P.; Lungaroni, M. On the Use of Entropy to Improve Model Selection Criteria. Entropy 2019, 21, 394.
Murari A, Peluso E, Cianfrani F, Gaudio P, Lungaroni M. On the Use of Entropy to Improve Model Selection Criteria. Entropy. 2019; 21(4):394.

Chicago/Turabian Style: Murari, Andrea; Peluso, Emmanuele; Cianfrani, Francesco; Gaudio, Pasquale; Lungaroni, Michele. 2019. "On the Use of Entropy to Improve Model Selection Criteria." Entropy 21, no. 4: 394.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.