
Robust Universal Inference

The Industrial Engineering Department, Tel Aviv University, Tel Aviv 6997801, Israel
The School of Electrical Engineering, Tel Aviv University, Tel Aviv 6997801, Israel
Author to whom correspondence should be addressed.
Academic Editor: Sangun Park
Entropy 2021, 23(6), 773;
Received: 2 May 2021 / Revised: 14 June 2021 / Accepted: 15 June 2021 / Published: 18 June 2021
(This article belongs to the Special Issue Applications of Information Theory in Statistics)
Learning and making inferences from a finite set of samples are among the fundamental problems in science. In most popular applications, the paradigmatic approach is to seek the single model that best explains the data. This approach has many desirable properties when the number of samples is large. However, in many practical setups data acquisition is costly and only a limited number of samples are available. In this work, we study an alternative approach for this challenging setup. Our framework suggests that the role of the train-set is not to provide a single estimated model, which may be inaccurate due to the limited number of samples. Instead, we define a class of “reasonable” models. Then, the worst-case performance over this class is controlled by a minimax estimator with respect to it. Further, we introduce a robust estimation scheme that provides minimax guarantees also for the case where the true model is not a member of the model class. Our results draw important connections to universal prediction, the redundancy-capacity theorem, and channel capacity theory. We demonstrate our suggested scheme in different setups, showing a significant improvement in worst-case performance over currently known alternatives.
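The core idea of the abstract — replacing a single point estimate with a predictor that minimizes the worst-case loss over a class of "reasonable" models — can be sketched in a few lines. The sketch below is purely illustrative and is not the paper's actual construction: it assumes a finite class of Bernoulli models and uses the KL divergence (the excess log-loss, or regret) as the performance measure, finding the minimax predictor by grid search.

```python
import math

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q): the excess
    log-loss (regret) of predicting with parameter q when p is true."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

# Hypothetical class of "reasonable" models consistent with a small train-set.
model_class = [0.3, 0.4, 0.5]

# Grid search for the single predictor q that minimizes the worst-case
# regret over the whole model class (rather than fitting one best model).
grid = [i / 1000 for i in range(1, 1000)]
q_minimax = min(grid, key=lambda q: max(kl_bernoulli(p, q) for p in model_class))
worst_case = max(kl_bernoulli(p, q_minimax) for p in model_class)
```

By construction, the worst-case regret of `q_minimax` is no larger than that of any single member of the class used as a point estimate — the guarantee holds uniformly over the class, which is the behavior the minimax framework is designed to control.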
Keywords: minimax estimation; minimax risk; statistical inference; estimation theory; universal prediction

MDPI and ACS Style

Painsky, A.; Feder, M. Robust Universal Inference. Entropy 2021, 23, 773.


