On the Impossibility of Learning the Missing Mass
Abstract: This paper shows that one cannot learn the probability of rare events without imposing further structural assumptions. The event of interest is that of obtaining an outcome outside the coverage of an i.i.d. sample from a discrete distribution. The probability of this event is referred to as the "missing mass". The impossibility result can then be stated as: the missing mass is not distribution-free learnable in relative error. The proof is semi-constructive and relies on a coupling argument using a dithered geometric distribution. Via a reduction, this impossibility also extends to both discrete and continuous tail estimation. These results formalize the folklore that in order to predict rare events without restrictive modeling, one necessarily needs distributions with "heavy tails".
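As a concrete illustration of the quantity at stake (this sketch is not from the paper itself), the snippet below draws an i.i.d. sample from a geometric distribution, computes the exact missing mass, and compares it with the classical Good-Turing estimate N1/n (N1 = number of symbols seen exactly once). The ratio q = 0.7 and the sample size are arbitrary illustrative choices.

```python
import math
import random
from collections import Counter

def good_turing_missing_mass(sample):
    # Good-Turing estimator: N1 / n, where N1 counts symbols seen exactly once.
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

def true_missing_mass(sample, pmf):
    # Exact missing mass: total probability of symbols absent from the sample.
    return 1.0 - sum(pmf(k) for k in set(sample))

# Geometric distribution on {1, 2, ...}: P(X = k) = (1 - q) * q**(k - 1).
q = 0.7
pmf = lambda k: (1 - q) * q ** (k - 1)

random.seed(0)
n = 500
# Inverse-CDF sampling: floor(ln U / ln q) + 1 is geometric on {1, 2, ...}.
sample = [math.floor(math.log(random.random()) / math.log(q)) + 1
          for _ in range(n)]

gt = good_turing_missing_mass(sample)
tm = true_missing_mass(sample, pmf)
print(f"Good-Turing estimate: {gt:.4f}, true missing mass: {tm:.6f}")
```

For a light-tailed distribution like this one the estimate can be reasonable; the paper's point is that no estimator achieves small relative error uniformly over all discrete distributions.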
Share & Cite This Article
Mossel, E.; Ohannessian, M.I. On the Impossibility of Learning the Missing Mass. Entropy 2019, 21, 28.