Open Access Article

Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get?

Peter Tiňo
School of Computer Science, The University of Birmingham, Birmingham B15 2TT, UK
Entropy 2013, 15(4), 1202-1220; https://doi.org/10.3390/e15041202
Received: 15 January 2013 / Revised: 21 March 2013 / Accepted: 25 March 2013 / Published: 8 April 2013
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
Studies of learning algorithms typically concentrate on situations where a potentially ever-growing training sample is available. Yet there are situations (e.g., detection of differentially expressed genes from unreplicated data, or estimation of time delay in non-stationary gravitationally lensed photon streams) where inference must be performed from extremely small samples. On unreplicated data, the inference has to be performed on the smallest possible sample: a sample of size 1. We study whether anything useful can be learnt in such extreme situations by concentrating on a Bayesian approach that can account for possible prior information on the expected counts. We perform a detailed information-theoretic study of such Bayesian estimation and quantify the effect of Bayesian averaging on its first two moments. Finally, to analyze the potential benefits of the Bayesian approach, we also consider Maximum Likelihood (ML) estimation as a baseline. We show both theoretically and empirically that Bayesian model averaging can be beneficial.
Keywords: Poisson distribution; unreplicated data; Bayesian learning; expected Kullback–Leibler divergence
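The following is a minimal sketch (not the article's code) of the setup described in the abstract: estimating a Poisson rate from a single count and scoring the estimate by expected Kullback–Leibler divergence to the true Poisson distribution. The Gamma(a, b) prior with a posterior-mean plug-in, the prior parameter values, and the Monte Carlo settings are illustrative assumptions, not values or the exact Bayesian model averaging scheme from the paper.

```python
# Sketch: ML vs. a simple Bayesian estimate of a Poisson rate from one count,
# compared via expected KL divergence to the true distribution.
import numpy as np

rng = np.random.default_rng(0)

def kl_poisson(lam_true, lam_est):
    """Closed-form KL(Poisson(lam_true) || Poisson(lam_est))."""
    if lam_est <= 0.0:
        # The ML estimate equals the observed count, which can be 0;
        # the KL divergence is then infinite for any lam_true > 0.
        return np.inf
    return lam_est - lam_true + lam_true * np.log(lam_true / lam_est)

def expected_kl(lam_true, a=1.0, b=1.0, n_samples=50_000):
    """Monte Carlo estimate of the expected KL for both estimators,
    averaging over single counts n ~ Poisson(lam_true).
    (a, b) are assumed Gamma prior parameters, chosen for illustration."""
    counts = rng.poisson(lam_true, size=n_samples)
    kl_ml = np.mean([kl_poisson(lam_true, n) for n in counts])            # ML plug-in: lam = n
    kl_bayes = np.mean([kl_poisson(lam_true, (a + n) / (b + 1.0))          # Gamma posterior mean
                        for n in counts])
    return kl_ml, kl_bayes

for lam in (0.5, 1.0, 3.0):
    kl_ml, kl_bayes = expected_kl(lam)
    print(f"lambda = {lam}: E[KL] ML = {kl_ml:.3f}, Bayes (posterior mean) = {kl_bayes:.3f}")
```

In this toy comparison the ML plug-in has infinite expected KL whenever a zero count has positive probability, while the Bayesian estimate stays finite, which is one intuition for why averaging over a prior can help in the low-count, sample-size-1 regime studied in the paper.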
MDPI and ACS Style

Tiňo, P. Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get? Entropy 2013, 15, 1202-1220. https://doi.org/10.3390/e15041202

AMA Style

Tiňo P. Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get? Entropy. 2013; 15(4):1202-1220. https://doi.org/10.3390/e15041202

Chicago/Turabian Style

Tiňo, Peter. 2013. "Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get?" Entropy 15, no. 4: 1202-1220. https://doi.org/10.3390/e15041202
