Open Access Article

k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions

1. Health Effects Laboratory Division, National Institute for Occupational Safety and Health, Morgantown, WV 26505, USA
2. Department of Statistics, West Virginia University, Morgantown, WV 26506, USA
* Authors to whom correspondence should be addressed.
Entropy 2011, 13(3), 650-667; https://doi.org/10.3390/e13030650
Received: 22 December 2010 / Revised: 27 January 2011 / Accepted: 28 February 2011 / Published: 8 March 2011
A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with its moment-based counterpart via simulations, and the results show that the two methods are comparable.
Keywords: hyperspherical distribution; directional data; differential entropy; cross entropy; Kullback-Leibler divergence; k-nearest neighbor
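The paper's sphere-adapted estimator is not reproduced in this abstract, but the knn approach it builds on can be illustrated with the classical Kozachenko-Leonenko estimator for data in Euclidean space. The sketch below is an assumption-laden illustration of that general technique, not the authors' hyperspherical method: the function name `knn_entropy`, the brute-force neighbor search, and the uniform test distribution are all choices made here for clarity.

```python
import numpy as np
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate (in nats).

    Illustrative sketch of the general knn idea: the distance to each
    point's k-th nearest neighbor probes the local density, and averaging
    the log-distances yields a consistent entropy estimate.
    """
    n, d = x.shape
    # Pairwise Euclidean distances (brute force; adequate for small n).
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)          # exclude each point itself
    eps = np.sort(dists, axis=1)[:, k - 1]   # distance to k-th neighbor
    # log of the volume of the unit ball in R^d.
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

rng = np.random.default_rng(0)
x = rng.uniform(size=(1000, 2))  # uniform on the unit square: true H = 0
print(knn_entropy(x, k=3))       # should be close to 0
```

For directional data the same recipe applies, but with geodesic (or chordal) distances on the sphere and the volume of a spherical cap in place of the Euclidean ball volume, which is the adaptation the paper develops and proves consistent.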

Li, S.; Mnatsakanov, R.M.; Andrew, M.E. k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions. Entropy 2011, 13, 650-667.

