Optimal entropy estimation on large alphabets via best polynomial approximation

Author(s): Yihong Wu and Pengkun Yang

Creation Date: Jun 14, 2015

Published In: Jun 2015

Paper Type: Conference Paper

Book Title: Proceedings of the 2015 IEEE International Symposium on Information Theory

Address: Hong Kong, China

Abstract:

Consider the problem of estimating the Shannon entropy of a distribution on $k$ elements from $n$ independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of $\left( \frac{k}{n \log k} \right)^{2} + \frac{\log^2 k}{n}$. This implies the recent result of Valiant-Valiant [1] that the minimal sample size for consistent entropy estimation scales according to $\Theta( \frac{k}{\log k} )$. The apparatus of best polynomial approximation plays a key role in both the minimax lower bound and the construction of optimal estimators.
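To make the construction concrete, the following Python snippet is a minimal sketch of the polynomial-approximation idea described in the abstract, not the authors' exact estimator: symbols with small counts use an unbiased estimator of a degree-$L \asymp \log n$ polynomial approximation of $p \mapsto -p \log p$ on a small interval, while well-observed symbols use a bias-corrected plug-in estimate. The constants C0 and C1, the Chebyshev-node least-squares fit, and the $\frac{1}{2n}$ bias correction are ad hoc choices for illustration.

```python
import numpy as np

def entropy_poly_sketch(counts, n, C0=0.5, C1=20.0):
    """Sketch of an entropy estimate (in nats) from the counts of n iid samples.

    `counts` should have one entry per alphabet symbol, including zeros
    for unseen symbols. C0 and C1 are illustrative tuning constants.
    """
    counts = np.asarray(counts)
    L = max(1, int(C0 * np.log(n)))   # polynomial degree ~ log n
    T = C1 * np.log(n) / n            # "small probability" regime is [0, T]

    # Near-best polynomial approximation of f(p) = -p log p on [0, T]:
    # substitute p = T*u, fit f(T*u) on u in (0, 1) at Chebyshev nodes,
    # then rescale the monomial coefficients back to the variable p.
    m = 4 * (L + 1)
    u = (np.cos((2 * np.arange(m) + 1) * np.pi / (2 * m)) + 1) / 2
    b = np.polynomial.polynomial.polyfit(u, -T * u * np.log(T * u), L)
    a = b / T ** np.arange(L + 1)     # coefficients of p^0, ..., p^L

    H = 0.0
    thresh = C1 * np.log(n)           # count cutoff corresponding to n*T
    for N in counts:
        if N <= thresh:
            # Unbiased estimate of sum_j a_j p^j via falling factorials:
            # for N ~ Binomial(n, p), E[N (N-1) ... (N-j+1)] = (n)_j p^j.
            s, fall_N, fall_n = a[0], 1.0, 1.0
            for j in range(1, L + 1):
                fall_N *= N - j + 1
                fall_n *= n - j + 1
                s += a[j] * fall_N / fall_n
            H += s
        else:
            # Bias-corrected plug-in for well-observed symbols.
            p_hat = N / n
            H += -p_hat * np.log(p_hat) + 1.0 / (2 * n)
    return H

# Example: uniform distribution on k symbols, with n on the order of k,
# i.e. well inside the large-alphabet regime the abstract addresses.
rng = np.random.default_rng(0)
k, n = 2000, 5000
samples = rng.integers(0, k, size=n)
counts = np.bincount(samples, minlength=k)   # keep unseen symbols
print(entropy_poly_sketch(counts, n), np.log(k))
```

The split at a count threshold of order $\log n$ mirrors the rate in the abstract: the polynomial branch controls the bias contributed by the many rarely observed symbols, while the plug-in branch keeps the variance of order $\frac{\log^2 k}{n}$.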

Award(s) Received:
