Relations Between Information and Estimation: A Unified View


Creation Date: Jun 02, 2016

Published In: Dec 2015

Paper Type: Dissertation

Address: Stanford, CA, USA

School: Stanford University


Measures of information, such as entropy, mutual information, and relative entropy, play important roles in various domains: underlying algorithmic procedures in machine learning, characterizing operational quantities in communication, storage, and inference, and characterizing the likelihood of large deviations events in statistics and probability, among others. Such information measures also turn out to give a quantitative understanding of several estimation-theoretic notions, such as the costs of causality and mismatch and the benefit of lookahead, via relations between information and estimation. This thesis presents a new framework for understanding the rich interconnections between information and estimation for a variety of widely applicable observation models.

Many of these relations can be viewed as identities between expectations of random quantities. This dissertation presents these identities and their generalizations in a new light.

The first part of this thesis focuses on two canonical observation models: the additive Gaussian channel and the Poisson channel. For both models, we dispense with the expectations and explore the nature of the pointwise relations between the respective random quantities. Our results recover and generalize previously known identities involving mutual information and minimum mean loss in estimation, and classical estimation-theoretic and information-theoretic quantities emerge with new and surprising roles.

In the second part of the thesis, we introduce a natural family of Lévy channels, in which the distribution of the output conditioned on the input is infinitely divisible. This family includes both the Gaussian and Poisson channels as special cases. For Lévy channels, we establish new representations relating the mutual information between the channel input and output to an optimal expected estimation loss, thereby unifying and extending previously known results. Corollaries of our results include new formulae for entropy and relative entropy. The unified framework extends to continuous-time stochastic processes as well, where we introduce and study these connections for semimartingale channels, which include the well-known white Gaussian channel and the multivariate point process channel.

In the third and final part of this dissertation, we revisit the white Gaussian channel and study the problem of estimation with finite lookahead. We examine the properties of the finite-lookahead minimum mean squared error and investigate connections between this quantity and mutual information.
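
As a concrete illustration of the kind of expectation identity the thesis builds on, the classical I-MMSE relation for the scalar additive Gaussian channel Y = sqrt(snr)·X + N states that the mutual information equals one half the integral of the MMSE over SNR. The sketch below checks this numerically for a standard Gaussian input, where both sides have closed forms; the function names and the trapezoidal integrator are illustrative choices, not taken from the thesis.

```python
import math

def mmse(gamma: float) -> float:
    """MMSE of estimating a standard Gaussian X from Y = sqrt(gamma)*X + N."""
    return 1.0 / (1.0 + gamma)

def mutual_information(snr: float) -> float:
    """Closed-form I(X; Y) in nats for the same channel and input."""
    return 0.5 * math.log1p(snr)

def integrate(f, a: float, b: float, n: int = 100_000) -> float:
    """Trapezoidal rule, accurate enough here for a 6-decimal comparison."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

snr = 4.0
lhs = mutual_information(snr)           # 0.5 * log(1 + snr)
rhs = 0.5 * integrate(mmse, 0.0, snr)   # I-MMSE: (1/2) * integral of mmse
print(f"I(snr)       = {lhs:.6f} nats")
print(f"(1/2)∫ mmse  = {rhs:.6f} nats")
```

The two printed values agree to the integrator's accuracy. It is precisely this kind of identity between an information measure and an estimation loss that the pointwise and Lévy-channel results of the thesis recover and generalize.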
