Colloquia --- Summer 2007

Thursday, June 7, 2007

Title: Information Volatility
Speaker: M. Rao, University of Florida
Time: 3:00-4:00 p.m.
Place: ENG 4
Sponsor: TBA

Abstract

As is well known, the Shannon entropy H(X) of a random variable X with density f is defined as -∫ f(x) log f(x) dx, that is, the expectation of the random variable -log f(X). In this talk we study the variance of -log f(X), which we call the Information Volatility of X, written IV(X). IV(X) has several useful properties not shared by the Shannon entropy: IV(X) = 0 characterizes the uniform distribution, IV(X) is invariant under affine transformations, and IV(X) enjoys certain convergence properties.
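
As a quick illustration of the definition (not part of the talk), the following Python sketch estimates IV(X) = Var(-log f(X)) by Monte Carlo for two familiar densities; the helper name iv_mc and the specific examples are our own, chosen only to show that the estimate vanishes for the uniform distribution and matches the exact value for the exponential.

    # Illustrative sketch: Monte Carlo estimate of IV(X) = Var(-log f(X)).
    import numpy as np

    rng = np.random.default_rng(0)

    def iv_mc(samples, log_density):
        """Estimate IV(X) from samples of X, given log f evaluated at those samples."""
        return np.var(-log_density(samples))

    # Uniform(0, 2): f(x) = 1/2 on [0, 2], so -log f(X) is constant and IV(X) = 0.
    u = rng.uniform(0.0, 2.0, size=100_000)
    iv_uniform = iv_mc(u, lambda x: np.full_like(x, np.log(0.5)))

    # Exponential(rate 1): f(x) = exp(-x), so -log f(X) = X and IV(X) = Var(X) = 1.
    e = rng.exponential(scale=1.0, size=100_000)
    iv_exponential = iv_mc(e, lambda x: -x)

    print(f"IV estimate, Uniform(0,2):   {iv_uniform:.4f}  (exact value 0)")
    print(f"IV estimate, Exponential(1): {iv_exponential:.4f}  (exact value 1)")

The two properties checked here follow directly from the definition: for a uniform density f is constant on its support, so -log f(X) is constant and its variance is zero; and if Y = aX + b with a ≠ 0, then f_Y(Y) = f_X(X)/|a|, so -log f_Y(Y) differs from -log f_X(X) only by the constant log|a|, which leaves the variance unchanged.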

Please direct questions to mthmaster@nosferatu.cas.usf.edu.