Theory of Computing Seminar
Abstract:
The entropy power inequality, a fundamental result in information theory, states that for any two independent continuous random vectors X, Y in R^n, one has N(X+Y) \geq N(X) + N(Y). Here N(X) denotes the entropy power of X, defined as N(X) = e^{2h(X)/n}, where h(X) is the (differential) entropy of X.
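(A worked example, not part of the original abstract: the inequality is tight for Gaussians. If X and Y are independent with X ~ N(0, \sigma_1^2 I_n) and Y ~ N(0, \sigma_2^2 I_n), then h(X) = (n/2) \log(2 \pi e \sigma_1^2), so N(X) = 2 \pi e \sigma_1^2. Since X + Y ~ N(0, (\sigma_1^2 + \sigma_2^2) I_n), one gets N(X+Y) = 2 \pi e (\sigma_1^2 + \sigma_2^2) = N(X) + N(Y), i.e., equality holds.)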
In this talk, we will see that the entropy power inequality can be extended to the Rényi entropy.
(Based on joint work with S. Bobkov.)
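(For context, not part of the original abstract: the Rényi entropy of order \alpha > 0, \alpha \neq 1, of a random vector X with density f is h_\alpha(X) = (1/(1-\alpha)) \log \int_{R^n} f(x)^\alpha dx; the Shannon differential entropy h(X) is recovered in the limit \alpha \to 1.)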
For more information, please contact Thomas Vidick by email at [email protected].
Event Series
Theory of Computing Seminar Series