KMMF Seminar "Duality Theory"
room 2.23, ul. Pasteura 5
P. Nayar (MIMUW)
Inequalities in information theory
The notion of entropy was introduced in 1872 by Ludwig Boltzmann in the study of his famous equation describing the distribution of positions and velocities of particles in a classical gas with particle collisions. In 1948 Claude Shannon, the father of information theory, used the same mathematical notion to quantify the amount of information contained in a given random variable. During the talk we will deal with entropy in the context of probabilistic inequalities, in particular discussing variance-entropy comparisons and bounds on the entropy of sums of independent random variables. We shall also describe relations with convex geometry and isoperimetric inequalities.
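As a small illustration of the notions mentioned above (not part of the abstract), the following Python sketch computes Shannon's discrete entropy and checks, on a toy example with dice, that the sum of two independent random variables has larger entropy than either summand; the function and variable names are illustrative choices, not from the talk.

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a discrete distribution {value: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# A fair six-sided die: the uniform distribution maximizes entropy, H = log2(6).
die = {k: 1 / 6 for k in range(1, 7)}

# Distribution of the sum of two independent dice, obtained by convolution.
sum_dist = Counter()
for a, b in product(die, die):
    sum_dist[a + b] += die[a] * die[b]

print(entropy(die))       # ≈ 2.585 bits
print(entropy(sum_dist))  # ≈ 3.274 bits: more than one die, less than two dice jointly (≈ 5.170)
```

The sum is more "spread out" than a single die but far from uniform, so its entropy lies strictly between H(X) and H(X) + H(Y); quantitative versions of such comparisons, in the continuous setting, are exactly the kind of inequalities (e.g. of entropy-power type) discussed in the talk.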