Shannon's differential entropy asymptotic analysis in a Bayesian problem

Mark Kelbert, Pavel Mozgunov


We consider the Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. Three particular cases are studied: $x$ is a proportion of $n$; $x \sim n^\beta$, where $0<\beta<1$; either $x$ or $n-x$ is a constant. It is shown that in the first and second cases, after an appropriate normalization, the limiting distribution is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly.
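The convergence described in the first case can be illustrated numerically. As a minimal sketch (not the authors' derivation), assume a uniform prior, so that the posterior after $x$ successes in $n$ trials is Beta$(x+1,\, n-x+1)$; the closed-form Beta differential entropy then lets us check that the entropy of the standardized posterior approaches $\tfrac12\log(2\pi e)$, the entropy of a standard Gaussian, as $n$ grows with $x$ a fixed proportion of $n$:

```python
import numpy as np
from scipy.special import betaln, digamma

def beta_diff_entropy(a, b):
    """Closed-form differential entropy of a Beta(a, b) random variable."""
    return (betaln(a, b)
            - (a - 1) * digamma(a)
            - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))

GAUSS_H = 0.5 * np.log(2 * np.pi * np.e)  # entropy of N(0, 1), ~1.4189

for n in (10, 100, 10_000):
    x = n // 2                      # first case: x is a fixed proportion of n
    a, b = x + 1, n - x + 1         # posterior parameters under the assumed uniform prior
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    # standardizing X -> (X - mu)/sigma changes entropy by -log(sigma)
    h_std = beta_diff_entropy(a, b) - 0.5 * np.log(var)
    print(f"n={n:6d}  standardized entropy={h_std:.5f}  (Gaussian: {GAUSS_H:.5f})")
```

The printed values approach the Gaussian constant from below, consistent with the normalized Gaussian limit stated in the abstract; the proportion $x = n/2$ is an illustrative choice, not one fixed by the paper.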


differential entropy, Bayes' formula, Gaussian limit theorem



ISSN: 1331-0623 (Print), 1848-8013 (Online)