Topic: Large deviations, in particular: Sanov's theorem

Let $\Sigma$ be a Polish space and $\textbf{M}_1(\Sigma)$ be the space of probability measures on $\Sigma$. Prove that
$$\left\lVert\nu-\mu\right\rVert_{var}^2\leq 2\textbf{H}(\nu|\mu), \ \ \ \ \mu,\nu\in\textbf{M}_1(\Sigma).\tag{*}$$
A proof of (*) can be based on the observation that
$$3(x-1)^2\leq(4+2x)(x\log x-x+1),\ \ \ \ x\in[0,\infty),$$
the fact that $\left\lVert\nu-\mu\right\rVert_{var}=\left\lVert f-1\right\rVert_{L^1(\mu)}$ if $\nu\ll\mu$ and $f=\frac{d\nu}{d\mu}$, and Schwarz's inequality.
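
For reference, here is a minimal sketch of how these three ingredients fit together (this is not the accepted answer below, just the standard argument the hint points to). Assume $\nu\ll\mu$ with $f=\frac{d\nu}{d\mu}$, since otherwise $\textbf{H}(\nu|\mu)=\infty$ and (*) is trivial. Taking $x=f$ in the pointwise inequality and applying Schwarz's inequality,
$$\left\lVert\nu-\mu\right\rVert_{var}=\int_\Sigma|f-1|\ d\mu\leq\bigg(\int_\Sigma\frac{1}{3}(4+2f)\ d\mu\bigg)^{1/2}\bigg(\int_\Sigma(f\log f-f+1)\ d\mu\bigg)^{1/2}.$$
Since $\int_\Sigma f\ d\mu=1$, the first factor equals $\sqrt{(4+2)/3}=\sqrt{2}$ and the second integral equals $\textbf{H}(\nu|\mu)-1+1=\textbf{H}(\nu|\mu)$, so squaring gives (*).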


notes:
$\textbf{H}(\nu|\mu)=\begin{cases} \int_\Sigma f\log f\ d\mu & \text{if } \nu\ll\mu \text{ and } f=\frac{d\nu}{d\mu}\\ \infty & \text{otherwise} \end{cases}$
side note: $\int_\Sigma f\log f\ d\mu=\int_\Sigma \log f\ d\nu$ (since $d\nu=f\,d\mu$)

$\left\lVert\alpha\right\rVert_{var}=\sup\bigg\{\int\phi\ d\alpha:\phi\in C_b(\Sigma;\mathbb{R})\ \text{with}\ \left\lVert\phi\right\rVert_{C_b}\leq 1\bigg\}$
is the (total) variation norm (this is the definition used in the large deviations book by Jean-Dominique Deuschel and Daniel W. Stroock).
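
As a quick numerical sanity check (my own illustration, not part of the original question or of the Deuschel–Stroock text), one can test both the elementary inequality and (*) on a finite sample space, where the variation norm above reduces to $\sum_i|\nu_i-\mu_i|$ and $\textbf{H}(\nu|\mu)=\sum_i\nu_i\log(\nu_i/\mu_i)$:

```python
import numpy as np

# Illustration only: verify 3(x-1)^2 <= (4+2x)(x log x - x + 1) on a grid,
# and the bound ||nu - mu||_var^2 <= 2 H(nu|mu) on random finite distributions.

rng = np.random.default_rng(0)

# elementary pointwise inequality on x > 0
x = np.linspace(1e-9, 50.0, 100_000)
assert np.all(3 * (x - 1) ** 2 <= (4 + 2 * x) * (x * np.log(x) - x + 1) + 1e-9)

# (*) on strictly positive distributions, so nu << mu holds automatically
for _ in range(1_000):
    mu = rng.random(6) + 1e-3
    nu = rng.random(6) + 1e-3
    mu /= mu.sum()
    nu /= nu.sum()

    var_norm = np.abs(nu - mu).sum()               # total variation norm (Deuschel-Stroock convention)
    rel_ent = float(np.sum(nu * np.log(nu / mu)))  # relative entropy H(nu|mu)

    assert var_norm ** 2 <= 2 * rel_ent + 1e-12

print("both inequalities verified on all test cases")
```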

  • What is H?

  • H is the relative entropy

  • Can you add the definition of relative entropy as well? Different textbooks use slightly different definitions. Does the "var" norm mean the total variation norm?

  • I added the definitions to the question. By the way, I need the answer in about 15 hours from now; I accidentally set extra time.

  • Mathe Mathe: Never mind, I found the solution.

Answer


Mathe Mathe
  • Damn, it is perfect (unless I missed something). Thank you very much.

The answer is accepted.