## Estimating the AM-GM Inequality

Let $I\subset (0,\infty)$ be an interval, and let $f: I\rightarrow (0,\infty)$ be twice-differentiable. For a parameter $\alpha\in\mathbb{R}$, consider the function

$\displaystyle\varphi(x):=f(x)\cdot x^{-\frac{\alpha\log x}{2}}$

We want to determine the values of $\alpha$ for which $\varphi$ is multiplicatively convex, i.e. $\varphi(x^{\lambda}y^{1-\lambda})\leq\varphi(x)^{\lambda}\varphi(y)^{1-\lambda}$ for all $x,y\in I$ and all $\lambda\in[0,1]$. I leave it as an exercise to the reader to verify that $\varphi$ is multiplicatively convex if and only if $\log\varphi(e^{x})$ is convex on $\log I$. Define $\Phi:\log I\rightarrow\mathbb{R}$ by

$\displaystyle\Phi(x):=\log\varphi(e^{x})=\log f(e^{x})+\log\left[(e^{x})^{-\frac{\alpha x}{2}}\right]=\log f(e^{x})-\dfrac{\alpha x^{2}}{2}$

Since $\Phi$ is twice-differentiable, convexity of $\Phi$ is equivalent to $\Phi''\geq 0$, so $\varphi$ is multiplicatively convex if and only if

$\begin{array}{lcl}\displaystyle\dfrac{d^{2}}{dx^{2}}\left[\log f(e^{x})-\dfrac{\alpha x^{2}}{2}\right]&=&\displaystyle\dfrac{d}{dx}\left[\dfrac{f'(e^{x})}{f(e^{x})}e^{x}-\alpha x\right]\\[.7 em]&=&\displaystyle\dfrac{f(e^{x})\left[f''(e^{x})e^{2x}+f'(e^{x})e^{x}\right]-\left(f'(e^{x})\right)^{2}e^{2x}}{\left(f(e^{x})\right)^{2}}-\alpha\geq0\end{array}$

Equivalently, $\Phi$ is convex if and only if

$\begin{array}{lcl}\displaystyle\alpha(f)&:=&\displaystyle\inf_{x\in\log I}\dfrac{d^{2}}{dx^{2}}\left[\log f(e^{x})\right]\\[.9 em]&=&\displaystyle\inf_{x\in I}\dfrac{x^{2}\left[f(x)f''(x)-(f'(x))^{2}\right]+xf(x)f'(x)}{f(x)^{2}}\\&\geq&\displaystyle\alpha\end{array}$
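As a numerical sanity sketch (assuming, purely for illustration, the test function $f(x)=1+x$ on $I=[1,2]$, where the quotient above works out to $x/(1+x)^{2}$), we can compare the closed-form expression for the infimand against a finite-difference second derivative of $\log f(e^{t})$ and approximate $\alpha(f)$ on a grid:

```python
import math

# Assumed test case: f(x) = 1 + x on I = [1, 2]; then
# x^2 (f f'' - f'^2) + x f f' = -x^2 + x(1+x) = x, so the
# quotient is x/(1+x)^2, which is decreasing on [1, 2].
f, fp, fpp = (lambda x: 1.0 + x), (lambda x: 1.0), (lambda x: 0.0)

def quotient(x):
    """Closed-form infimand from the display above, evaluated at x in I."""
    return (x**2 * (f(x)*fpp(x) - fp(x)**2) + x*f(x)*fp(x)) / f(x)**2

def second_diff(t, h=1e-4):
    """Central-difference approximation of d^2/dt^2 log f(e^t)."""
    g = lambda s: math.log(f(math.exp(s)))
    return (g(t + h) - 2*g(t) + g(t - h)) / h**2

# the two expressions for d^2/dt^2 log f(e^t) agree at t = log x
for x in (1.0, 1.3, 1.7, 2.0):
    assert abs(quotient(x) - second_diff(math.log(x))) < 1e-6

alpha_f = min(quotient(1.0 + k / 1000.0) for k in range(1001))
print(alpha_f)   # infimum attained at x = 2: alpha(f) = 2/9 ≈ 0.2222
```

Since $x/(1+x)^{2}$ is decreasing on $[1,2]$, the grid minimum sits at the right endpoint, giving $\alpha(f)=2/9$ for this test function.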

Define $\beta(f):=\sup_{x\in\log I}\dfrac{d^{2}}{dx^{2}}[\log f(e^{x})]$. For the choice $\alpha=\beta(f)$, we have $\Phi''\leq 0$ on $\log I$, and therefore $\Phi$ is concave. Applying Jensen's inequality to $\Phi$ at the points $\log x_{1},\ldots,\log x_{n}$, we obtain the inequalities

$\begin{array}{lcl}\displaystyle\log f\left(\exp\left[\dfrac{1}{n}\sum_{k=1}^{n}\log x_{k}\right]\right)-\dfrac{\alpha(f)}{2n^{2}}\left(\sum_{k=1}^{n}\log x_{k}\right)^{2}&\leq&\displaystyle\dfrac{1}{n}\sum_{k=1}^{n}\log f\left(\exp(\log x_{k})\right)\\&-&\displaystyle\dfrac{\alpha(f)}{2n}\sum_{k=1}^{n}(\log x_{k})^{2},\end{array}$

which implies that

$\begin{array}{lcl}\displaystyle\dfrac{\left(\prod_{k=1}^{n}f(x_{k})\right)^{\frac{1}{n}}}{f\left(\prod_{k=1}^{n}x_{k}^{\frac{1}{n}}\right)}&\geq&\displaystyle\exp\left(\dfrac{\alpha(f)}{2n^{2}}\left[n\sum_{k=1}^{n}(\log x_{k})^{2}-\left(\sum_{k=1}^{n}\log x_{k}\right)^{2}\right]\right)\\&=&\displaystyle\exp\left(\dfrac{\alpha(f)}{2n^{2}}\sum_{1\leq j<k\leq n}\left(\log\dfrac{x_{j}}{x_{k}}\right)^{2}\right)\end{array}$

and

$\begin{array}{lcl}\displaystyle\log f\left(\exp\left[\dfrac{1}{n}\sum_{k=1}^{n}\log x_{k}\right]\right)-\dfrac{\beta(f)}{2n^{2}}\left(\sum_{k=1}^{n}\log x_{k}\right)^{2}&\geq&\displaystyle\dfrac{1}{n}\sum_{k=1}^{n}\log f\left(\exp(\log x_{k})\right)\\&-&\displaystyle\dfrac{\beta(f)}{2n}\sum_{k=1}^{n}(\log x_{k})^{2},\end{array}$

which implies that

$\begin{array}{lcl}\displaystyle\dfrac{\left(\prod_{k=1}^{n}f(x_{k})\right)^{\frac{1}{n}}}{f\left(\prod_{k=1}^{n}x_{k}^{\frac{1}{n}}\right)}&\leq&\displaystyle\exp\left(\dfrac{\beta(f)}{2n^{2}}\left[n\sum_{k=1}^{n}(\log x_{k})^{2}-\left(\sum_{k=1}^{n}\log x_{k}\right)^{2}\right]\right)\\&=&\displaystyle\exp\left(\dfrac{\beta(f)}{2n^{2}}\sum_{1\leq j<k\leq n}\left(\log\dfrac{x_{j}}{x_{k}}\right)^{2}\right)\end{array}$
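Both bounds can be checked numerically. As an assumed illustration take $f(x)=1+x$ on $I=[1,2]$; the infimand is then $x/(1+x)^{2}$, decreasing on $[1,2]$, so $\alpha(f)=2/9$ and $\beta(f)=1/4$:

```python
import math, random

# Assumed test case: f(x) = 1 + x on I = [1, 2], for which the
# infimand x/(1+x)^2 is decreasing, giving alpha(f) = 2/9 (at x = 2)
# and beta(f) = 1/4 (at x = 1).
f = lambda x: 1.0 + x
alpha_f, beta_f = 2.0 / 9.0, 1.0 / 4.0

random.seed(0)
n = 5
xs = [random.uniform(1.0, 2.0) for _ in range(n)]

gm = math.exp(sum(math.log(x) for x in xs) / n)           # geometric mean of the x_k
ratio = math.prod(f(x) for x in xs) ** (1.0/n) / f(gm)    # left-hand side of both bounds
spread = sum((math.log(xs[j] / xs[k]))**2
             for j in range(n) for k in range(j + 1, n))  # sum over 1 <= j < k <= n

lower = math.exp(alpha_f / (2 * n**2) * spread)
upper = math.exp(beta_f / (2 * n**2) * spread)
assert lower <= ratio <= upper
print(lower, ratio, upper)
```

Note that the lower bound exceeds $1$ whenever the $x_{k}$ are not all equal, so the check also confirms that $\left(\prod f(x_{k})\right)^{1/n}$ strictly exceeds $f$ of the geometric mean for this $f$.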

In particular, if $f$ is the restriction of $x\mapsto e^{x}$ to the interval $[A,B]$, where $0<A<B<\infty$, then $\alpha(f)=A$ and $\beta(f)=B$.
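A quick check of this example: plugging $f=f'=f''=\exp$ into the infimand collapses it to $x$, so the infimum and supremum over $[A,B]$ are just the endpoints.

```python
import math

# For f(x) = e^x: x^2 (f f'' - f'^2) + x f f' = x e^{2x}, so the
# infimand (d^2/dt^2 log f(e^t), evaluated at x = e^t) is exactly x.
f = fp = fpp = math.exp

def quotient(x):
    return (x**2 * (f(x)*fpp(x) - fp(x)**2) + x*f(x)*fp(x)) / f(x)**2

for x in (1.0, 1.5, 2.0, 2.5, 3.0):
    assert abs(quotient(x) - x) < 1e-9
print("quotient(x) == x, so alpha(f) = A and beta(f) = B on [A, B]")
```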

We can use the preceding estimate of the AM-GM inequality to obtain an inequality for positive, bounded random variables. This inequality sheds light on the probabilistic content of the AM-GM inequality. Let $A$ and $B$ be as above. If $X$ is a discrete random variable with distribution $\mathbb{P}(X=x_{j})=\frac{1}{n}$ for $1\leq j\leq n$, where $x_{1},\ldots,x_{n}\in[A,B]$, then $\mathbb{E}[\log X]=\frac{1}{n}\sum_{j=1}^{n}\log x_{j}$ and

$\displaystyle\text{Var}(\log X)=\dfrac{1}{n}\sum_{j=1}^{n}(\log x_{j})^{2}-\dfrac{1}{n^{2}}\left(\sum_{j=1}^{n}\log x_{j}\right)^{2}=\dfrac{1}{n^{2}}\sum_{1\leq j<k\leq n}\left(\log\dfrac{x_{j}}{x_{k}}\right)^{2}$

So, taking logarithms of both sides of the two inequalities above with $f(x)=e^{x}$, we can restate them in the language of probability theory as

$\displaystyle \dfrac{A}{2}\text{Var}(\log X)\leq\mathbb{E}[X]-e^{\mathbb{E}[\log X]}\leq\dfrac{B}{2}\text{Var}(\log X)$
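A numerical spot check of this inequality, with an assumed range $[A,B]=[1,3]$ and $X$ uniform on $n$ random points in that range:

```python
import math, random

# Spot check: X uniform on n random points in an assumed range [A, B] = [1, 3].
random.seed(1)
A, B, n = 1.0, 3.0, 8
xs = [random.uniform(A, B) for _ in range(n)]

mean_X    = sum(xs) / n                        # E[X]
mean_logX = sum(math.log(x) for x in xs) / n   # E[log X]
var_logX  = sum((math.log(x) - mean_logX)**2 for x in xs) / n

gap = mean_X - math.exp(mean_logX)             # AM minus GM of the x_j
assert A / 2 * var_logX <= gap <= B / 2 * var_logX
print(gap, A / 2 * var_logX, B / 2 * var_logX)
```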

Naturally, we ask whether the inequality holds for random variables taking values in $[A,B]$ that are not necessarily discrete. The answer is yes, as we now show.

Proposition. Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space, and let $X:\Omega\rightarrow\mathbb{R}$ be a random variable taking values in $[A,B]$, where $0<A<B<\infty$, that is not almost surely constant (so that $\text{Var}(\log X)>0$). Then

$\displaystyle \dfrac{A}{2}\leq\dfrac{\mathbb{E}[X]-e^{\mathbb{E}[\log X]}}{\text{Var}(\log X)}\leq\dfrac{B}{2}$

Proof. Consider the simple function $X_{n}=\sum_{k=1}^{n}\left[A+(k-1)\frac{B-A}{n}\right]\mathbf{1}_{E_{k}}$, where

$\displaystyle E_{k}:=\left\{\omega\in\Omega:A+(k-1)\dfrac{B-A}{n}\leq X(\omega)<A+k\dfrac{B-A}{n}\right\}$

for all $1\leq k\leq n$ (with the convention that $E_{n}$ also includes the event $\{X=B\}$). If we can show that $X_{n}\stackrel{L^{1}}\rightarrow X$ and $(\log X_{n})^{2}\stackrel{L^{1}}\rightarrow (\log X)^{2}$, then the desired inequality follows from our work above and a limiting argument; note that the discrete inequality holds for arbitrary weights $\mathbb{P}(E_{k})$, not just uniform ones, since Jensen's inequality does. Now, $X_{n}\rightarrow X$ a.s., and by continuity of $\log$, $\log X_{n}\rightarrow \log X$ a.s. Moreover, $X_{n}\leq X$ and $(\log X_{n})^{2}\leq \max\left\{(\log A)^{2},(\log B)^{2}\right\}$, so the desired convergence results follow from the dominated convergence theorem. $\Box$
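To illustrate the Proposition for a non-discrete $X$, here is a Monte Carlo sketch with an assumed example: $X=e^{U}$ with $U$ uniform on $[\log A,\log B]$, so $X$ takes values in $[A,B]=[1,4]$.

```python
import math, random

# Monte Carlo sketch of the Proposition for a non-discrete X.
# Assumed example: X = e^U with U uniform on [log A, log B] = [0, log 4],
# so X takes values in [A, B] = [1, 4].
random.seed(2)
A, B, N = 1.0, 4.0, 100_000
xs = [math.exp(random.uniform(math.log(A), math.log(B))) for _ in range(N)]

mean_X    = sum(xs) / N                        # estimates E[X]
mean_logX = sum(math.log(x) for x in xs) / N   # estimates E[log X]
var_logX  = sum((math.log(x) - mean_logX)**2 for x in xs) / N

ratio = (mean_X - math.exp(mean_logX)) / var_logX
assert A / 2 <= ratio <= B / 2
print(ratio)   # lands in [A/2, B/2] = [0.5, 2.0]
```

Because the empirical distribution of the samples is itself a discrete random variable of the earlier uniform form, the assertion holds exactly for the sampled quantities, not only in the large-$N$ limit.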