Weighted Power Means

Let x,\alpha\in\mathbb{R}^{n} have positive components and satisfy \sum_{k=1}^{n}\alpha_{k}=1. For t\neq 0, we define the weighted power mean of order t by

\displaystyle M_{t}(x;\alpha):=\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)^{\frac{1}{t}}
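
In code, the definition translates directly. The following Python sketch (the example vector and weights are arbitrary illustrations, not from the text) computes M_t for t\neq 0; note that t=1 and t=-1 recover the weighted arithmetic and harmonic means:

```python
def power_mean(x, alpha, t):
    """Weighted power mean M_t(x; alpha): positive x, weights summing to 1, t != 0."""
    return sum(a * xk ** t for a, xk in zip(alpha, x)) ** (1 / t)

# Arbitrary example data for illustration.
x, alpha = [1.0, 4.0], [0.5, 0.5]
print(power_mean(x, alpha, 1))   # weighted arithmetic mean: 2.5
print(power_mean(x, alpha, -1))  # weighted harmonic mean: 1.6
```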

I claim that

\displaystyle\lim_{t\rightarrow 0^{+}}M_{t}(x;\alpha)=\prod_{k=1}^{n}x_{k}^{\alpha_{k}}

Proof. Observe that

\displaystyle t\sum_{k=1}^{n}\alpha_{k}\log x_{k}=\sum_{k=1}^{n}\alpha_{k}\log\left(x_{k}^{t}\right)\leq\log\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right),

by the concavity of \log. Hence,

\displaystyle\exp\left(t\sum_{k=1}^{n}\alpha_{k}\log x_{k}\right)\leq\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}

Raising both sides to the power \frac{1}{t} (which preserves the inequality, since t>0), we obtain

\displaystyle\prod_{k=1}^{n}x_{k}^{\alpha_{k}}=\exp\left(\sum_{k=1}^{n}\alpha_{k}\log x_{k}\right)\leq\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)^{\frac{1}{t}}

Hence, \liminf_{t\rightarrow 0^{+}}M_{t}(x;\alpha)\geq\prod_{k=1}^{n}x_{k}^{\alpha_{k}}. This establishes a convenient lower bound. To prove the desired equality, we use L’Hôpital’s rule (applicable here, since the numerator below tends to \log\sum_{k=1}^{n}\alpha_{k}=\log 1=0 as t\rightarrow 0^{+}):

\begin{array}{lcl}\displaystyle\lim_{t\rightarrow 0^{+}}\log\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)^{\frac{1}{t}}=\lim_{t\rightarrow 0^{+}}\dfrac{\log\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)}{t}&=&\displaystyle\lim_{t\rightarrow 0^{+}}\dfrac{\sum_{k=1}^{n}\alpha_{k}\log(x_{k})x_{k}^{t}}{\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}}\\[.7 em]&=&\displaystyle\dfrac{\sum_{k=1}^{n}\alpha_{k}\log(x_{k})}{\sum_{k=1}^{n}\alpha_{k}}\\[.7 em]&=&\displaystyle\sum_{k=1}^{n}\alpha_{k}\log(x_{k})\end{array}

By the continuity of the exponential, we obtain that

\displaystyle\lim_{t\rightarrow 0^{+}}\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)^{\frac{1}{t}}=\exp\left(\sum_{k=1}^{n}\alpha_{k}\log(x_{k})\right)=\prod_{k=1}^{n}x_{k}^{\alpha_{k}}

\Box
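
As a numerical sanity check of this limit (not part of the proof; the data below are arbitrary, and the formulas are translated into Python in the obvious way), M_t approaches the weighted geometric mean from above as t decreases to 0:

```python
import math

x, alpha = [2.0, 3.0, 5.0], [0.2, 0.3, 0.5]
geometric = math.prod(xk ** a for xk, a in zip(x, alpha))

def M(t):
    # Weighted power mean of order t != 0.
    return sum(a * xk ** t for a, xk in zip(alpha, x)) ** (1 / t)

# The lower bound from the proof holds for every t > 0, and the gap shrinks with t.
for t in [1.0, 0.1, 1e-6]:
    assert M(t) >= geometric
    print(M(t) - geometric)
```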

Observe that for t>0,

\begin{array}{lcl} \displaystyle M_{-t}(x;\alpha)=\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{-t}\right)^{-\frac{1}{t}}=\left(\sum_{k=1}^{n}\alpha_{k}(x_{k}^{-1})^{t}\right)^{-\frac{1}{t}}&=&\displaystyle M_{t}(x^{-1};\alpha)^{-1}\\[.7 em]&\leq&\displaystyle\left(\prod_{k=1}^{n}x_{k}^{-\alpha_{k}}\right)^{-1}\\[.7 em]&=&\displaystyle\prod_{k=1}^{n}x_{k}^{\alpha_{k}}\\[.7 em]&=&\displaystyle M_{0}(x;\alpha)\end{array}

For all s\leq t, M_{s}(x;\alpha)\leq M_{t}(x;\alpha).

Proof. First, assume that s\leq t with s,t\neq 0 and t>0; then \frac{t}{s}\geq 1 or \frac{t}{s}<0, so the function x\mapsto x^{\frac{t}{s}} is convex on (0,\infty), and Jensen’s inequality gives

\displaystyle\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{s}\right)^{\frac{t}{s}}\leq\sum_{k=1}^{n}\alpha_{k}(x_{k}^{s})^{\frac{t}{s}}=\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\Longrightarrow \left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{s}\right)^{\frac{1}{s}}\leq\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)^{\frac{1}{t}},

where the implication follows by raising both sides to the power \frac{1}{t}>0. If s\leq t<0, then 0<-t\leq-s, so the case just proven gives M_{-t}(x^{-1};\alpha)\leq M_{-s}(x^{-1};\alpha); since M_{r}(x;\alpha)=M_{-r}(x^{-1};\alpha)^{-1} for r\neq 0, inverting yields M_{s}(x;\alpha)\leq M_{t}(x;\alpha). Finally, the cases s=0<t and s<0=t were proven above, so we’re done. \Box
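
The monotonicity can be sanity-checked numerically (an illustration only, with arbitrarily chosen data; the t=0 case uses the weighted geometric mean established above):

```python
import math

x, alpha = [1.0, 2.0, 7.0], [0.25, 0.25, 0.5]

def M(t):
    # t = 0 is the limiting case: the weighted geometric mean.
    if t == 0:
        return math.prod(xk ** a for xk, a in zip(x, alpha))
    return sum(a * xk ** t for a, xk in zip(alpha, x)) ** (1 / t)

ts = [-3, -1, -0.5, 0, 0.5, 1, 2, 5]
values = [M(t) for t in ts]
# M_s <= M_t whenever s <= t (small tolerance for floating point).
assert all(u <= v + 1e-12 for u, v in zip(values, values[1:]))
print(values)
```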

Other properties of the weighted power means include that the function t\mapsto t\log M_{t}(x;\alpha) is convex on \mathbb{R}. Since t\mapsto t\log M_{t}(x;\alpha) is continuous, it suffices to verify midpoint convexity (a continuous, midpoint-convex function is convex), i.e., that

\displaystyle (t+h)\log M_{t+h}(x;\alpha)+(t-h)\log M_{t-h}(x;\alpha)-2t\log M_{t}(x;\alpha)\geq 0

for all t\in\mathbb{R} and h>0. Observe that this expression is equal to

\begin{array}{lcl}&=&\displaystyle\log\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t+h}\right)+\log\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t-h}\right)-2\log\left(\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}\right)\\[1.1 em]&=&\displaystyle\log\left(\sum_{k=1}^{n}\sum_{j=1}^{n}\alpha_{k}\alpha_{j}x_{k}^{t+h}x_{j}^{t-h}\right)-\log\left(\sum_{k=1}^{n}\sum_{j=1}^{n}\alpha_{k}\alpha_{j}(x_{k}x_{j})^{t}\right)\\[1.1 em]&=&\displaystyle\log\left(\dfrac{\sum_{k=1}^{n}\sum_{j=1}^{n}\alpha_{k}\alpha_{j}(x_{k}x_{j})^{t}(x_{k}x_{j}^{-1})^{h}}{\sum_{k=1}^{n}\sum_{j=1}^{n}\alpha_{k}\alpha_{j}(x_{k}x_{j})^{t}}\right)\end{array}

Since

\displaystyle\left(\dfrac{x_{j}}{x_{k}}\right)^{h}+\left(\dfrac{x_{k}}{x_{j}}\right)^{h}=\dfrac{x_{j}^{2h}+x_{k}^{2h}}{(x_{k}x_{j})^{h}}\geq \dfrac{2x_{j}^{h}x_{k}^{h}}{(x_{j}x_{k})^{h}}=2,

pairing the (k,j) and (j,k) terms shows that the numerator below is at least the denominator, and we conclude that

\displaystyle\log\left(\dfrac{\sum_{k=1}^{n}\sum_{j=1}^{n}\alpha_{k}\alpha_{j}(x_{k}x_{j})^{t}(x_{k}x_{j}^{-1})^{h}}{\sum_{k=1}^{n}\sum_{j=1}^{n}\alpha_{k}\alpha_{j}(x_{k}x_{j})^{t}}\right)\geq \log(1)=0
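
Since t\log M_{t}(x;\alpha)=\log\sum_{k=1}^{n}\alpha_{k}x_{k}^{t}, the midpoint-convexity inequality is easy to probe numerically; the following sketch (with arbitrary example data) checks it over a small grid:

```python
import math

x, alpha = [1.0, 3.0, 9.0], [0.5, 0.3, 0.2]

def f(t):
    # f(t) = t * log M_t(x; alpha) = log(sum_k alpha_k * x_k^t)
    return math.log(sum(a * xk ** t for a, xk in zip(alpha, x)))

# Midpoint convexity: f(t + h) + f(t - h) - 2 f(t) >= 0.
for t in [-2.0, 0.0, 1.5]:
    for h in [0.1, 1.0, 3.0]:
        assert f(t + h) + f(t - h) - 2 * f(t) >= -1e-12
print("midpoint convexity holds on the grid")
```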

So far, we have not said anything about the limiting behavior of M_{t}(x;\alpha) at t=\pm\infty. We shall redress this omission now. Define M_{-\infty}(x;\alpha):=\inf_{1\leq k\leq n}x_{k} and M_{\infty}(x;\alpha):=\sup_{1\leq k\leq n}x_{k}. I claim that

\displaystyle M_{\infty}(x;\alpha)=\lim_{t\rightarrow\infty}M_{t}(x;\alpha),\qquad M_{-\infty}(x;\alpha)=\lim_{t\rightarrow-\infty}M_{t}(x;\alpha)

Proof. Let j,j' be integers in \left\{1,\cdots,n\right\} such that x_{j}=M_{\infty}(x;\alpha) and x_{j'}=M_{-\infty}(x;\alpha). Then for t>0,

\displaystyle\alpha_{j}^{\frac{1}{t}}M_{\infty}(x;\alpha)=\left(\alpha_{j}x_{j}^{t}\right)^{\frac{1}{t}}\leq M_{t}(x;\alpha)\leq\left(\sum_{k=1}^{n}\alpha_{k}M_{\infty}(x;\alpha)^{t}\right)^{\frac{1}{t}}=M_{\infty}(x;\alpha)

and for t<0 (note that raising to the power \frac{1}{t}<0 reverses inequalities),

\displaystyle M_{-\infty}(x;\alpha)=\left(\sum_{k=1}^{n}\alpha_{k}M_{-\infty}(x;\alpha)^{t}\right)^{\frac{1}{t}}\leq M_{t}(x;\alpha)\leq\left(\alpha_{j'}x_{j'}^{t}\right)^{\frac{1}{t}}=\alpha_{j'}^{\frac{1}{t}}M_{-\infty}(x;\alpha)

Since \alpha_{j}^{\frac{1}{t}}\rightarrow 1 as t\rightarrow\infty and \alpha_{j'}^{\frac{1}{t}}\rightarrow 1 as t\rightarrow-\infty, an application of the squeeze theorem completes the proof. \Box
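
Numerically (again an illustration with arbitrary data), the convergence to the extremes is visible already at moderate |t|; very large |t| would overflow floating point:

```python
x, alpha = [2.0, 3.0, 10.0], [0.6, 0.3, 0.1]

def M(t):
    # Weighted power mean of order t != 0.
    return sum(a * xk ** t for a, xk in zip(alpha, x)) ** (1 / t)

print(M(200))    # close to max(x) = 10
print(M(-200))   # close to min(x) = 2
```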

This entry was posted in math.CA.
