Putnam Calculus Problem: Sine and Cosine

Supposedly, the following is a former Putnam Exam problem, which I came across on Math.StackExchange.

Suppose f,g: \mathbb{R}\rightarrow\mathbb{R} are nonconstant, differentiable functions satisfying

\displaystyle f(x+y)=f(x)f(y)-g(x)g(y),\indent g(x+y)=f(x)g(y)+f(y)g(x)

for all x,y\in\mathbb{R}, and f'(0)=0. Prove that

\displaystyle f(x)^{2}+g(x)^{2}=1
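Before diving into the proof, a quick numerical sanity check (not part of the argument): the model pair f = cos, g = sin satisfies both addition formulas and f'(0)=0, and the claimed conclusion visibly holds for it.

```python
import math

# Sanity check, not part of the proof: f = cos, g = sin satisfy the
# addition formulas and f'(0) = -sin(0) = 0, and indeed f^2 + g^2 = 1.
f, g = math.cos, math.sin

for x, y in [(0.3, -1.2), (2.0, 0.7), (-0.5, 0.5)]:
    assert abs(f(x + y) - (f(x) * f(y) - g(x) * g(y))) < 1e-12
    assert abs(g(x + y) - (f(x) * g(y) + f(y) * g(x))) < 1e-12
    assert abs(f(x) ** 2 + g(x) ** 2 - 1) < 1e-12
```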

Here is my attempt at a solution. I believe it’s correct; otherwise, I would not be posting it. If it’s not, please drop me a comment highlighting any errors.

Proof. Observe that

\begin{array}{lcl}\displaystyle f(x)^{2}+g(x)^{2}&=&\displaystyle\left(f\left(\frac{x}{2}\right)^{2}-g\left(\frac{x}{2}\right)^{2}\right)^{2}+\left(2f\left(\frac{x}{2}\right)g\left(\frac{x}{2}\right)\right)^{2}\\[.7em]&=&\displaystyle f\left(\frac{x}{2}\right)^{4}+2f\left(\frac{x}{2}\right)^{2}g\left(\frac{x}{2}\right)^{2}+g\left(\frac{x}{2}\right)^{4}\\[.7em]&=&\displaystyle\left(f\left(\frac{x}{2}\right)^{2}+g\left(\frac{x}{2}\right)^{2}\right)^{2}\end{array}
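Stripped of the half-angle arguments, the computation above is just the polynomial identity (a^2-b^2)^2+(2ab)^2=(a^2+b^2)^2, which a quick numerical check confirms at random sample points:

```python
import random

# The algebraic core of the display above:
# (a^2 - b^2)^2 + (2ab)^2 = (a^2 + b^2)^2 for all real a, b.
random.seed(0)
for _ in range(100):
    a, b = random.uniform(-5, 5), random.uniform(-5, 5)
    lhs = (a**2 - b**2) ** 2 + (2 * a * b) ** 2
    rhs = (a**2 + b**2) ** 2
    assert abs(lhs - rhs) < 1e-9 * max(1.0, rhs)
```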

By induction, we see that

\displaystyle\left(f(x)^{2}+g(x)^{2}\right)^{2^{-n}}=f\left(\frac{x}{2^{n}}\right)^{2}+g\left(\frac{x}{2^{n}}\right)^{2}

Note that f(x)^{2}+g(x)^{2}>0 for every x: if f(x_{0})=g(x_{0})=0 for some x_{0}, then the addition formulas would give f(x_{0}+y)=g(x_{0}+y)=0 for all y, contradicting the assumption that f and g are nonconstant. Letting n\rightarrow\infty and using the continuity of f and g, we see that

\displaystyle1=\lim_{n\rightarrow\infty}\left(f(x)^{2}+g(x)^{2}\right)^{2^{-n}}=\lim_{n\rightarrow\infty}\left[f\left(\frac{x}{2^{n}}\right)^{2}+g\left(\frac{x}{2^{n}}\right)^{2}\right]=f(0)^{2}+g(0)^{2}
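The limit on the left is the elementary fact that c^{2^{-n}}\rightarrow 1 for any c>0; this is exactly where the positivity of f(x)^{2}+g(x)^{2} is needed, since the limit would be 0 if c=0. A small numerical illustration:

```python
# For any c > 0, c ** (2 ** -n) -> 1 as n grows; this fails for c = 0,
# which is why f(x)^2 + g(x)^2 > 0 is needed before taking the limit.
for c in (0.37, 1.0, 42.0):
    vals = [c ** (2.0 ** -n) for n in range(30)]
    assert abs(vals[-1] - 1.0) < 1e-7
```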

I claim that g(0)=0. Suppose not. Then

\displaystyle g(0)=g(0+0)=2f(0)g(0)\Longrightarrow f(0)=\frac{1}{2}.

But f(0)=f(0+0)=f(0)^{2}-g(0)^{2}, which together with f(0)^{2}+g(0)^{2}=1 gives f(0)=2f(0)^{2}-1; substituting f(0)=\frac{1}{2} yields \frac{1}{2}=-\frac{1}{2}, a contradiction. Hence g(0)=0. Moreover,

\displaystyle f(0)=f(0+0)=f(0)^{2}-g(0)^{2}=f(0)^{2}\Longrightarrow f(0)\in\{0,1\},

and since f(0)^{2}+g(0)^{2}=1 with g(0)=0, we conclude that f(0)=1.

Using the limit definition of the derivative, we obtain the identities

\begin{array}{lcl}\displaystyle f'(x)=\lim_{h\rightarrow0}\dfrac{f(x+h)-f(x)}{h}&=&\displaystyle\lim_{h\rightarrow 0}\dfrac{f(x)f(h)-g(x)g(h)-f(x)}{h}\\[.7 em]&=&\displaystyle\lim_{h\rightarrow0}\dfrac{f(x)[f(h)-1]-g(x)g(h)}{h}\\[.7 em]&=&\displaystyle f(x)f'(0)-g(x)g'(0)\\[.7 em]&=&\displaystyle-g(x)g'(0),\end{array}

\begin{array}{lcl}\displaystyle g'(x)=\lim_{h\rightarrow0}\dfrac{g(x+h)-g(x)}{h}&=&\displaystyle\lim_{h\rightarrow0}\dfrac{f(x)g(h)+g(x)f(h)-g(x)}{h}\\[.7 em]&=&\displaystyle\lim_{h\rightarrow0}\dfrac{f(x)g(h)+g(x)[f(h)-1]}{h}\\[.7 em]&=&\displaystyle f(x)g'(0)+g(x)f'(0)\\[.7 em]&=&\displaystyle f(x)g'(0)\end{array}
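As a sanity check on these two identities (an illustration, not part of the proof), a central finite difference applied to the model pair f(x)=\cos(\alpha x), g(x)=\sin(\alpha x), for which g'(0)=\alpha, reproduces f'=-g'(0)g and g'=g'(0)f to within truncation error:

```python
import math

# Central-difference check of f'(x) = -g'(0) g(x) and g'(x) = g'(0) f(x)
# for the model pair f = cos(alpha x), g = sin(alpha x), where g'(0) = alpha.
alpha, h = 1.7, 1e-5
f = lambda x: math.cos(alpha * x)
g = lambda x: math.sin(alpha * x)

for x in (-2.0, 0.0, 0.9):
    df = (f(x + h) - f(x - h)) / (2 * h)   # numerical f'(x)
    dg = (g(x + h) - g(x - h)) / (2 * h)   # numerical g'(x)
    assert abs(df - (-alpha * g(x))) < 1e-6
    assert abs(dg - alpha * f(x)) < 1e-6
```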

Hence,

\begin{array}{lcl}\displaystyle \frac{d}{dx}\left[f(x)^{2}+g(x)^{2}\right]=2\left[f(x)f'(x)+g(x)g'(x)\right]&=&\displaystyle2\left[-f(x)g(x)g'(0)+g(x)f(x)g'(0)\right]\\&=&\displaystyle0,\end{array}

which implies that the function x\mapsto f(x)^{2}+g(x)^{2} is constant. Since f(0)^{2}+g(0)^{2}=1, we conclude that

\displaystyle f(x)^{2}+g(x)^{2}=1,\indent\forall x\in\mathbb{R}

\Box

Additionally, we can ask which nonconstant differentiable functions f,g: \mathbb{R}\rightarrow\mathbb{R} satisfy the above hypotheses. I claim that f(x) and g(x) must be, respectively, of the forms

\displaystyle f(x)=\cos(\alpha x),\indent g(x)=\sin(\alpha x),

where \alpha=g'(0)\neq0 (if g'(0)=0, the derivative formulas above would force f and g to be constant). Those formulas for f'(x) and g'(x) also show that f and g are infinitely differentiable. Since \max\left\{\left|f(x)\right|,\left|g(x)\right|\right\}\leq1, by induction, we see that

\displaystyle \max\left\{\left|g^{(n)}(x)\right|,\left|f^{(n)}(x)\right|\right\}\leq\left|g'(0)\right|^{n},\indent\forall x\in\mathbb{R},\forall n\in\mathbb{Z}^{\geq0}

I claim that f and g are analytic at x=0 with infinite radius of convergence. Fix x\in\mathbb{R}. For each n\in\mathbb{Z}^{\geq 0}, Taylor’s theorem with remainder (the Lagrange form) tells us that there exists \xi_{n} between 0 and x such that

\displaystyle f(x)=\sum_{k=0}^{n}\frac{f^{(k)}(0)}{k!}x^{k}+\frac{f^{(n+1)}(\xi_{n})}{(n+1)!}x^{n+1}

But

\displaystyle\left|f^{(n+1)}(\xi_{n})\right|\frac{\left|x\right|^{n+1}}{(n+1)!}\leq\frac{\left(\left|g'(0)\right|\left|x\right|\right)^{n+1}}{(n+1)!}\longrightarrow 0,\quad n \longrightarrow\infty

We conclude that f(x)=\sum_{k=0}^{\infty}\frac{f^{(k)}(0)}{k!}x^{k}, where the series converges uniformly on compact subsets. A completely analogous argument shows that g(x)=\sum_{k=0}^{\infty}\frac{g^{(k)}(0)}{k!}x^{k}. We can now use the power series method to find series expressions for f and g. An induction argument together with the result f(0)=1 shows that

\displaystyle f^{(n)}(0)=\begin{cases}{(-1)^{k}(g'(0))^{n}}&{n=2k,k\in\mathbb{Z}^{\geq0}} \\ {0}&{n=2k+1,k\in\mathbb{Z}^{\geq0}}\end{cases}

Similarly,

\displaystyle g^{(n)}(0)=\begin{cases}{(-1)^{k}(g'(0))^{n}}&{n=2k+1,k\in\mathbb{Z}^{\geq0}} \\ {0}&{n=2k,k\in\mathbb{Z}^{\geq0}}\end{cases}

We conclude that

\displaystyle f(x)=\sum_{k=0}^{\infty}\frac{(-1)^{k}(g'(0)x)^{2k}}{(2k)!}=\cos(g'(0)x)

and

\displaystyle g(x)=\sum_{k=0}^{\infty}\frac{(-1)^{k}(g'(0)x)^{2k+1}}{(2k+1)!}=\sin(g'(0)x),

so f and g have the claimed forms with \alpha=g'(0). \Box
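As a final check, the partial sums of these two series do converge to cos(g'(0)x) and sin(g'(0)x); here is a short numerical confirmation with an arbitrary sample value standing in for g'(0):

```python
import math

# Partial sums of the two series above, with an arbitrary sample value
# standing in for g'(0); both converge to cos(g'(0)x) and sin(g'(0)x).
gp0, x = 1.7, 0.9   # gp0 plays the role of g'(0)
t = gp0 * x
f_sum = sum((-1) ** k * t ** (2 * k) / math.factorial(2 * k) for k in range(20))
g_sum = sum((-1) ** k * t ** (2 * k + 1) / math.factorial(2 * k + 1) for k in range(20))
assert abs(f_sum - math.cos(t)) < 1e-12
assert abs(g_sum - math.sin(t)) < 1e-12
```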

This entry was posted in math.CA, Problem Solving.

2 Responses to Putnam Calculus Problem: Sine and Cosine

  1. NoOne says:

    Your statement of the problem is wrong — you have an incorrect sign in the second equation and you have omitted the condition $f'(0)=0$.
