Supposedly, the following is a problem from a past Putnam Exam, which I came across on Math.StackExchange.

Suppose $f, g : \mathbb{R} \to \mathbb{R}$ are nonconstant, differentiable functions satisfying

$$f(x+y) = f(x)f(y) - g(x)g(y)$$

and

$$g(x+y) = f(x)g(y) + g(x)f(y)$$

for all $x, y \in \mathbb{R}$, and suppose further that $f'(0) = 0$. Prove that

$$f(x)^2 + g(x)^2 = 1 \qquad \text{for all } x \in \mathbb{R}.$$
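As a quick sanity check (not part of the proof), the pair $f = \cos$, $g = \sin$ satisfies both functional equations (they are exactly the angle-addition formulas) along with $f'(0) = 0$, and indeed $f^2 + g^2 = 1$. A minimal numerical sketch, using only the standard library:

```python
import math
import random

# Candidate pair: the two hypotheses are the angle-addition
# formulas for cosine and sine.
f, g = math.cos, math.sin

random.seed(0)
for _ in range(1000):
    x = random.uniform(-10, 10)
    y = random.uniform(-10, 10)
    # first functional equation: f(x+y) = f(x)f(y) - g(x)g(y)
    assert math.isclose(f(x + y), f(x) * f(y) - g(x) * g(y), abs_tol=1e-9)
    # second functional equation: g(x+y) = f(x)g(y) + g(x)f(y)
    assert math.isclose(g(x + y), f(x) * g(y) + g(x) * f(y), abs_tol=1e-9)
    # the conclusion to be proved: f(x)^2 + g(x)^2 = 1
    assert math.isclose(f(x) ** 2 + g(x) ** 2, 1.0, abs_tol=1e-12)
print("all checks passed")
```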
Here is my attempt at a solution. I believe it is correct; otherwise, I would not be posting it. If it is not, please drop me a comment highlighting any errors.

*Proof.* Observe that

$$f(x+y)^2 + g(x+y)^2 = \bigl(f(x)f(y) - g(x)g(y)\bigr)^2 + \bigl(f(x)g(y) + g(x)f(y)\bigr)^2 = \bigl(f(x)^2 + g(x)^2\bigr)\bigl(f(y)^2 + g(y)^2\bigr).$$

Writing $h(x) = f(x)^2 + g(x)^2$, this says that $h(x+y) = h(x)h(y)$; in particular, $h(x) = h(x/2)^2 \ge 0$. By induction, we see that

$$h(x) = h\!\left(\frac{x}{2^n}\right)^{2^n} \qquad \text{for all } x \in \mathbb{R} \text{ and } n \in \mathbb{N}.$$

Letting $n \to \infty$ and using the continuity of $f$ and $g$, we see that

$$h\!\left(\frac{x}{2^n}\right) \longrightarrow h(0).$$

I claim that $h(0) \neq 0$. Suppose not. Then for each fixed $x$ we eventually have $h(x/2^n) \le \tfrac{1}{2}$, so that

$$h(x) = h\!\left(\frac{x}{2^n}\right)^{2^n} \le \left(\frac{1}{2}\right)^{2^n}$$

for all large $n$, forcing $h \equiv 0$ and hence $f \equiv g \equiv 0$, which is a contradiction since $f$ and $g$ are nonconstant. Since $h(0) = h(0+0) = h(0)^2$, it follows that $h(0) = 1$. Moreover,

$$g(0) = f(0)g(0) + g(0)f(0) = 2f(0)g(0) \qquad \text{and} \qquad f(0) = f(0)^2 - g(0)^2,$$

which together with $f(0)^2 + g(0)^2 = h(0) = 1$ force $g(0) = 0$ and $f(0) = 1$. (If $g(0) \neq 0$, the first identity gives $f(0) = \tfrac{1}{2}$, while the second gives $f(0) = \tfrac{1}{4} - \tfrac{3}{4} = -\tfrac{1}{2}$, a contradiction; thus $g(0) = 0$, and then $f(0) = f(0)^2$ with $f(0)^2 = 1$ yields $f(0) = 1$.)

Using the limit definition of the derivative, we obtain the identities

$$f'(x) = \lim_{t \to 0} \frac{f(x)f(t) - g(x)g(t) - f(x)}{t} = f(x)f'(0) - g(x)g'(0) = -g'(0)\,g(x),$$

$$g'(x) = \lim_{t \to 0} \frac{f(x)g(t) + g(x)f(t) - g(x)}{t} = f(x)g'(0) + g(x)f'(0) = g'(0)\,f(x),$$

where we have used $f(0) = 1$, $g(0) = 0$, and the hypothesis $f'(0) = 0$. Hence,

$$\bigl(f^2 + g^2\bigr)'(x) = 2f(x)f'(x) + 2g(x)g'(x) = -2g'(0)f(x)g(x) + 2g'(0)g(x)f(x) = 0,$$

which implies that the function $f^2 + g^2$ is constant. Since $f(0)^2 + g(0)^2 = 1$, we conclude that

$$f(x)^2 + g(x)^2 = 1 \qquad \text{for all } x \in \mathbb{R}. \qquad \blacksquare$$
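The two derivative identities can be illustrated numerically with a central-difference quotient. Here I take the hypothetical choice $b = g'(0) = 2$, i.e. $f(x) = \cos 2x$ and $g(x) = \sin 2x$ (any nonzero $b$ works); this is an illustration of the identities, not part of the proof:

```python
import math

b = 2.0  # assumed value of g'(0) for this illustration
f = lambda x: math.cos(b * x)
g = lambda x: math.sin(b * x)

def ddx(fn, x, h=1e-6):
    """Central-difference approximation to fn'(x)."""
    return (fn(x + h) - fn(x - h)) / (2 * h)

for x in [-2.0, -0.5, 0.0, 1.3, 4.0]:
    # f'(x) = -g'(0) g(x)  and  g'(x) = g'(0) f(x)
    assert math.isclose(ddx(f, x), -b * g(x), abs_tol=1e-6)
    assert math.isclose(ddx(g, x), b * f(x), abs_tol=1e-6)
    # hence (f^2 + g^2)' = 0, so the sum stays at its value at 0, namely 1
    assert math.isclose(f(x) ** 2 + g(x) ** 2, 1.0, abs_tol=1e-12)
print("derivative identities verified")
```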
Additionally, we can ask: what are the only nonconstant differentiable functions that satisfy the above hypotheses? I claim that $f$ and $g$ must be, respectively, of the forms

$$f(x) = \cos(bx), \qquad g(x) = \sin(bx),$$

where $b = g'(0) \neq 0$. (Indeed $b \neq 0$: if $g'(0) = 0$, the identities above would give $f' \equiv g' \equiv 0$, making $f$ and $g$ constant.) Our formulas for $f'$ and $g'$ obtained above, namely $f' = -bg$ and $g' = bf$, show that $f$ and $g$ are infinitely differentiable. Since $f'' = -b^2 f$ and $g'' = -b^2 g$, by induction, we see that

$$f^{(2n)} = (-1)^n b^{2n} f, \quad f^{(2n+1)} = (-1)^{n+1} b^{2n+1} g, \quad g^{(2n)} = (-1)^n b^{2n} g, \quad g^{(2n+1)} = (-1)^n b^{2n+1} f.$$

In particular, since $f^2 + g^2 = 1$ gives $|f| \le 1$ and $|g| \le 1$, every derivative satisfies $|f^{(n)}| \le |b|^n$ and $|g^{(n)}| \le |b|^n$.

I claim that $f$ and $g$ are analytic with infinite radius of convergence. Fix $x \in \mathbb{R}$. For each $n \in \mathbb{N}$, Taylor's theorem with remainder (the Lagrange form) tells us that there exists $\xi$ between $0$ and $x$ such that

$$f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(0)}{k!} x^k + \frac{f^{(n+1)}(\xi)}{(n+1)!} x^{n+1}.$$

But

$$\left|\frac{f^{(n+1)}(\xi)}{(n+1)!} x^{n+1}\right| \le \frac{|b|^{n+1} |x|^{n+1}}{(n+1)!} \longrightarrow 0 \quad \text{as } n \to \infty.$$

We conclude that $f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n$, where the series converges uniformly on compact subsets. A completely analogous argument shows that $g(x) = \sum_{n=0}^{\infty} \frac{g^{(n)}(0)}{n!} x^n$. We can now use the power series method to find series expressions for $f$ and $g$. An induction argument together with the values $f(0) = 1$ and $g(0) = 0$ obtained above shows that

$$f^{(2n)}(0) = (-1)^n b^{2n}, \qquad f^{(2n+1)}(0) = 0.$$

Similarly,

$$g^{(2n)}(0) = 0, \qquad g^{(2n+1)}(0) = (-1)^n b^{2n+1}.$$

We conclude that

$$f(x) = \sum_{n=0}^{\infty} \frac{(-1)^n (bx)^{2n}}{(2n)!} = \cos(bx)$$

and

$$g(x) = \sum_{n=0}^{\infty} \frac{(-1)^n (bx)^{2n+1}}{(2n+1)!} = \sin(bx). \qquad \blacksquare$$
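The characterization can also be sanity-checked numerically: integrating the system $f' = -bg$, $g' = bf$ from $f(0) = 1$, $g(0) = 0$ should reproduce $\cos(bx)$ and $\sin(bx)$. A minimal classical Runge–Kutta (RK4) sketch, with the hypothetical value $b = 1.7$:

```python
import math

def integrate(b, x_end, n=10000):
    """RK4 integration of f' = -b*g, g' = b*f from (f, g)(0) = (1, 0)."""
    h = x_end / n
    f, g = 1.0, 0.0
    def rhs(f, g):
        return -b * g, b * f
    for _ in range(n):
        k1f, k1g = rhs(f, g)
        k2f, k2g = rhs(f + h * k1f / 2, g + h * k1g / 2)
        k3f, k3g = rhs(f + h * k2f / 2, g + h * k2g / 2)
        k4f, k4g = rhs(f + h * k3f, g + h * k3g)
        f += h * (k1f + 2 * k2f + 2 * k3f + k4f) / 6
        g += h * (k1g + 2 * k2g + 2 * k3g + k4g) / 6
    return f, g

b, x = 1.7, 3.0
f_num, g_num = integrate(b, x)
# the numerical solution should match cos(bx) and sin(bx)
assert math.isclose(f_num, math.cos(b * x), abs_tol=1e-8)
assert math.isclose(g_num, math.sin(b * x), abs_tol=1e-8)
```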


Your statement of the problem is wrong — you have an incorrect sign in the second equation and you have omitted the condition $f'(0)=0$.

Thank you!