Dear everyone,

Problem 48 says:

Consider the theorem: If $$f$$ and $$f'$$ are both strictly increasing real-valued functions on $$(0,\infty)$$, then $$\lim_{x\to\infty} f(x) = \infty$$. The following argument is suggested as a proof of this theorem.

(1) By the Mean Value Theorem, there is a $$c_1$$ in the interval (1,2) such that

$$f'(c_1) = \frac{f(2) - f(1)}{2-1} = f(2) - f(1) > 0$$

and then a bunch more steps.
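(Filling in my guess at how the omitted steps go, in case it helps anyone reading along: since $$f'$$ is strictly increasing, for $$x \ge 2$$ we have $$f'(x) \ge f'(c_1) > 0$$, and then for $$x > 2$$ the Mean Value Theorem on $$[2, x]$$ gives some $$c$$ in $$(2, x)$$ with

$$f(x) = f(2) + f'(c)(x-2) \ge f(2) + f'(c_1)(x-2) \to \infty \quad \text{as } x \to \infty.)$$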

According to the answer key, this is a valid argument. I must be missing something huge here, though, because I was under the impression that the Mean Value Theorem required $$f$$ to be continuous on $$[a,b]$$ and $$f'$$ to be continuous on $$(a,b)$$. Here all we have is that the functions are strictly increasing. What am I doing wrong?

## 0568 #48

### Re: 0568 #48

The fact that $$f'$$ exists on $$(0, \infty)$$ implies that $$f$$ is continuous there.

I missed this on the first pass too.

### Re: 0568 #48

Okay, thanks for that. I think what you said is half of it -- the other half is that if $$f'$$ exists everywhere on $$(a,b)$$, it must be continuous on $$(a,b)$$. Does that seem right?

### Re: 0568 #48

It is if $$f'$$ is monotone increasing. Monotonicity implies that the left and right limits of $$f'(x)$$ as $$x \to x_0$$ both exist -- the only thing that could go wrong is a jump, where the left limit is strictly less than the right limit. But a derivative has the intermediate value property (Darboux's theorem), so $$f'$$ cannot skip the values in that gap; hence there is no jump, and $$f'$$ is continuous at $$x_0$$.
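Spelled out in my own notation, for a point $$x_0$$ in $$(a,b)$$, monotonicity gives

$$\lim_{x\to x_0^-} f'(x) = \sup_{x<x_0} f'(x) \;\le\; f'(x_0) \;\le\; \inf_{x>x_0} f'(x) = \lim_{x\to x_0^+} f'(x),$$

and Darboux's theorem says $$f'$$ takes every value between $$f'(x_1)$$ and $$f'(x_2)$$ on any interval $$[x_1, x_2]$$ around $$x_0$$, which forces the two one-sided limits to agree with $$f'(x_0)$$.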

In general, though, the existence of $$f'$$ does not imply its continuity. Here is a counterexample: http://planetmath.org/encyclopedia/Exam ... nuous.html
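The standard example of this (I believe it's the one at that link) is

$$f(x) = \begin{cases} x^2 \sin(1/x), & x \neq 0, \\ 0, & x = 0, \end{cases}$$

which is differentiable everywhere with $$f'(0) = 0$$ (by squeezing the difference quotient $$h\sin(1/h)$$), but $$f'(x) = 2x\sin(1/x) - \cos(1/x)$$ for $$x \neq 0$$ has no limit as $$x \to 0$$ because of the oscillating $$\cos(1/x)$$ term.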

One more question: does the Mean Value Theorem actually require $$f'$$ to be continuous? Wikipedia only lists differentiability as a hypothesis.