0568 #48
Posted: Sun Oct 09, 2011 5:57 pm
Dear everyone,
Problem 48 says:
Consider the theorem: If $$f$$ and $$f'$$ are both strictly increasing real-valued functions on $$(0,\infty)$$, then $$\lim_{x\to\infty} f(x) = \infty$$. The following argument is suggested as a proof of this theorem.
(1) By the Mean Value Theorem, there is a $$c_1$$ in the interval (1,2) such that
$$f'(c_1) = \frac{f(2) - f(1)}{2-1} = f(2) - f(1) > 0$$
and then a bunch more steps.
According to the answer key, this is a valid argument. I must be missing something huge here, though, because I was under the impression that the Mean Value Theorem requires f to be continuous on [a,b] and differentiable on (a,b). Here all we are given is that the functions are strictly increasing. What am I doing wrong?