To recap:
34. f is differentiable, and as x -> infinity, both f(x) and f'(x) approach finite limits. Which of the following must be true?
a) lim f'(x) = 0 as x -> infinity
b) lim f''(x) = 0 as x -> infinity
c) blah blah blah
d) blah blah blah
e) blah blah blah
Ok, so as explained in http://www.mathematicsgre.com/viewtopic.php?f=1&t=102, clearly f'(x) -> 0 as x goes to infinity (otherwise f would not be able to approach a fixed, finite limit). Thus a) appears to be correct.
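(One way to see this, if I remember the standard argument right: by the Mean Value Theorem, f(x+1) - f(x) = f'(c_x) for some c_x in (x, x+1). As x -> infinity, the left side goes to L - L = 0, while f'(c_x) goes to whatever limit f' has, since c_x -> infinity too. So that limit has to be 0.)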
But why can't we use this same reasoning to deduce that f'' must also approach 0? Check this out:
Let g(x) = f'(x). Then we're given that g(x)->0 as x goes to infinity (as we previously concluded), and g is continuous, so...can you think of a function, ANY function, that is continuous, approaches 0 as x goes to infinity, and whose derivative does NOT also approach zero?
The only problem with this argument I could think of was that we aren't given that g itself is differentiable (i.e., that f'' exists).
So what I'm picturing is this: our function g does indeed approach 0, but with little "cusps" thrown in here and there so as to make it not differentiable at some points. If we make these cusps occur infinitely often AND with increasing frequency, then g' would fail to exist at points arbitrarily far out (so its limit isn't even defined), especially if the difference between successive cusps approaches 0 as x approaches infinity.
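(A concrete guess at a formula matching this picture: something like g(x) = |sin(x^2)|/x for x >= 1. It tends to 0, but it has a corner at every point where sin(x^2) = 0, i.e., at x = sqrt(k*pi), and those corners bunch up as x grows.)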
***
Of course, now that I've typed all this, I just thought of a simpler solution: g is constant at 1 for a while, then has a "sharp" turn down to 1/2, then is constant at 1/2 for a while, then another sharp turn down to 1/4, then... I suppose this would be another counterexample, though I think my first solution is more aesthetically pleasing.
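(Side note, unless I'm messing something up: even a g that is differentiable everywhere can pull this off. Something like g(x) = sin(x^2)/x tends to 0, but g'(x) = 2cos(x^2) - sin(x^2)/x^2 keeps swinging between roughly -2 and 2 instead of settling down. So the trick used for a) really leans on being told that lim f'(x) exists, and nothing tells us lim f''(x) exists.)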
No need to stress over this three days before the exam. Just having a little fun.