



 
OK, sit down, this is complicated. A function defined (naturally or artificially) on an interval [a,b] or [a,infinity) cannot be differentiable at a, because that would require a limit to exist at a, which in turn requires the function to be defined on an open interval about a. The relevant difference quotient may, however, have a one-sided limit at a, and hence a one-sided derivative. Moreover, we say that a function is differentiable on [a,b] when it is differentiable on (a,b), differentiable from the right at a, and differentiable from the left at b.

Experience has shown that these are the right definitions, even though they have some paradoxical repercussions. For instance, a function may be differentiable on [a,b] but not at a; and a function may be differentiable on [a,b] and on [b,c] but not on [a,c].

The reason that so many theorems require a function to be continuous on [a,b] but differentiable only on (a,b) is not that differentiability on [a,b] is undefined or problematic; it is that they do not need differentiability in any sense at the endpoints, and the looser hypothesis makes the theorem more widely applicable. For instance, you can use Rolle's theorem for f(x) = sqrt(x) - x on [0,1]: f(0) = f(1) = 0, f is continuous on [0,1] and differentiable on (0,1), yet its difference quotient at 0 blows up, so it has no (finite) one-sided derivative there.

There are other theorems that do need the stronger condition. I would suggest, however, that whenever a fiddly detail like this is in question, you first make sure you have the notation right, and then use a few extra words to make sure the reader understands too!

Good Hunting!
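To see the second paradox concretely, here is a quick numerical sketch (the helper functions and step size are my own choices, purely for illustration): the absolute-value function is differentiable on [-1,0] and on [0,1] in the one-sided sense above, but not on [-1,1], because the one-sided derivatives at 0 disagree.

```python
def right_derivative(f, a, h=1e-8):
    """One-sided difference quotient from the right at a."""
    return (f(a + h) - f(a)) / h

def left_derivative(f, a, h=1e-8):
    """One-sided difference quotient from the left at a."""
    return (f(a) - f(a - h)) / h

# f(x) = |x|: right derivative at 0 is +1, left derivative is -1,
# so |x| is differentiable on [0,1] and on [-1,0] but not on [-1,1].
f = abs
print(right_derivative(f, 0.0))  # close to  1.0
print(left_derivative(f, 0.0))   # close to -1.0
```

The same helpers show why the square root fails at 0: right_derivative(lambda x: x ** 0.5, 0.0) grows without bound as h shrinks, since the quotient is 1/sqrt(h).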


Math Central is supported by the University of Regina and The Pacific Institute for the Mathematical Sciences. 