Real Analysis

Differentiation

A function \(f : [a, b] \rightarrow \mathbb{R}\) is differentiable at \(x \in [a, b]\) if the limit \[ f'(x) = \lim\limits_{t \rightarrow x} \frac{f(t) - f(x)}{t - x} \] exists, where \(t \in (a, b)\) and \(t \neq x\). The function \(f'\) is called the derivative of \(f\).

For example,

  1. If \(f\) is continuous on \([a, b]\), \(f\) is not necessarily differentiable on \([a, b]\). For instance, \(f(x) = \lvert x \rvert\) is continuous on \([-1, 1]\) but not differentiable at \(0\).

  2. If \(f\) is differentiable on \([a, b]\), \(f\) is continuous on \([a, b]\).

    Why? If \(t \rightarrow x\) then \(\lim\limits_{t \rightarrow x} (f(t) - f(x)) = \lim\limits_{t \rightarrow x} \frac{f(t) - f(x)}{t - x}(t - x) = f'(x) \cdot \lim\limits_{t \rightarrow x} (t - x) = f'(x) \cdot 0 = 0\), so \(\lim\limits_{t \rightarrow x} f(t) = f(x)\).

  3. If \(f\) is differentiable on \([a, b]\), \(f'\) is not necessarily continuous. However, \(f'\) satisfies the intermediate value property, so it cannot have any simple discontinuities.

    Consider a version of the topologist’s sine curve. \[ f(x) = \begin{cases} 0 & \text{if } x = 0 \\ x^\frac{4}{3}\sin\left(\frac{1}{x}\right) & \text{otherwise} \end{cases} \]
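    A direct computation shows that this \(f\) is differentiable everywhere but \(f'\) is not continuous at \(0\): \[ f'(0) = \lim\limits_{t \rightarrow 0} \frac{t^\frac{4}{3}\sin\left(\frac{1}{t}\right) - 0}{t - 0} = \lim\limits_{t \rightarrow 0} t^\frac{1}{3}\sin\left(\frac{1}{t}\right) = 0, \] while for \(x \neq 0\) (reading the fractional powers as real cube roots) \[ f'(x) = \frac{4}{3}x^\frac{1}{3}\sin\left(\frac{1}{x}\right) - x^{-\frac{2}{3}}\cos\left(\frac{1}{x}\right). \] The cosine term oscillates without bound as \(x \rightarrow 0\), so \(\lim\limits_{x \rightarrow 0} f'(x)\) does not exist; the discontinuity of \(f'\) at \(0\) is an oscillating one, not a simple one.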

Call a function \(f\) a \(\mathcal{C}^1\)-function if \(f'\) exists and is continuous. Similarly, call a function \(f\) a \(\mathcal{C}^k\)-function if it has a \(k\)th derivative \(f^{(k)}\) which is continuous. A function is a \(\mathcal{C}^0\)-function if it is continuous. A function is a \(\mathcal{C}^\infty\)-function if all of its derivatives exist and are continuous; alternatively, \(\mathcal{C}^\infty\)-functions are called smooth. As defined, \(\mathcal{C}^k\)-functions are also \(\mathcal{C}^{k-1}\)-functions.
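The inclusions are proper: a standard example of a \(\mathcal{C}^1\)-function that is not a \(\mathcal{C}^2\)-function is \[ f(x) = x\lvert x \rvert = \begin{cases} x^2 & \text{if } x \geq 0 \\ -x^2 & \text{otherwise} \end{cases} \] whose derivative \(f'(x) = 2\lvert x \rvert\) is continuous but not differentiable at \(0\).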

Because the derivative is defined as a limit, the sum, product, and quotient rules follow from the corresponding rules for limits.

For example,

  1. \((f + g)' = f' + g'\) because the limit of a sum is the sum of the limits.
  2. \((f \cdot g)' = f' \cdot g + f \cdot g'\).
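    Why? Add and subtract \(f(x)g(t)\) in the numerator of the difference quotient: \[ \frac{f(t)g(t) - f(x)g(x)}{t - x} = g(t)\frac{f(t) - f(x)}{t - x} + f(x)\frac{g(t) - g(x)}{t - x}. \] As \(t \rightarrow x\), \(g(t) \rightarrow g(x)\) since differentiable functions are continuous, so the right-hand side converges to \(g(x)f'(x) + f(x)g'(x)\).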

Theorem. There exist functions that are continuous everywhere but are differentiable nowhere.

For example,

  1. Let \(0 < b < 1\) and let \(a\) be a positive odd integer such that \(ab > 1 + \frac{3 \pi}{2}\). Consider the following function, due to Weierstrass.

    \[ f(x) = \sum\limits_{n = 1}^\infty b^n \cos(a^n \pi x) \]
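As a rough numerical illustration of this function (not a proof), the sketch below truncates the series and examines difference quotients at \(x = 0\). The parameters \(a = 13\) and \(b = 0.5\) (so \(ab = 6.5 > 1 + \frac{3\pi}{2} \approx 5.7\)), the truncation level, and the spacings \(h = 13^{-k}\) are illustrative choices; instead of settling toward a limit, the quotients grow in magnitude, consistent with non-differentiability at \(0\).

```python
import math

# The series above: f(x) = sum_{n >= 1} b^n cos(a^n * pi * x).
# a = 13 and b = 0.5 are illustrative choices (a odd, 0 < b < 1, ab = 6.5 > 1 + 3*pi/2).
a, b = 13, 0.5
N = 40  # truncation level; the neglected tail is bounded by b^N / (1 - b)

def f(x, terms=N):
    """Partial sum of the series through the first `terms` terms."""
    return sum(b ** n * math.cos(a ** n * math.pi * x) for n in range(1, terms + 1))

# Difference quotients (f(h) - f(0)) / h along h = 13^(-k). Rather than converging,
# their magnitudes grow roughly geometrically as h shrinks.
for k in range(1, 7):
    h = a ** (-k)
    print(f"h = 13^-{k}: difference quotient = {(f(h) - f(0)) / h:.3e}")
```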

The Mean Value Theorem

Theorem (the mean value theorem). If \(f\) is continuous on \([a, b]\) and differentiable on \((a, b)\) then there exists a point \(c \in (a, b)\) such that \(f(b) - f(a) = (b - a)f'(c)\).

The identity \(f(b) - f(a) = (b - a)f'(c)\) relates the values of \(f\) to the values of \(f'\) without involving limits.

For example,

  1. If \(f'(x) > 0\) for all \(x \in (a, b)\) then \(f(b) > f(a)\).

    Proof. By the mean value theorem, \(f(b) - f(a) = (b - a)f'(c)\) for some \(c \in (a, b)\), and \((b - a)f'(c) > 0\) because \(b - a > 0\) and \(f'(c) > 0\), as desired. QED.

Theorem (the generalized mean value theorem). If \(f(x)\) and \(g(x)\) are continuous on \([a, b]\) and differentiable on \((a, b)\) then there exists a point \(c \in (a, b)\) such that \((f(b) - f(a))g'(c) = (g(b) - g(a))f'(c)\).

If \(g(x) = x\), we get the mean value theorem.

To prove it, consider \(h(x) = (f(b) - f(a))g(x) - (g(b) - g(a))f(x)\). Notice \(h(a) = h(b)\).
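Indeed, \[ h(a) = f(b)g(a) - f(a)g(a) - g(b)f(a) + g(a)f(a) = f(b)g(a) - g(b)f(a), \] and the same computation at \(b\) gives \(h(b) = g(a)f(b) - f(a)g(b) = h(a)\). Applying the mean value theorem to \(h\) yields a point \(c \in (a, b)\) with \((b - a)h'(c) = h(b) - h(a) = 0\), that is, \((f(b) - f(a))g'(c) - (g(b) - g(a))f'(c) = 0\), which is the claimed identity.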

Taylor’s Theorem

Suppose we know the behavior of \(f\) at \(a\) and want to approximate \(f(b)\). The mean value theorem says \(f(b) = f(a) + (b - a)f'(c)\) for some \(c \in (a, b)\), but the term \((b - a)f'(c)\) is not precisely known, since \(c\) is not. This suggests writing \(f(b) = f(a) + f'(a)(b - a) + e\), where \(e\) is an error term. In fact, if \(f''\) exists, then \(e = \frac{f''(d)}{2}(b - a)^2\) for some \(d \in (a, b)\).

More generally, let \(P_{n - 1}(x) = f(a) + f'(a)(x - a) + \frac{f''(a)}{2!}(x - a)^2 + \dots + \frac{f^{(n - 1)}(a)}{(n - 1)!}(x - a)^{n - 1}\), a polynomial of degree at most \(n - 1\). Taylor’s theorem states that \(P_{n - 1}(x)\) approximates \(f(x)\) with an explicit error term.

Theorem (due to Taylor). If \(f^{(n - 1)}\) is continuous on \([a, b]\) and \(f^{(n)}\) exists on \((a, b)\), then for each \(x \in (a, b]\) there exists \(c \in (a, x)\) such that \(f(x) = P_{n - 1}(x) + \frac{f^{(n)}(c)}{n!}(x - a)^n\).
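For a concrete instance, take \(f(x) = e^x\) and \(a = 0\) (an illustrative choice). Every derivative of \(f\) is \(e^x\), so \(f^{(k)}(0) = 1\) for all \(k\), and the theorem says that for each \(x > 0\) there is some \(c \in (0, x)\) with \[ e^x = 1 + x + \frac{x^2}{2!} + \dots + \frac{x^{n - 1}}{(n - 1)!} + \frac{e^c}{n!}x^n. \] Since \(0 < e^c < e^x\), the error made by \(P_{n - 1}\) is at most \(\frac{e^x x^n}{n!}\), which tends to \(0\) as \(n \rightarrow \infty\).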

Proof. Take \(x = b\); the argument for any other \(x \in (a, b]\) is the same on \([a, x]\). Since \((b - a)^n \neq 0\), there is a number \(\mathcal{M}\) such that \(f(b) = P_{n-1}(b) + \mathcal{M}(b - a)^n\). Let \(g(x) = f(x) - P_{n-1}(x) - \mathcal{M}(x - a)^n\). Since \(P_{n-1}\) has degree at most \(n - 1\), \(g^{(n)}(x) = f^{(n)}(x) - \mathcal{M}n!\), so it suffices to show \(g^{(n)}(c) = 0\) for some \(c \in (a, b)\). By the construction of \(P_{n-1}\), \(g(a) = 0\), \(g'(a) = 0\), \(g''(a) = 0\), up to \(g^{(n-1)}(a) = 0\); also, \(g(b) = 0\) by the choice of \(\mathcal{M}\). Since \(g(a) = g(b) = 0\), the mean value theorem gives \(c_1 \in (a, b)\) such that \(g'(c_1) = 0\). Since \(g'(a) = g'(c_1) = 0\), there exists \(c_2 \in (a, c_1)\) such that \(g''(c_2) = 0\), and so on. Eventually there exists \(c_{n - 1} \in (a, b)\) such that \(g^{(n-1)}(c_{n-1}) = 0\), and since \(g^{(n-1)}(a) = g^{(n-1)}(c_{n - 1}) = 0\), there exists \(c \in (a, c_{n - 1})\) such that \(g^{(n)}(c) = 0\). This shows \(\mathcal{M} = \frac{f^{(n)}(c)}{n!}\), hence \(f(b) = P_{n-1}(b) + \frac{f^{(n)}(c)}{n!}(b - a)^n\), as desired. QED.