1-d calculus#
The crucial concepts of calculus are the derivative and the integral.
Derivative#
The linear function
\[ f(x) = ax + b \]
is nice and simple, and its graph is a straight line. Using differentiation and differentials one can, in a sense, reduce any smooth function to a linear one!
In a similar manner, any smooth function is approximately linear in a small neighborhood of the tangent point:
A more rigorous definition:
Strict definition
The derivative of \(f \colon (x-\delta, x+\delta) \to \mathbb R\), \(\delta > 0\), at point \(x\) is
\[ f'(x) = \lim\limits_{h \to 0} \frac{f(x + h) - f(x)}{h}. \]
Existence of the derivative \(f'(x)\) is equivalent to differentiability of \(f\) at point \(x\):
\[ f(x + h) = f(x) + f'(x) h + o(h), \quad h \to 0. \]
Note that (60) implies this equality, but the converse is false.
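For a quick numerical illustration, here is a minimal Python sketch (the example function, point, and step sizes are arbitrary choices, not part of the text): the difference quotient \(\frac{f(x+h) - f(x)}{h}\) should approach \(f'(x)\) as \(h \to 0\).

```python
import math

def difference_quotient(f, x, h):
    """Forward difference quotient (f(x + h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

# Example: f(x) = sin(x), whose derivative is cos(x)
x = 1.0
for h in (1e-1, 1e-3, 1e-5):
    approx = difference_quotient(math.sin, x, h)
    error = abs(approx - math.cos(x))
    print(f"h = {h:.0e}: quotient = {approx:.8f}, error = {error:.2e}")
```

As \(h\) shrinks, the printed error shrinks as well, in line with the limit definition above.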
Differential#
The function \(df(x, h) = f'(x)h\) is called the differential of \(f\) at point \(x\). Note that it is a function of two variables \(x\) and \(h\), and the dependence on \(h\) is linear.
Important
For historical reasons, the increment \(h\) is often denoted by \(dx\); then the formula for the differential is
\[ df = f'(x)\, dx. \]
The differential is the main linear part of the increment \(\Delta f = f(x + h) - f(x)\).
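A small numerical sketch of this claim (the example function \(f(x) = e^x\) and the values of \(dx\) are illustrative choices): the gap between \(\Delta f\) and \(df = f'(x)\,dx\) vanishes faster than \(dx\) itself.

```python
import math

# Example: f(x) = exp(x), so f'(x) = exp(x)
x = 0.5
for dx in (1e-1, 1e-2, 1e-3):
    delta_f = math.exp(x + dx) - math.exp(x)   # true increment
    df = math.exp(x) * dx                      # differential: main linear part
    print(f"dx = {dx:.0e}: delta_f = {delta_f:.6e}, df = {df:.6e}, "
          f"(delta_f - df)/dx = {(delta_f - df)/dx:.2e}")
```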
Rules of differentiation#
Derivative:
\(f'(x) \equiv 0\) if \(f(x)\equiv \mathrm{const}\)
\((\alpha f(x) + \beta g(x))' = \alpha f'(x) + \beta g'(x)\)
\((f(x)g(x))' = f'(x) g(x) + f(x) g'(x)\)
\(\big(\frac{f(x)}{g(x)}\big)' = \frac{f'(x) g(x) - f(x) g'(x)}{g^2(x)}\) if \(g(x) \ne 0\)
\(((f \circ g)(x))' = f'(g(x)) g'(x)\) (chain rule)
Differential:
\(d(\alpha f + \beta g) = \alpha df + \beta dg\)
\(d(fg) = fdg + g df\)
\(d\big(\frac fg\big) = \frac{gdf - f dg}{g^2}\)
\(d(f \circ g)(x) = df(g(x)) = f'(g(x)) \cdot dg(x)\) (chain rule)
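These identities can also be checked symbolically; the sketch below uses SymPy (an assumption about available tooling, not something the text prescribes) with generic smooth functions \(f\) and \(g\).

```python
import sympy as sp

x = sp.Symbol('x')
f = sp.Function('f')(x)
g = sp.Function('g')(x)

# Product rule: (fg)' - (f'g + fg') should simplify to 0
print(sp.simplify(sp.diff(f * g, x) - (sp.diff(f, x) * g + f * sp.diff(g, x))))

# Quotient rule: (f/g)' - (f'g - fg')/g^2 should simplify to 0
print(sp.simplify(sp.diff(f / g, x) - (sp.diff(f, x) * g - f * sp.diff(g, x)) / g**2))

# Chain rule on a concrete composition: (sin(x^2))' = 2x cos(x^2)
print(sp.diff(sp.sin(x**2), x))
```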
Derivatives of higher orders#
If the function \(f'(x)\) is also differentiable, then its derivative is called the second derivative of \(f\): \(f''(x) =\frac d{dx}(f'(x))\). By induction, the \(n\)-th derivative is defined as
\[ f^{(n)}(x) = \frac d{dx}\big(f^{(n-1)}(x)\big). \]
To find the second differential of \(f\), differentiate \(df(x) = f'(x)\, dx\) with respect to \(x\), treating \(dx\) as a constant:
\[ d^2 f = f''(x)\, dx^2. \]
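As a brief numerical sketch (the example function and values of \(dx\) are arbitrary): the symmetric second-order increment \(f(x+dx) - 2f(x) + f(x-dx)\) behaves like \(f''(x)\,dx^2\).

```python
import math

# Example: f(x) = sin(x), so f''(x) = -sin(x)
x = 1.0
for dx in (1e-1, 1e-2, 1e-3):
    second_increment = math.sin(x + dx) - 2 * math.sin(x) + math.sin(x - dx)
    d2f = -math.sin(x) * dx**2          # f''(x) dx^2
    print(f"dx = {dx:.0e}: increment = {second_increment:.3e}, d2f = {d2f:.3e}")
```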
A function \(f \colon [a, b] \to \mathbb R\) is called continuously differentiable \(n\) times (denoted \(f \in C^n[a, b]\)) if its \(n\)-th derivative is continuous: \(f^{(n)} \in C[a, b]\). If \(f \in C^{n+1}[a, b]\), then the Taylor formula holds:
\[ f(x) = \sum\limits_{k=0}^n \frac{f^{(k)}(a)}{k!} (x - a)^k + r_n(x), \]
where the remainder term \(r_n\) is
\[ r_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!} (x - a)^{n+1} \]
for some point \(c\) between \(a\) and \(x\).
If \(\lim\limits_{n\to\infty} r_n = 0\), the function \(f\) can be expanded into a Taylor series:
\[ f(x) = \sum\limits_{k=0}^\infty \frac{f^{(k)}(a)}{k!} (x - a)^k. \]
If \(a=0\), the Taylor series is called the Maclaurin series.
Example
If \(f(x) = e^x\) then \(f^{(k)}(x) = e^x\) for all \(k\in\mathbb N\). Also, \(f^{(k)}(0) = 1\) and \(\lim\limits_{n\to\infty} r_n = 0\) for every \(x\in\mathbb R\).
Hence,
\[ e^x = \sum\limits_{k=0}^\infty \frac{x^k}{k!}, \quad x \in \mathbb R. \]
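To see the convergence numerically, the sketch below (the sample points and truncation orders are arbitrary choices) compares partial sums of this Maclaurin series with `math.exp`.

```python
import math

def exp_maclaurin(x, n):
    """Partial sum of the Maclaurin series of e^x up to the x^n term."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

for x in (0.5, 2.0):
    for n in (2, 5, 10):
        approx = exp_maclaurin(x, n)
        print(f"x = {x}, n = {n:2d}: partial sum = {approx:.8f}, "
              f"error = {abs(approx - math.exp(x)):.2e}")
```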
Applications of derivatives#
If \(f'(x) > 0\) (\(f'(x) < 0\)) for all \(x\in (a, b)\), then \(f\) is increasing (decreasing) on \((a, b)\).
If \(f'(x) = 0\) and \(f''(x) > 0\) (\(f''(x) < 0\)), then \(x\) is a local minimum (maximum) of \(f\).
If \(f''(x) > 0\) (\(f''(x) < 0\)) for all \(x\in (a, b)\), then \(f\) is strictly convex (concave) on \((a, b)\).
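The sketch below applies these derivative tests with SymPy to one illustrative function, \(f(x) = x^3 - 3x\) (the function is an arbitrary example, not taken from the text).

```python
import sympy as sp

x = sp.Symbol('x', real=True)
f = x**3 - 3*x                        # example function

f1 = sp.diff(f, x)                    # first derivative: 3x^2 - 3
f2 = sp.diff(f, x, 2)                 # second derivative: 6x

# Critical points f'(x) = 0, classified by the sign of f''
for c in sp.solve(sp.Eq(f1, 0), x):   # x = -1 and x = 1
    kind = "local min" if f2.subs(x, c) > 0 else "local max"
    print(f"x = {c}: f''(x) = {f2.subs(x, c)} -> {kind}")

# f'' > 0 exactly on (0, +inf), so f is strictly convex there
print(sp.solve(f2 > 0, x))
```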
Exercises#
Find the derivative of \(f(x) = \tanh x = \frac{\sinh x}{\cosh x} = \frac {e^x - e^{-x}}{e^x + e^{-x}}\).
Show that \(\sigma'(x) = \sigma(x) (1 - \sigma(x))\), where
\[ \sigma(x) = \frac 1{1 + e^{-x}} \]
is the sigmoid function.
Find \(\max\limits_{x\in\mathbb R}\sigma'(x)\).
Give an example of a function \(f\) which is differentiable at a point \(x\) but for which (60) does not hold.
Find the first and the second differential of \(f(x) = \sin x\) at point \(x = \frac \pi 3\).
Find the Maclaurin series of \(f(x) = \frac 1{1 - x}\) and \(g(x) = \frac 1{(1-x)^2}\).
Find the global maximum of
\[ f(x) = \prod\limits_{i=1}^n \exp\Big(-\frac{(x - a_i)^2}{2 \sigma_i^2}\Big), \quad \sigma_i > 0. \]
What if \(\sigma_1 = \ldots = \sigma_n = \sigma > 0\)? Does this function have a global minimum?