## Wednesday, April 16, 2014

### Implicit differentiation: a motivating example

Without getting into technical definitions of what an implicit function is, or of when a given equation in two variables implicitly defines one or more functions (differentiable or not; I will leave such a discussion for a subsequent post on theory), I will just examine a simple yet illuminating example:
Let us consider the function $y=f(x)= \frac{1}{x} \equiv x^{-1}$. It is well known that the power rule of differentiation $(x^{a})' = ax^{a-1}$ applies for any real value of $a$ (in its respective domain, of course). So we can readily conclude that $f'(x) = - \frac{1}{x^{2}}$ in $\mathbb{R}^{*}$.
But let us momentarily think a little differently: since $y=\frac{1}{x} \Rightarrow xy=1$, we have that $$xf(x)=1 \Leftrightarrow xy=1$$
We differentiate this last relation, applying the product rule of differentiation to both sides, and we get $$f(x)+xf'(x)=0 \Leftrightarrow y+xy'=0$$ thus $y'=f'(x)=-\frac{f(x)}{x}=-\frac{y}{x}$. Using the definition $y=f(x)= \frac{1}{x}$ we finally arrive at $$y'=f'(x) = - \frac{1}{x^{2}}$$
in $\mathbb{R}^{*}$.
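The computation above can be checked numerically. The sketch below (the function names are my own, not part of the post) compares the implicitly obtained derivative $y'=-y/x$ with the power-rule result $-1/x^{2}$ at a few sample points:

```python
import math

def f(x):
    # y = 1/x
    return 1.0 / x

def implicit_derivative(x):
    # y' = -y/x, obtained by differentiating x*y = 1 via the product rule
    return -f(x) / x

def power_rule_derivative(x):
    # y' = -1/x^2, obtained directly from the power rule
    return -1.0 / x**2

for x in [0.5, 1.0, 2.0, -3.0]:
    assert math.isclose(implicit_derivative(x), power_rule_derivative(x))
```

Both routes agree on all of $\mathbb{R}^{*}$, as the derivation predicts.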

## Thursday, April 10, 2014

### Derivative of the inverse: a couple of examples

Let us proceed in today's post to a couple of applications of the theorem on the differentiation of the inverse function:
Example 1 (exp-log):  $\bullet$ Let us consider the case of $y=f(x)=e^{x}$. It is well known that $f'(x)=e^{x}=f(x)$ and that the inverse function $f^{-1}$ can be written as $x=f^{-1}(y)=\ln y$. The domains and ranges of these functions are
$$\begin{array}{c} D_{f} = (-\infty, \infty) = f^{-1}(D_{f^{-1}}) \\ \\ f(D_{f}) = (0, \infty) = D_{f^{-1}} \end{array}$$
One can easily check that the conditions of the theorem apply and thus for any $x_{0} \in D_{f}$ and for any $y_{0}=f(x_{0}) \in f(D_{f}) \equiv D_{f^{-1}}$ we have:
$$(f^{-1})'(y) = (\ln y)' = \frac{dx}{dy} = \frac{1}{\frac{dy}{dx}} = \frac{1}{f'(x)} = \frac{1}{(e^{x})'} = \frac{1}{e^{x}} = \frac{1}{y}$$
where (following the same notation conventions as in the development of the theory) $(f^{-1})'(y) = (\ln y)'$ denotes differentiation with respect to $y$, while $f'(x) = (e^{x})'$ denotes differentiation with respect to $x$. So we finally arrive at $(\ln y)' = \frac{1}{y}$, and since the independent variable can always be renamed at our choice, we arrive at the familiar rule:
$$(\ln x)' = \frac{1}{x}$$
$\bullet$ Let us see how we could have worked alternatively, thinking through the chain rule for differentiating composite functions: we could differentiate the composition $f\circ f^{-1}$ with respect to (the independent variable) $y$. So we get $y = e^{x} = e^{x(y)} = e^{\ln y}$. Differentiating both sides with respect to $y$ and applying the chain rule on the r.h.s. of the last relation, we get
$$1 = \frac{dy}{dy} = e^{\ln y} \frac{d(\ln y)}{dy} \Rightarrow \frac{d(\ln y)}{dy} = \frac{1}{e^{\ln y}} = \frac{1}{y}$$
and finally, renaming (as usual) the independent variable from $y$ to $x$, we get the familiar relation
$$(\ln x)' \equiv \frac{d(\ln x)}{dx} = \frac{1}{x}$$
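Both routes give the same rule, and the theorem's formula $(f^{-1})'(y) = 1/f'(x)$ can also be checked numerically. The sketch below (my own illustration; the function name is hypothetical) applies it with $f = \exp$:

```python
import math

def inverse_derivative(y):
    # Theorem: (f^{-1})'(y) = 1 / f'(x), where x = f^{-1}(y).
    # Here f = exp, so f' = exp and f^{-1}(y) = ln y.
    x = math.log(y)
    return 1.0 / math.exp(x)

for y in [0.5, 1.0, math.e, 10.0]:
    # the theorem reproduces the familiar rule (ln y)' = 1/y
    assert math.isclose(inverse_derivative(y), 1.0 / y)
```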
Example 2 (inverse trigonometric functions): $\bullet$ Let $y=f(x)=\sin x$. Its inverse function can be written as $x=f^{-1}(y)=\arcsin(y) \equiv \sin^{-1}(y)$. Of course, $\sin x$ considered on its natural domain $\mathbb{R}$ is not one-to-one ("1-1") and thus not an invertible function. However, a suitable restriction is: we consider the domains and ranges as follows:

$$\begin{array}{l} D_{f} = [-\frac{\pi}{2}, \frac{\pi}{2}] = f^{-1}(D_{f^{-1}}) \\ \\ f(D_{f}) = [-1, 1] = D_{f^{-1}} \end{array}$$

Notice that the graphs of $\sin x$ and $\arcsin x$, displayed on a common set of axes (that is, with the same independent variable $x$), are shown in the following figure:

However, we will proceed with the computation using the notation $y= f(x) = \sin x$ and $x = f^{-1}(y) = \arcsin y$.
We first have to note that the theorem provides the derivative of the inverse function $x = f^{-1}(y) = \arcsin y$ for any $y \in (-1,1)$, thus excluding the points $\sin(-\frac{\pi}{2})=-1$ and $\sin(\frac{\pi}{2})=1$, simply because $f'(x) = (\sin x)' = \cos x$ and thus $f'(-\frac{\pi}{2})=f'(\frac{\pi}{2})=0$. However, the conditions of the theorem hold on $(-1,1)$, so we get:
$$\begin{array}{c} (f^{-1})'(y) = (\arcsin y)' = \frac{dx}{dy} = \frac{1}{\frac{dy}{dx}} = \frac{1}{f'(x)} = \frac{1}{(\sin x)'} = \\ \\ = \frac{1}{\cos x} = \frac{1}{\sqrt{1-\sin^{2}x}} = \frac{1}{\sqrt{1-y^{2}}} \end{array}$$
for $y$ in $(-1,1)$. Notice that we have used $\cos x = \sqrt{1-\sin^{2}x} \geq 0$ for $x \in (-\frac{\pi}{2},\frac{\pi}{2})$. We have thus shown (after renaming the independent variable, as is customary) that:
$$(\arcsin x)' = \frac{1}{\sqrt{1-x^{2}}}$$
for any $x$ in $(-1,1)$.
$\bullet$ Let us now try to work alternatively (through the chain rule), as before: we differentiate $f\circ f^{-1}$ with respect to (the independent variable) $y$ using the chain rule, keeping in mind that $y = \sin x = \sin(x(y)) = \sin(\arcsin y)$
$$\begin{array}{c} 1 = \frac{dy}{dy} = \cos(\arcsin y) \frac{d(\arcsin y)}{dy} \Rightarrow \\ \\ \Rightarrow \frac{d(\arcsin y)}{dy} = \frac{1}{\cos(\arcsin y)} = \frac{1}{\sqrt{1-\sin^{2}(\arcsin y)}} = \frac{1}{\sqrt{1-y^{2}}} \end{array}$$
for $y$ in $(-1,1)$. With the (by now customary) change of the independent variable from $y$ to $x$, we arrive at our desired formula:
$$(\arcsin x)' = \frac{1}{\sqrt{1-x^{2}}}$$
for any $x$ in $(-1,1)$.
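As a quick numerical sanity check (my own sketch; the helper name is hypothetical), the expression $1/\cos(\arcsin y)$ obtained in the derivation can be compared against the closed form $1/\sqrt{1-y^{2}}$ on a grid inside $(-1,1)$:

```python
import math

def arcsin_derivative(y):
    # (arcsin y)' = 1 / cos(arcsin y); arcsin y lies in (-pi/2, pi/2),
    # so cos(arcsin y) > 0 and the division is safe for y in (-1, 1)
    return 1.0 / math.cos(math.asin(y))

for y in [-0.9, -0.5, 0.0, 0.5, 0.9]:
    assert math.isclose(arcsin_derivative(y), 1.0 / math.sqrt(1.0 - y * y))
```

The check fails, as expected, at the excluded endpoints $y=\pm 1$, where $\cos(\arcsin y)=0$.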

## Saturday, April 5, 2014

### Theoretical Remarks #5 - the derivative of the inverse function

In today's post (after quite a long time) I am going to discuss the differentiation of an inverse function $f^{-1}$, given that we know the derivative of $f$. I will start by stating the theorem which will be the basic ingredient for working with derivatives of inverse functions:
Theorem: Let $f:\Delta \rightarrow \mathbb{R}$ be a function defined on an interval $\Delta \subseteq \mathbb{R}$ and let $f$ be
• one-to-one ("1-1") and continuous,
• differentiable at a point $\xi \in \Delta$,
• such that $f'(\xi) \neq 0$;
then the inverse function $f^{-1}:f(\Delta) \rightarrow \Delta \subseteq \mathbb{R}$ is also differentiable at the point $\zeta = f(\xi) \in f(\Delta)$ and we have
$$\label{invder1}(f^{-1})'( \zeta ) = \frac{1}{f'( \xi )}$$
which can be equivalently written (in Leibniz's notation)
$$\label{invder2} \frac{dx}{dy} |_{\zeta=f(\xi)} = \frac{1}{\frac{dy}{dx} |_{\xi}}$$
with the understanding (in the notation) that:  $\ y=f(x) \Leftrightarrow x=f^{-1}(y)$.

Remark: If the conditions of the above theorem are valid for any point $x_{0} \in \Delta$ then we can write (in a somewhat more simplified notation):
$$\label{invder3} \frac{dx}{dy} |_{y_{0}} = \frac{1}{\frac{dy}{dx} |_{x_{0}}}$$
for any $x_{0} \in \Delta$ and $y_{0}=f(x_{0})$. In that case it is customary to write $$\frac{dx}{dy} = \frac{1}{\frac{dy}{dx}}$$ in $\Delta$. We (again!) have to keep in mind that $\ y=f(x) \Leftrightarrow x=f^{-1}(y)$.
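The remark can be illustrated numerically. In the sketch below (the choice $f(x)=x^{3}+x$ and all function names are mine, purely for illustration), the inverse is computed by bisection and both sides of $\frac{dx}{dy} = \frac{1}{\frac{dy}{dx}}$ are approximated by central finite differences:

```python
import math

def f(x):
    # strictly increasing on R, hence one-to-one and invertible
    return x**3 + x

def f_inv(y, lo=-10.0, hi=10.0):
    # invert f by bisection (valid because f is increasing)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def diff(g, t, h=1e-6):
    # central finite-difference approximation of g'(t)
    return (g(t + h) - g(t - h)) / (2.0 * h)

x0 = 1.3
y0 = f(x0)
# dx/dy at y0 should equal 1 / (dy/dx at x0)
assert math.isclose(diff(f_inv, y0), 1.0 / diff(f, x0), rel_tol=1e-4)
```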

Let us now proceed to a couple of proofs of the above result:

(I). A geometrical proof:

The following figure provides us with a geometrical interpretation of the previous relation between the derivative of a function and the derivative of its inverse function:
It is well known that the graphs $C_{f}$ and $C_{f^{-1}}$ are symmetric with respect to the bisector $y=x$ of the first quadrant.
One can easily convince oneself that the above-described symmetry implies that
$$\label{geominterpr}\theta+\varphi=\pi/2$$
where $\theta \neq 0$ is the angle between the tangent line of $C_{f}$ at $A(\xi,\zeta)$ and the horizontal axis, and $\varphi \neq \pi/2$ is the angle between the tangent line of $C_{f^{-1}}$ at $B(\zeta,\xi)$ and the horizontal axis.
But since $f'(\xi)=\tan\theta$ and $(f^{-1})'(\zeta)=\tan\varphi$, it suffices to invoke \eqref{geominterpr} together with the well-known trigonometric relation
$$\tan\varphi=\tan(\frac{\pi}{2}-\theta)=\frac{1}{\tan\theta}$$
to conclude that $(f^{-1})'(\zeta)=\frac{1}{f'(\xi)}$.
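The trigonometric identity used in the last step can itself be verified numerically; in the tiny sketch below (the sample angles are chosen arbitrarily), the slopes of the two symmetric tangent lines come out reciprocal:

```python
import math

for theta in [0.3, 0.7, 1.2]:          # sample angles with theta != 0
    phi = math.pi / 2 - theta          # complementary angle, phi != pi/2
    # tan(pi/2 - theta) = 1 / tan(theta), so the slopes are reciprocal
    assert math.isclose(math.tan(phi), 1.0 / math.tan(theta))
```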

The proof presented above is based on the geometrical interpretation of the derivative as the slope of the tangent line to the graph of the function. However, we could have proceeded through a completely different road, using the chain rule of differentiation:

(II). A "proof" through the chain rule of differentiation:
Under the conditions of the theorem, it is clear that the function $f:\Delta \rightarrow \mathbb{R}$ has an inverse function $f^{-1}:f(\Delta) \rightarrow \Delta \subseteq \mathbb{R}$.
We can consider either the inverse function (or the initial function) as a composite function in the following sense:
$$\label{inverderthrcomp} x=f^{-1}(y)=f^{-1}(f(x))=(f^{-1} \circ f)(x)$$

Now we can straightforwardly apply the chain rule of differentiating composite functions as follows: We differentiate \eqref{inverderthrcomp} with respect to $x$:
$$1 = \frac{dx}{dx}|_{x_{0}} = [(f^{-1})(y_{0})]' = (f^{-1})'(y_{0})\cdot f'(x_{0}) \Rightarrow (f^{-1})'(y_{0}) = \frac{1}{f'(x_{0}) }$$
which concludes the proof.
Remarks:
1. The reader should pay particular attention to the notation at this point: the symbol $[(f^{-1})(y_{0})]' \equiv [f^{-1}(f(x_{0}))]' \equiv \frac{dx}{dx}|_{x_{0}}$ denotes the derivative of $f^{-1} \circ f$ with respect to $x$ (computed at $x_{0}$), while $(f^{-1})'(y_{0})$ denotes the derivative of $x=f^{-1}(y)$ with respect to $y$ (computed at $y_{0}$), and $f'(x_{0})$ the derivative of $y=f(x)$ with respect to $x$ (as usual) computed at $x_{0}$.
2. Using the Leibniz notation, we could have alternatively written:
$$1 = \frac{dx}{dx}|_{x_{0}} = \frac{dx}{dy}|_{y_{0}} \cdot \frac{dy}{dx}|_{x_{0}} \Rightarrow \frac{dx}{dy} |_{y_{0}} = \frac{1}{\frac{dy}{dx} |_{x_{0}}}$$
where $y_{0}=f(x_{0}) \Leftrightarrow x_{0}=f^{-1}(y_{0})$.
3. The discerning reader should notice the following fact in this last "proof": we have only proved \eqref{invder1}, \eqref{invder2}. However we have not shown (in fact we took it for granted) that under the conditions of the theorem the inverse function is actually differentiable or in other words that $(f^{-1})'(y_{0}) = \frac{dx}{dy}|_{y_{0}}$ exists (in the sense that it is a real number). This is the reason for the use of the quotation marks in the word proof.

Finally, we have to mention that we can supply another proof of the above theorem by straightforward use of the definition of the derivative as the limit of the rate of change. We will discuss this proof in a subsequent post.