MA341 - Real Analysis - Spring 2016

# Infimum and Supremum of sequences

Let $$(x_n)$$ be a monotonic sequence.

If $$(x_n)$$ is bounded then it is convergent

and $\lim_{n\to\infty} x_n = \sup\{x_n : n \in \mathbb{N}\}$ (if increasing) or $$\inf\{x_n : n \in \mathbb{N}\}$$ (if decreasing)

Proof
1. $$x_n = \frac{1}{n}$$

$$x_n$$ is decreasing and bounded, so it converges to $$\inf(\frac{1}{n}) = 0$$

2. $$x_n = 1 + \frac{1}{2} + ... + \frac{1}{n}$$

$$(x_n)$$ is increasing. We will show it is not bounded.

$$x_{2^n} = 1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + ... + \left(\frac{1}{2^{n-1}+1} + ... + \frac{1}{2^n}\right) \\ \geq 1 + \frac{1}{2} + \left(\frac{1}{4} + \frac{1}{4}\right) + ... = 1 + \frac{n}{2}$$

So $$x_n$$ is not bounded
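The grouping bound can be checked numerically; this is a minimal sketch (the helper `harmonic` is ours, not part of the notes):

```python
def harmonic(n):
    """Partial sum 1 + 1/2 + ... + 1/n of the harmonic series."""
    return sum(1.0 / k for k in range(1, n + 1))

# the grouping argument's lower bound: the partial sum up to 2^m
# is at least 1 + m/2, so the sums pass any fixed bound
for m in range(1, 12):
    assert harmonic(2 ** m) >= 1 + m / 2
```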

3. Let $$(x_n)$$ be defined by

$$x_1 = 1$$

$$x_{n+1} = \frac{1}{4}(2x_n + 3)$$

First prove that $$x_n$$ is increasing. (by induction)

$$x_1 = 1 < \frac{5}{4} = x_2$$

Assume $$x_n \leq x_{n+1}$$. Then $$\frac{1}{4}(2x_n + 3) \leq \frac{1}{4}(2x_{n+1} + 3)$$,

so $$x_{n+1} \leq x_{n+2}$$

Show bounded (by induction)

$$x_1 \leq 2$$

Assume $$x_n \leq 2$$, then

$$x_{n+1} = \frac{1}{4}(2x_n + 3) \leq \frac{1}{4}(2 \cdot 2 + 3) = \frac{7}{4} \leq 2$$

so $$x_n$$ is bounded.

Now to find the limit. Since $$(x_n)$$ converges, let $$x = \lim x_n = \lim x_{n+1}$$. Taking limits in the recurrence,

so $$x = \frac{1}{4}(2x + 3) \Rightarrow x = \frac{3}{2}$$
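The behavior of this recurrence can be checked numerically (a sketch; the helper `iterate` is ours):

```python
def iterate(n):
    """Return x_n after applying x_{k+1} = (2*x_k + 3)/4 from x_1 = 1."""
    x = 1.0
    for _ in range(n - 1):
        x = (2 * x + 3) / 4
    return x

xs = [iterate(n) for n in range(1, 60)]
# increasing and bounded above by 2, as proved by induction above
assert all(a <= b <= 2 for a, b in zip(xs, xs[1:]))
# the limit solves x = (2x + 3)/4, i.e. x = 3/2
assert abs(xs[-1] - 1.5) < 1e-12
```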

# Subsequences

subsequence

Let $$(y_n)$$ be a sequence.

If there are indices $$n_1 < n_2 < n_3 < ...$$ such that $$x_k = y_{n_k}$$ for all $$k$$,

then $$(x_k)$$ is a subsequence of $$(y_n)$$

$$(2,4,6,8,...)$$ is a subsequence of $$(1,2,3,4,...)$$

$$x_n = (-1)^n$$ $$(-1,1,-1,1,...)$$

$$x_n = 1$$ $$(1,1,1,1,...)$$

$$x_n = -1$$ $$(-1,-1,-1,-1,...)$$

## Convergence

Let $$(x_n)$$ be a convergent sequence

Then all subsequences are also convergent and the limits are the same.

Proof
1. $$x_n = \frac{1}{n^4}$$

Since $$(n^4)$$ is a strictly increasing sequence of natural numbers, $$x_n = \frac{1}{n^4}$$ is a subsequence of $$\frac{1}{n}$$.

So $$x_n$$ must also converge to 0.

2. $$x_n = (-1)^n$$

$\lim_{n\to\infty} x_{2n} = 1$

$\lim_{n\to\infty} x_{2n+1}= -1$

So $$x_n$$ is divergent

## Divergence

Similarly, if $$(x_n)$$ is a sequence and $$(y_n)$$ is a divergent subsequence

then $$(x_n)$$ is divergent

Proof

## Monotonicity

Let $$(x_n)$$ be a sequence.

There exists a subsequence $$(x_m)$$ which is monotonic

Proof

# Bolzano-Weierstrass

Let $$(x_n)$$ be a bounded sequence

Then there exists a subsequence $$(x_m)$$ that is convergent

Proof
1. $$x_n = (-1)^n$$

$$x_{2n} \to 1$$

$$x_{2n+1} \to -1$$

2. For $$(x_n) = \sin(n)$$, every $$x \in [-1,1]$$ is the limit of some subsequence

# Cauchy Sequence

Cauchy sequence

Let $$(x_n)$$ be a sequence.

$$(x_n)$$ is a Cauchy sequence if $$\forall \varepsilon > 0$$ $$\exists K(\varepsilon) \geq 0$$

$$\forall n,m \geq K(\varepsilon), | x_n - x_m | \leq \varepsilon$$

## Cauchy and convergence

Let $$(x_n)$$ be a convergent sequence.

Then it is a Cauchy sequence

Note, however: $$|x_{n+1} - x_n| \to 0$$ alone does not make a sequence Cauchy

Proof
1. Not a Cauchy sequence:

$$(x_n) = \sqrt{n}$$

$$|x_{n+1} - x_n| = \frac{1}{\sqrt{n+1} + \sqrt{n}} \to 0$$, but $$(x_n)$$ is unbounded, so it is not Cauchy and does not converge

Let $$(x_n)$$ be a Cauchy sequence (every Cauchy sequence is bounded)

Then $$(x_n)$$ is convergent

Proof
1. $$x_n = 1 + \frac{1}{2} + ... + \frac{1}{n}$$

$$x_{2n} - x_n = \frac{1}{n+1} + \frac{1}{n+2} + ... + \frac{1}{2n} \geq \underbrace{\frac{1}{2n} + ... + \frac{1}{2n}}_{n \text{ terms}} = \frac{1}{2}$$

So $$x_{2n} - x_n$$ does not converge to $$0$$ therefore $$(x_n)$$ is not a Cauchy sequence

# Contracting Sequences

contracting sequence

Let $$(x_n)$$ be a sequence

$$(x_n)$$ is contracting if

$\exists C \in (0,1), \forall n \in \mathbb{N}, | x_{n+1} - x_n | \leq C | x_n - x_{n-1} |$

## Convergence

Any contracting sequence is convergent
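A minimal numeric sketch of this theorem, using $$x_{n+1} = \cos(x_n)$$ (our example, not from the notes): after the first step the iterates stay in $$[\cos 1, 1]$$, where $$|\sin| \leq \sin 1 < 1$$, so the sequence is contracting.

```python
import math

def cos_iterates(n):
    """First n+1 terms of x_{k+1} = cos(x_k) starting from x_0 = 1."""
    xs = [1.0]
    for _ in range(n):
        xs.append(math.cos(xs[-1]))
    return xs

xs = cos_iterates(100)
diffs = [abs(b - a) for a, b in zip(xs, xs[1:])]
C = math.sin(1.0)  # contraction constant, approx. 0.841 < 1
# successive differences shrink geometrically (tiny slack for float noise)
assert all(d2 <= C * d1 + 1e-15 for d1, d2 in zip(diffs[1:], diffs[2:]))
# the sequence converges: its tail sits at a fixed point of cos
assert abs(xs[-1] - math.cos(xs[-1])) < 1e-9
```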

Proof
1. $$x_n = n$$

Let $$M \geq 0$$

for $$n \geq M, x_n \geq M$$

So $$(x_n)$$ goes to infinity
2. $$x_n = 1 - \frac{1}{n} + n \geq n \geq M$$ for $$n \geq M$$
3. $$x_n = n^2 - n = n(n-1) \geq (n-1)^2 \to \infty$$

## Divergence

Let $$(x_n)$$ be increasing and not bounded. Then $\lim_{n \to \infty} x_n = \infty$

Let $$(x_n)$$ be decreasing and not bounded. Then $\lim_{n \to \infty} x_n = -\infty$

Proof

Let $$(x_n),(y_n)$$ be sequences s.t. $$x_n \geq y_n$$ for all $$n$$

$\lim_{n \to \infty} y_n = \infty$ implies

$\lim_{n \to \infty} x_n = \infty$

Proof

# Series

series

A sequence generated by the sum of the first $$n$$ terms of another sequence

The series $$(s_n)$$ generated by $$(x_n)$$ looks like:

$$s_1 = x_1$$

$$s_2 = x_1 + x_2$$

...

$$s_n = x_1 + ... + x_n$$

limit of series

$\lim_{n \to \infty} s_n = \sum_{k=1}^{\infty} x_k$

1. Let $$(x_n) = \frac{1}{n(n+1)}$$

Then

\begin{aligned} s_n & = \sum_{k=1}^{n} \frac{1}{k(k+1)} \\ & = \sum_{k=1}^{n} \left( \frac{1}{k} - \frac{1}{k+1} \right) \\ & = \sum_{k=1}^{n} \frac{1}{k} - \sum_{k=2}^{n+1} \frac{1}{k} \\ & = 1 - \frac{1}{n+1} \end{aligned}

2. Let $$(x_n) = r^{n-1}$$

then

\begin{aligned} s_n & = \sum_{k=1}^{n} r^{k-1} \\ & = \frac{1-r^n}{1-r} \quad (r \neq 1) \end{aligned}

case $$|r| < 1$$

$\lim_{n \to \infty} r^n = 0$

So $\sum_{k=1}^{\infty} r^{k-1} = \frac{1}{1-r}$

and $$(s_n)$$ is convergent

case $$|r| = 1$$

If $$r = 1$$, $$s_n = n$$, so $\sum_{k=1}^{\infty} x_k = \infty$

If $$r = -1$$, $$s_n = \frac{1}{2} (1 - (-1)^n)$$, which oscillates between $$1$$ and $$0$$

In both cases $$(s_n)$$ is divergent

3. Let $$x_n = \frac{1}{2^n}$$

then $\lim_{n \to \infty} s_n = \sum_{k=1}^{\infty} \frac{1}{2^k} = 1$
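A quick numeric sanity check of the geometric sums (a sketch; `geo_partial` is our helper). Note the first term here is $$\frac{1}{2}$$, not $$1$$, so the sum of $$\frac{1}{2^k}$$ from $$k = 1$$ is $$1$$:

```python
def geo_partial(r, n):
    """Partial sum s_n = r + r^2 + ... + r^n (first term r, ratio r)."""
    return sum(r ** k for k in range(1, n + 1))

# closed form for a finite geometric sum with first term r:
# s_n = r * (1 - r^n) / (1 - r)
assert abs(geo_partial(0.5, 20) - 0.5 * (1 - 0.5 ** 20) / (1 - 0.5)) < 1e-12
# the infinite sum of 1/2^k over k >= 1 is 1
assert abs(geo_partial(0.5, 60) - 1.0) < 1e-12
```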

Let $$(s_n)$$ be a series generated by the sequence $$(x_n)$$

If $$(s_n)$$ converges, then $$\lim_{n \to \infty} x_n = 0$$

Proof

Convergence of $$(x_n)$$ does not imply convergence of $$(s_n)$$

ex. the harmonic series, generated by $$x_n = \frac{1}{n}$$

If $$(x_n)$$ is a positive sequence

then $$(s_n)$$ is increasing and $0 < \lim_{n \to \infty} s_n \leq \infty$

Proof

Let $$(x_n)$$ and $$(y_n)$$ be sequences s.t. $$0 \leq x_n \leq y_n$$

Let $$(s_n)$$ and $$(t_n)$$ be the series generated by $$(x_n)$$ and $$(y_n)$$ respectively

If $$(t_n)$$ is convergent, then $$(s_n)$$ is convergent

If $$(s_n)$$ is divergent, then $$(t_n)$$ is divergent

Proof
1. Let $$x_n = \frac{1}{n^r}, r \geq 0$$

then

$$s_n = \sum_{k=1}^{n} \frac{1}{k^r} = 1 + \frac{1}{2^r} + ... + \frac{1}{n^r}$$

case $$0 \leq r \leq 1$$

$$\frac{1}{n^r} \geq \frac{1}{n}$$

So $$(s_n)$$ is divergent, by comparison with the harmonic series

case $$r > 1$$

For $$n \geq 2$$ (by the Bernoulli inequality)

$$\frac{1}{n^r} \leq \frac{1}{r-1} (\frac{1}{(n-1)^{r-1}} - \frac{1}{n^{r-1}})$$

Let $$y_n = \frac{1}{r-1} (\frac{1}{(n-1)^{r-1}} - \frac{1}{n^{r-1}})$$

Then the sum telescopes: $\sum_{k=2}^{n} y_k = \frac{1}{r-1}(1 - \frac{1}{n^{r-1}}) \leq \frac{1}{r-1} < \infty$

So $\sum_{k=1}^{\infty} x_k < \infty$

We conclude $\sum_{n=1}^{\infty} \frac{1}{n^r} < \infty$ if and only if $$r > 1$$
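The p-series dichotomy can be seen numerically (a sketch; `p_partial` is ours): for $$r = 2$$ the partial sums stay below the telescoping bound $$1 + \frac{1}{r-1} = 2$$, while for $$r = 1$$ they keep growing.

```python
def p_partial(r, n):
    """Partial sum of 1/k^r for k = 1..n."""
    return sum(1.0 / k ** r for k in range(1, n + 1))

assert p_partial(2.0, 100000) < 1 + 1 / (2.0 - 1)  # convergent, bounded by 2
assert p_partial(1.0, 100000) > 10                 # harmonic sums keep growing
```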

# Functions

cluster point

Let $$A \subseteq \mathbb{R}$$ be a non empty set.

Then $$c \in \mathbb{R}$$ is a cluster point of A if

$$\forall \varepsilon > 0$$, $$\exists a \in A$$, $$a \neq c$$, s.t.

$$|a - c| < \varepsilon$$

$$c$$ is a cluster point of the set $$A$$ if the values of $$A$$ get arbitrarily close to $$c$$

1. Let $$A = \{ \frac{1}{n}, n \in \mathbb{N}\}$$

0 is the only cluster point for the set $$A$$

2. Let $$A = \{n | n \in \mathbb{N}\}$$

There are no cluster points: for any $$c$$, we can pick $$\varepsilon$$ small enough that no natural number other than possibly $$c$$ itself lies within $$\varepsilon$$ of $$c$$

3. Let $$A = \mathbb{Q} = \{\frac{x}{y} | x \in \mathbb{Z}, y \in \mathbb{N}\}$$

The set of cluster points is $$\mathbb{R}$$

This is because $$\mathbb{Q}$$ is dense in $$\mathbb{R}$$

function

Let $$A \subset \mathbb{R}$$ be a non empty set.

Then the mapping $$f: A \rightarrow \mathbb{R}$$ is a function

limit

$$f$$ admits a limit $$L \in \mathbb{R}$$ at a cluster point $$a$$ of $$A$$ if:

$$\forall \varepsilon > 0$$, $$\exists \zeta > 0$$,

s.t. for $$x \in A$$ with $$x \neq a$$, $$|x - a| \leq \zeta$$ implies $$|f(x) - L| \leq \varepsilon$$

Or, as $$x$$ converges to $$a$$, $$f(x)$$ converges to $$L$$

Let $$f(x) = x$$

Let $$a \in \mathbb{R}$$, $$\varepsilon > 0$$

if $$|x - a| \leq \varepsilon$$,

then $$|f(x) - a| \leq \varepsilon$$

Uniqueness of Limit

Let $$A \subseteq \mathbb{R}$$ be a nonempty set

Let $$f: A \to \mathbb{R}$$ and let $$c \in \mathbb{R}$$ be a cluster point of $$A$$

Assume $$f$$ has a limit $$L$$ at $$c$$ and also has limit $$L'$$ at $$c$$

Then $$L = L'$$

and $$L = \lim_{x \to c} f(x)$$ is called the limit of $$f$$ at $$c$$

Proof
1. Find the limit of $$f(x) = a, a \in \mathbb{R}$$

Let $$\varepsilon > 0$$

For all $$x \in \mathbb{R}$$ we have $$|f(x) - a| = 0 < \varepsilon$$, so any $$\zeta$$ works and the limit at every point is $$a$$

2. Let $$f(x) = x$$. Show $$\lim_{x \to 1} f(x) = 1$$

Let $$\varepsilon > 0$$

for $$x \in \mathbb{R}$$ s.t. $$|x - 1| < \varepsilon$$ we have $$|f(x) - 1| = |x - 1| < \varepsilon$$

3. Let $$f(x) = x^2$$ Show $$\lim_{x \to a} f(x) = a^2$$

Let $$\varepsilon > 0$$

$$|x^2 - a^2| = |x-a||x+a|$$

If $$|x - a| \leq 1$$, $$|x| \leq 1 + |a|$$,

so $$|x + a| \leq 1 + 2 |a|$$

$$|x^2 - a^2| \leq |x-a|(1 + 2 |a|)$$

Let $$\zeta = \min(1, \frac{\varepsilon}{1 + 2 |a|})$$

For $$x \in \mathbb{R}$$ s.t. $$|x - a| \leq \zeta$$,

we have $$|x^2 - a^2| < \varepsilon$$
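The choice $$\zeta = \min(1, \frac{\varepsilon}{1 + 2|a|})$$ can be checked empirically (a sketch; `works` is our helper, sampling points within $$\zeta$$ of $$a$$):

```python
def works(a, eps, samples=201):
    """Check |x^2 - a^2| <= eps at sample points with |x - a| <= zeta."""
    zeta = min(1.0, eps / (1 + 2 * abs(a)))
    for i in range(samples):
        x = (a - zeta) + 2 * zeta * i / (samples - 1)  # sweep [a-zeta, a+zeta]
        if abs(x * x - a * a) > eps:
            return False
    return True

assert works(3.0, 1e-3)
assert works(-5.0, 1e-6)
```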

4. Let $$f(x) = \frac{1}{x}, x > 0$$. Prove $$\lim_{x \to a} \frac{1}{x} = \frac{1}{a}$$

$$|\frac{1}{x} - \frac{1}{a}| = \frac{|a - x|}{|a||x|}$$

For $$|x - a| < \frac{a}{2}, x > \frac{a}{2}$$

we have $$|\frac{1}{x} - \frac{1}{a}| \leq \frac{2}{a^2} |a - x|$$

Let $$\zeta = \min(\frac{a}{2}, \frac{\varepsilon a^2}{2})$$

if $$|x - a| \leq \zeta$$

then $$|\frac{1}{x} - \frac{1}{a}| \leq \varepsilon$$

Let $$A \subset \mathbb{R}$$ be a nonempty set and $$f: A \rightarrow \mathbb{R}$$

Let $$c$$ be a cluster point of $$A$$. Then the following conditions are equivalent:

1. $$\lim_{x \to c} f(x) = L$$
2. For every sequence $$(x_n)$$ in A where $$x_n \neq c$$ and $$\lim_{n \to \infty} x_n = c$$,

then $$\lim_{n \to \infty} f(x_n) = L$$

Proof

Let $$A \subset \mathbb{R}$$ be nonempty, $$f:A \rightarrow \mathbb{R}$$, and $$c \in \mathbb{R}$$ be a cluster point of A

1. Assume that you can find a sequence $$(x_n)$$ in A, $$x_n \neq c$$ and $$x_n \rightarrow c$$, but $$f(x_n)$$ is not convergent

Then $$\lim_{x \to c} f(x)$$ does not exist

1. Let $$f(x) = \frac{1}{x}, x > 0$$

Let $$x_n = \frac{1}{n}$$. We have $$x_n \to 0$$, but $$f(x_n) = n$$ is not convergent, so

$$\lim_{x \to 0} f(x)$$ does not exist

2. Show the function does not converge

$f(x) = \begin{cases} 1 & x \in \mathbb{Q} \\ 0 & \text{else} \end{cases}$

Let $$(x_n) = \frac{1}{n}$$. $$\lim_{n \to \infty} f(x_n) = 1$$

Let $$(y_n) = \frac{\sqrt{2}}{n}$$. $$\lim_{n \to \infty} f(y_n) = 0$$

Since the two sequences give different limits, $$\lim_{x \to 0} f(x)$$ does not exist

## Limit Theorems

Let $$A \subset \mathbb{R}$$ be a nonempty set. Let $$c \in \mathbb{R}$$ be a cluster point of $$A$$. Let $$f: A \longrightarrow \mathbb{R}$$

Then $$f$$ is bounded in a neighborhood of $$c$$ if there exist $$M > 0$$ and $$\zeta > 0$$

s.t. if $$x\in[c - \zeta, c + \zeta] \cap A$$

$$|f(x)| \leq M$$

1. Let $$f(x) = x^2$$. For $$x \in [-1,1]$$, $$f(x) \leq 1$$ so $$f$$ is bounded in the neighborhood of 0

2. Let $$f(x) = \frac{1}{x}, x > 0$$. Then $$f$$ is not bounded in any neighborhood of $$0$$

Let $$f: A \longrightarrow \mathbb{R}$$ Let $$c$$ be a cluster point of $$A$$.

If $$\lim_{x \to c} f(x)$$ exists, then $$f$$ is bounded in a neighborhood of $$c$$

Proof

Let $$f,g: A \longrightarrow \mathbb{R}$$

• $$(f + g)(x) = f(x) + g(x)$$
• $$(fg)(x) = f(x)g(x)$$
• if $$\forall x \in A, g(x) \neq 0$$

$$\frac{f}{g}(x) = \frac{f(x)}{g(x)}$$

Let $$f,g: A \longrightarrow \mathbb{R}$$

Let $$c \in \mathbb{R}$$ be a cluster point of $$A$$

If $$\lim_{x \to c} f(x)$$ and $$\lim_{x \to c} g(x)$$ exist, then

$$\lim_{x \to c} [f(x) + g(x)] = \lim_{x \to c} f(x) + \lim_{x \to c} g(x)$$

Proof

Let $$f,g: A \to \mathbb{R}$$. Let $$c \in \mathbb{R}$$ be a cluster point of $$A$$.

If $$\lim_{x \to c} f(x)$$ and $$\lim_{x \to c} g(x)$$ exist, then

$$\lim_{x \to c} f(x)g(x) = \lim_{x \to c} f(x) \lim_{x \to c} g(x)$$

1. Find $$\lim_{x \to 1} \frac{x^2 + 1}{x}$$

$$\lim_{x \to 1} \frac{x^2 + 1}{x} = \frac{\lim_{x \to 1} x^2 + 1}{\lim_{x \to 1} x} = 2$$

2. Let $$f(x) = x + \mathbb{1}_{\mathbb{Q}}$$

The limit of $$x$$ exists at every point, but the limit of $$\mathbb{1}_{\mathbb{Q}}$$ exists nowhere, so the limit of $$f(x)$$ does not exist

3. Let $$f(x) = x \mathbb{1}_{\mathbb{Q}}$$

Since $$\mathbb{1}_{\mathbb{Q}}$$ is bounded and $$x \to 0$$, the squeeze theorem gives $$\lim_{x \to 0} f(x) = 0$$ (the limit exists only at $$0$$)

## Squeeze Theorem

Let $$f,g,h: A \to \mathbb{R}$$. Let $$c \in \mathbb{R}$$ be a cluster point of $$A$$.

Assume $$f(x) \leq g(x) \leq h(x)$$ for all $$x \in A$$.

If $$\lim_{x \to c} f(x) = \lim_{x \to c} h(x) = L$$, then $$\lim_{x \to c} g(x) = L$$

Proof
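A classic illustration of the squeeze theorem (our example, not from the notes): $$-|x| \leq x \sin(\frac{1}{x}) \leq |x|$$ for $$x \neq 0$$, and both bounds tend to $$0$$, so the middle function does too.

```python
import math

def g(x):
    """x * sin(1/x), squeezed between -|x| and |x| for x != 0."""
    return x * math.sin(1.0 / x)

# the squeeze bounds hold at many sample points approaching 0
for k in range(1, 5000):
    x = 1.0 / k
    assert -abs(x) <= g(x) <= abs(x)

# so g(x) -> 0 as x -> 0, even though sin(1/x) itself has no limit
assert abs(g(1e-8)) <= 1e-8
```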

(For sequences: $$(x_n)$$ properly diverges to $$\infty$$ if for every $$M > 0$$ there is $$K$$ with $$x_n > M$$ for all $$n > K$$.)

## Proper Divergence

Let $$f: A \to \mathbb{R}$$ Let $$c \in \mathbb{R}$$ be a cluster point of $$A$$

We say that $$\lim_{x \to c} f(x) = \infty$$

if $$\forall M \geq 0$$, $$\exists \zeta > 0$$, s.t. for $$x \in [c - \zeta, c + \zeta] \cap A, x \neq c$$

$$f(x) \geq M$$

1. $$f(x) = \frac{1}{x}, x > 0$$

$$\lim_{x \to 0} f(x) = \infty$$

2. Let $$f(x) = \frac{1}{x}, x < 0$$

$$\lim_{x \to 0} f(x) = -\infty$$

Note: changing the domain $$A$$ can change the limit

Let $$f: A \to \mathbb{R}$$. Let $$c \in \mathbb{R}$$ be a cluster point of $$A$$

then $$\lim_{x \to c} f(x) = \infty$$ iff

for every sequence $$(x_n) \in A$$ converging to $$c$$, $$x_n \neq c$$

$$\lim_{n \to \infty} f(x_n) = \infty$$

Proof

## Continuous Functions

Continuous at point

Let $$f: A \to \mathbb{R}$$ and $$c \in A$$.

$$f$$ is continuous at $$c$$ if for every $$\varepsilon > 0$$ there exists $$\zeta > 0$$ such that

$$\forall x \in A, |x - c| \leq \zeta$$ implies $$|f(x) - f(c)| \leq \varepsilon$$

If $$c \in A$$ is a cluster point of A,

then $$f$$ is a continuous function at $$c$$ iff $$\lim_{x \to c} f(x) = f(c)$$

If $$c \in A$$ is not a cluster point, then $$f$$ is automatically continuous at $$c$$

1. Let $$f(x) = x, x \in \mathbb{Z}$$.

Then $$f$$ is continuous at every $$x \in \mathbb{Z}$$, since no point of $$\mathbb{Z}$$ is a cluster point

2. Let

$f(x) = \begin{cases} 1 & x \geq 0 \\ 0 & x < 0 \end{cases}$

Then $$f$$ is not continuous at 0

3. Let

$f(x) = \begin{cases} 1 & x \in \mathbb{Q} \\ 0 & x \notin \mathbb{Q} \end{cases}$

f(x) is nowhere continuous

4. Let

$f(x) = \begin{cases} x & x \in \mathbb{Q} \\ 0 & x \notin \mathbb{Q} \end{cases}$

from the squeeze theorem, $$f$$ is continuous at 0

but if $$x \neq 0$$, then the limit does not exist and $$f$$ is not continuous

Continuity of points

Let $$f: A \to \mathbb{R}$$

Let $$c \in A$$, then $$f$$ is continuous at $$c$$ iff

for every $$(x_n)$$ s.t. $$\lim_{n\to \infty} x_n = c$$, we have $$\lim_{n\to \infty} f(x_n) = f(c)$$

$$f$$ is not continuous at $$c$$, iff there exists $$(x_n)$$ that converges to c

where $$f(x_n)$$ does not converge to $$f(c)$$

Continuity of functions

Let $$f:A \to \mathbb{R}$$

Let $$B \subset A$$

We say $$f$$ is continuous on $$B$$ if $$f$$ is continuous at every $$x \in B$$

1. Let $$f(x) = 1, x > 0$$

$$f$$ is continuous on $$(0, \infty)$$

Let $$f:A \to \mathbb{R}$$ and $$c \in \mathbb{R}$$ be a cluster point of $$A$$.

Assume $$c \notin A$$, but $$\lim_{x \to c} f(x)$$ exists. Then we can define

$$\tilde{f}(x) = \begin{cases} f(x) & x \in A \\ \lim_{t \to c} f(t) & x = c \end{cases}$$

$$\tilde{f}$$ "fills in the hole" at $$c$$

## Combinations of continuous functions

Let $$c \in A$$ and let $$f,g:A \to \mathbb{R}$$ be continuous at $$c$$

Then $$f+g$$, $$fg$$, and $$\frac{f}{g}$$ (if $$g(c) \neq 0$$) are continuous at $$c$$

Proof
1. Let $$f$$ be any polynomial in $$\mathbb{R}$$

Since $$f$$ is a sum of products of $$x$$ and $$x$$ is continuous on $$\mathbb{R}$$, $$f$$ is also continuous on $$\mathbb{R}$$

Let $$f:A \to \mathbb{R},g:B \to \mathbb{R}$$

Assume for $$x \in B$$, $$g(x) \in A$$

We define $$(f \circ g)(x) = f(g(x))$$

If $$g$$ is continuous at $$c$$ and $$f$$ is continuous at $$g(c)$$, then $$f \circ g$$ is continuous at $$c$$

Let $$A,B \subseteq \mathbb{R}$$. Let $$f$$ be continuous on $$A$$, and $$g$$ be continuous on $$B$$

If $$f(A) \subseteq B$$, then $$g \circ f$$ is continuous on $$A$$


## Intervals

An interval is a set $$I \subseteq \mathbb{R}$$ such that for any $$a, b \in I$$, $$[a,b] \subseteq I$$

A continuous function on a closed bounded interval $$I$$ is bounded on $$I$$ (and so admits a supremum and infimum there)

Assume that $$f$$ is not bounded on $$I$$. We can assume $$f$$ has no upper bound.

$$n$$ is not an upper bound for $$f$$, so for each $$n$$ we can find $$x_n \in I$$ s.t. $$f(x_n) \geq n$$

$$(x_n)$$ is bounded (it lies in $$I$$), therefore by Bolzano-Weierstrass it admits a convergent subsequence $$(x_m)$$

Let $$x^* = \lim x_m$$

Since I is closed, $$x^* \in I$$

Since $$f$$ is continuous at $$x^*$$, $$\lim f(x_m) = f(x^*)$$

This contradicts $$f(x_m) \geq m$$, which would force $$f(x_m) \to \infty$$

So $$f$$ admits a supremum on $$I$$

The proof is similar for infimum

The supremum/infimum of a continuous function on a closed bounded interval is attained (it is a max/min)

Let $$I = [a,b]$$. Let $$f:I \to \mathbb{R}$$ be a continuous function.

Let $$M = \sup \{f(x) : x \in I\}$$, $$S = \inf \{f(x) : x \in I\}$$

$$\exists x_S, x_M \in I$$ s.t. $$f(x_S) = S, f(x_M) = M$$

Proof
1. Let $$f(x) = x^2, I = [-1, 1]$$

$$\sup f = 1 = f(1)$$

$$\inf f = 0 = f(0)$$

2. Let $$f(x) = x, I = (0,1)$$

$$\inf f = 0$$, but there is no $$x \in I$$ such that $$f(x) = 0$$

Intermediate Value Theorem

Let $$I$$ be an interval and $$f:I \to \mathbb{R}$$ be continuous

Let $$a,b \in I$$ s.t. $$a \leq b$$ and $$f(a) \leq f(b)$$

Let $$\lambda \in [f(a), f(b)]$$. We can always find $$c \in [a,b]$$

s.t. $$f(c) = \lambda$$

Proof

Zero crossing corollary

Let $$f:[a,b] \to \mathbb{R}$$ be continuous

If $$f(a) < 0, f(b) > 0$$ then $$\exists c \in (a,b)$$ s.t. $$f(c) = 0$$
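The zero-crossing corollary can be made constructive by bisection (a sketch; `bisect` is our helper): halving the interval while keeping a sign change traps a root.

```python
def bisect(f, a, b, tol=1e-12):
    """Assumes f continuous with f(a) < 0 < f(b); returns c with f(c) near 0."""
    assert f(a) < 0 < f(b)
    while b - a > tol:
        m = (a + b) / 2
        if f(m) < 0:
            a = m  # sign change is in [m, b]
        else:
            b = m  # sign change is in [a, m]
    return (a + b) / 2

# f(x) = x^2 - 2 has f(0) < 0 < f(2), so a root lies in (0, 2)
root = bisect(lambda x: x * x - 2, 0.0, 2.0)
assert abs(root - 2 ** 0.5) < 1e-9
```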

Let $$f:\mathbb{R} \to \mathbb{R}$$ be continuous. Let $$I \subseteq \mathbb{R}$$ be an interval.

Then $$f(I) = \{f(x) \mid x \in I\}$$ is an interval

Proof

## Uniform continuity

$$f: A \to \mathbb{R}$$ is uniformly continuous if

for all $$\varepsilon > 0$$ there exists $$\delta > 0$$ such that for all $$x_1, x_2 \in A$$,

$$|x_1 - x_2| \leq \delta$$ implies $$|f(x_1) - f(x_2)| \leq \varepsilon$$

Lipschitz

Let $$f:A \to \mathbb{R}$$

$$f$$ is Lipschitz if there exists $$K \geq 0$$ such that

for all $$x, y \in A$$

$$|f(x) - f(y)| \leq K|x - y|$$

1. Let $$f(x) = x$$.

$$|f(x) - f(y)| \leq 1 \cdot |x - y|$$, so $$f$$ is Lipschitz

2. Let $$f(x) = \sqrt{x}$$

$$|f(x) - f(y)| = |\sqrt{x} - \sqrt{y}| = \frac{1}{\sqrt{x} + \sqrt{y}} |x - y|$$

The factor $$\frac{1}{\sqrt{x} + \sqrt{y}}$$ is unbounded as $$x, y \to 0$$, so $$f$$ is not Lipschitz on $$[0, \infty)$$.

If instead $$x,y \in [a, \infty)$$ for some $$a > 0$$, then

$$|f(x) - f(y)| \leq \frac{1}{2\sqrt{a}} |x - y|$$

so $$f$$ is Lipschitz on $$[a, \infty)$$

3. Let $$f(x) = x^2$$

$$|f(x) - f(y)| = |x^2 - y^2| = |x - y||x + y|$$.

$$f$$ is not Lipschitz on $$\mathbb{R}$$, since $$|x + y|$$ is unbounded

but it is Lipschitz on any bounded interval, where $$|x + y|$$ is bounded by some constant $$K$$

Any Lipschitz function is uniformly continuous

Proof

The converse is not true, as $$f(x) = \sqrt{x}$$ is uniformly continuous, but not Lipschitz.

ie. $$\sqrt{x}$$ is uniformly continuous on the bounded interval $$[0,1]$$, but its derivative is unbounded ($$\lim_{x \to 0^+} f'(x) = \infty$$)

If a Lipschitz function is differentiable, its derivative is bounded by $$K$$

Proof

## Monotonic functions

Increasing and decreasing functions

Let $$f:A \to \mathbb{R}$$

then $$f$$ is increasing if for all $$x,y \in A$$ with $$x \leq y$$, we have $$f(x) \leq f(y)$$

the definition is similar for strictly increasing/decreasing

1. Let $$f(x) = x$$

$$f$$ is strictly increasing

2. Let $$f(x) = x^2$$

$$f$$ is strictly increasing on the domain $$x \geq 0$$

The left and right limits exist for all points in monotonic functions.

Let $$f:I \to \mathbb{R}$$ be monotonic.

Let $$c \in I$$ which is not an endpoint of $$I$$.

Then $$\lim_{x \to c^+} f(x)$$ and $$\lim_{x \to c^-} f(x)$$ exist

Proof

## Inverse functions

Inverse The function $$g:f(A) \to A$$ such that $$g(f(a)) = a$$ for $$a \in A$$

A function $$f$$ has an inverse if $$f$$ is injective. ($$f$$ uniquely maps values in its domain)

ie. $$x \neq y$$ implies $$f(x) \neq f(y)$$

Let $$f:A \to \mathbb{R}$$ be a strictly increasing function.

Then $$f$$ is injective and its inverse $$g:f(A) \to A$$ is strictly increasing.

Proof

Let $$f$$ be a strictly increasing and continuous function. Then $$f^{-1}$$ is strictly increasing and continuous.

Proof

# Derivatives

Differentiability

We say that a function $$f:I \to \mathbb{R}$$ is differentiable at $$c \in I$$ if

$\lim_{x \to c} \dfrac{f(x) - f(c)}{x - c}$

exists. We call this $$f'(c)$$

1. Let $$f(x) = k$$ for a constant $$k \in \mathbb{R}$$

$$\frac{f(x) - f(c)}{x - c} = 0$$

so $$f'(c) = 0$$ for every $$c$$

2. Let $$f(x) = x$$

$$\frac{f(x) - f(c)}{x - c} = \frac{x - c}{x - c} = 1$$

so $$f'(c) = 1$$

3. Let $$f(x) = x^2$$

$$\frac{f(x) - f(c)}{x - c} = \frac{x^2 - c^2}{x - c} = \frac{(x - c)(x + c)}{x - c} = x + c$$

so $$f'(c) = \lim_{x \to c} (x + c) = 2c$$

4. Let $$f(x) = x^3$$

$$\frac{f(x) - f(c)}{x - c} = \frac{x^3 - c^3}{x - c} = \frac{(x - c)(x^2 + xc + c^2)}{x - c} = x^2 + xc + c^2$$

so $$f'(c) = \lim_{x \to c} (x^2 + xc + c^2) = 3c^2$$
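The difference quotients for $$x^3$$ can be checked numerically (a sketch; `quotient` is our helper): the quotient equals $$x^2 + xc + c^2$$, which tends to $$3c^2$$.

```python
def quotient(c, h):
    """Difference quotient (x^3 - c^3)/(x - c) at x = c + h."""
    x = c + h
    return (x ** 3 - c ** 3) / (x - c)

c = 2.0
for h in (1e-2, 1e-4, 1e-6):
    # the error is exactly 3*c*h + h^2, shrinking linearly with h
    assert abs(quotient(c, h) - 3 * c ** 2) < 7 * h
```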

If a function is differentiable at c, it is continuous at c

Proof

Sum Rule

The derivative of a sum of functions is the sum of the derivative of the functions

$$(f + g)'(x) = f'(x) + g'(x)$$

Proof

Product Rule

$$(fg)'(c) = f'(c)g(c) + f(c)g'(c)$$

Proof

Quotient Rule

Let $$f$$ be a differentiable function which is not zero on a neighborhood around $$c$$

Then $$(\frac{1}{f})'(c) = -\frac{f'(c)}{f(c)^2}$$

Proof

Let $$f,g$$ be differentiable functions where $$g$$ is not zero on a neighborhood around $$c$$

then $$\left(\frac{f}{g}\right)'(c) = \frac{f'(c)g(c) - f(c)g'(c)}{g(c)^2}$$

Proof

The Carathéodory Lemma is needed to prove the next theorem.

Carathéodory Lemma

Let $$f:I \to \mathbb{R}$$ be continuous.

$$f$$ is differentiable at $$c \in I$$ iff

there exists $$\phi:I \to \mathbb{R}$$, continuous at $$c$$, such that

$$f(x) - f(c) = \phi(x)(x - c)$$ for all $$x \in I$$. In that case, $$\phi(c) = f'(c)$$

Proof

Chain rule

Let $$f:I \to \mathbb{R}$$ be differentiable at $$c$$ and $$g:f(I) \to \mathbb{R}$$ be differentiable at $$f(c)$$.

then $$(g \circ f)'(c) = g'(f(c))f'(c)$$

Proof

Derivative of inverse

Let $$f:I \to \mathbb{R}$$ be strictly monotonic and continuous, and differentiable at $$c$$ with $$f'(c) \neq 0$$.

then $$(f^{-1})'(f(c)) = \frac{1}{f'(c)}$$

Proof

Relative Extrema

Let $$f: I \to \mathbb{R}$$ be continuous.

We say $$f$$ has a relative minimum at $$x_0 \in I$$ if there exists $$\zeta > 0$$ s.t.

for $$x \in [x_0 - \zeta, x_0 + \zeta ] \cap I$$

$$f(x) \geq f(x_0)$$

Note that by the above definition, constant functions have relative maxima and minima everywhere.

Derivative at relative extrema

Let $$f: I \to \mathbb{R}$$, $$x_0 \in I$$, where $$x_0$$ is not a boundary point

Assume $$x_0$$ is a relative extremum and $$f$$ is differentiable at $$x_0$$

Then $$f'(x_0) = 0$$

Proof

Note that the converse is not true. $$\frac{d}{dx}\left[ x^3 \right]_{x = 0} = 0$$, but $$x = 0$$ is not an extremum

1. Let $$f(x) = |x|$$

There is a relative minimum at 0, but the function is not differentiable

Rolle's Theorem

Let $$f:[a,b] \to \mathbb{R}$$ be continuous on $$[a,b]$$ and differentiable on $$(a,b)$$.

If $$f(a) = f(b)$$, where $$a < b$$

then $$f'(c) = 0$$ for some $$c \in (a,b)$$


Proof

Mean Value Theorem

Let $$f:I \to \mathbb{R}$$ be continuous on $$[a,b]$$ and differentiable on $$(a,b)$$, where $$a < b$$

then there is a point $$c \in (a,b)$$ such that $$f'(c) = \frac{f(b) - f(a)}{b - a}$$

Proof

Derivative is zero implies constant function

Let $$f:I \to \mathbb{R}$$ be differentiable such that $$f'(x) = 0$$ for all $$x$$

Then $$f(x)$$ is constant on $$I$$

Proof

Nonnegative derivative implies increasing function

Let $$f:I \to \mathbb{R}$$ be differentiable.

$$f$$ is increasing iff $$f'(x) \geq 0$$

Proof

## First Derivative Test for Extrema

Let $$f:I \to \mathbb{R}$$

If $$f'(c) = 0$$ and $$f'(x) \geq 0$$ for $$x \in (c - \delta, c]$$

and $$f'(x) \leq 0$$ for $$x \in [c,c + \delta)$$

then $$c$$ is a relative maximum.

# L'Hôpital's Rule

Indeterminate limit

Let $$f,g:(a,b) \to \mathbb{R}$$ be continuous.

If $$\lim_{x \to a+} f(x) = 0$$ and $$\lim_{x \to a+} g(x) = 0$$

Then we say $\lim_{x \to a+} \frac{f(x)}{g(x)}$

is indeterminate

1. If it exists, evaluate $$\lim_{x \to 0} \frac{\sin(x)}{x}$$

$$\frac{\sin(x)}{x} = \frac{\sin(x) - \sin(0)}{x - 0}$$

Since $$\sin(x)$$ is differentiable at 0, $$\lim_{x \to 0} \frac{\sin(x)}{x} = \cos(0) = 1$$

2. If it exists, evaluate $$\lim_{x \to 0} \frac{x}{x^2}$$

The form is $$\frac{0}{0}$$, so it is indeterminate, but

$$\lim_{x \to 0^+} \frac{x}{x^2} = \lim_{x \to 0^+} \frac{1}{x} = \infty$$

so the limit does not exist

L'Hôpital's Rule #1

Let $$f,g:I \to \mathbb{R}$$ be continuous.

Assume $$f(a) = g(a) = 0$$ and that $$f,g$$ are differentiable at $$a$$

and $$g'(a) \neq 0$$

we have $\lim_{x \to a} \frac{f(x)}{g(x)} = \frac{f'(a)}{g'(a)}$

Proof
1. If it exists, compute $$\lim_{x \to 0} \frac{1 - \cos(x)}{x}$$

Let $$f(x) = 1 - \cos(x)$$, $$g(x) = x$$

Then $$f'(0) = 0$$, $$g'(0) = 1$$

so by L'Hôpital's rule, $$\lim_{x \to 0} \frac{1 - \cos(x)}{x} = \frac{f'(0)}{g'(0)} = 0$$

2. If it exists $$\lim_{x \to 0} \frac{1 - \cos(x)}{x^2}$$

We apply L'Hôpital's rule twice:

$$f(x) = 1 - \cos(x)$$, $$f'(x) = \sin(x)$$, $$f''(x) = \cos(x)$$

$$g(x) = x^2$$, $$g'(x) = 2x$$, $$g''(x) = 2$$

so $$\lim_{x \to 0} \frac{1 - \cos(x)}{x^2} = \frac{f''(0)}{g''(0)} = \frac{1}{2}$$
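A numeric sanity check of this two-step computation (a sketch; `ratio` is our helper): by Taylor expansion, $$\frac{1 - \cos x}{x^2} = \frac{1}{2} - \frac{x^2}{24} + ...$$, so the ratio is within $$x^2$$ of $$\frac{1}{2}$$.

```python
import math

def ratio(x):
    """(1 - cos x) / x^2, which tends to 1/2 as x -> 0."""
    return (1 - math.cos(x)) / (x * x)

for x in (0.1, 0.01, 0.001):
    assert abs(ratio(x) - 0.5) < x * x
```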

L'Hôpital's Rule #2

Let $$f,g:(a,b) \to \mathbb{R}$$ be continuous and differentiable functions.

Assume $$\lim_{x \to a+} f(x) = \lim_{x \to a+} g(x) = 0$$ and $$g'(x) \neq 0$$ for $$x \in (a,b)$$

a) If $$\lim_{x \to a+} \frac{f'(x)}{g'(x)} = L$$, then $$\lim_{x \to a+} \frac{f(x)}{g(x)} = L$$

b) If $$\lim_{x \to a+} \frac{f'(x)}{g'(x)} = \infty$$, then $$\lim_{x \to a+} \frac{f(x)}{g(x)} = \infty$$

This does not require that $$f,g$$ be differentiable at $$a$$

To prove the above, we need the Cauchy Mean Value Theorem

Cauchy Mean Value Theorem

Let $$f,g:I \to \mathbb{R}$$ be differentiable

Let $$a,b \in I, a < b$$

Assume $$g'(x) \neq 0$$, for $$x \in (a,b)$$

You can find $$c \in (a,b)$$ such that

$\frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(c)}{g'(c)}$

Proof
1. Evaluate $$\lim_{x \to 0} \frac{e^x - 1 - x}{x^2}$$

$$f(x) = e^x - 1 - x$$, $$f'(x) = e^x - 1$$, $$f''(x) = e^x$$

$$g(x) = x^2$$, $$g'(x) = 2x$$, $$g''(x) = 2$$

$$\lim_{x \to 0} \frac{f(x)}{g(x)} = \lim_{x \to 0} \frac{f'(x)}{g'(x)} = \lim_{x \to 0} \frac{f''(x)}{g''(x)} = \frac{1}{2}$$

# Riemann Integrals

Partition

Let $$I=[a,b]$$. A partition of $$I$$ is a sequence $$x_0,...,x_n$$

such that $$a = x_0 < ... < x_n = b$$

Tagged partition

A tagged partition is a partition where a tag $$t_i$$ is defined for every interval.

where $$t_i \in [x_{i-1}, x_i]$$

Mesh

The mesh of a partition is defined as $$||\mathcal{P}|| = \max(x_i - x_{i - 1}, 1 \leq i \leq n)$$

Informally, with a uniform partition of width $$\Delta x$$, the integral is

$\lim_{\Delta x \to 0} \sum_{n = 0}^{\frac{b - a}{\Delta x} - 1} f(a + n \Delta x) \, \Delta x$

Riemann Sum

$$S(f, \bar{\cal{P}}) = \sum_{i = 1}^{n} f(t_i)(x_i - x_{i-1})$$

Riemann Integrable We say a function $$f$$ is Riemann integrable if $$\forall \varepsilon > 0$$, you can find $$\delta > 0$$ such that

for every tagged partition $$\bar{\cal{P}}$$ with mesh $$||\bar{\cal{P}}|| \leq \delta$$

you have $$|S(f,\bar{\cal{P}}) - L| \leq \varepsilon$$ and $$L$$ is called the Riemann integral, where $$L = \int_a^b f(x) dx$$

1. Let $$f(x) = 0, x \in [0,1]$$

$$S(f,\cal{\bar{P}}) = 0$$ for any tagged partition.

Therefore $$\int_a^b f(x) dx = 0$$

2. Let $$f(x) = 1, x \in [a,b]$$

\begin{aligned} S(f,\bar{\cal{P}}) & = \sum_{i = 1}^n f(t_i)(x_i - x_{i-1}) \\ & = \sum_{i=1}^n (x_i - x_{i - 1}) \\ & = b - a \end{aligned}

So $$\int_a^b f(x) dx = b - a$$
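A tagged Riemann sum can be sketched in code (our helper `riemann_sum`, using a uniform partition with left-endpoint tags $$t_i = x_{i-1}$$); for $$f(x) = 1$$ it gives $$b - a$$, matching the computation above.

```python
def riemann_sum(f, a, b, n):
    """Riemann sum over a uniform partition of [a, b] with left-endpoint tags."""
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

# constant function: every tagged sum equals b - a
assert abs(riemann_sum(lambda x: 1.0, 0.0, 3.0, 100) - 3.0) < 1e-9
# refining the partition drives the sums toward the integral: here x dx on [0,1]
assert abs(riemann_sum(lambda x: x, 0.0, 1.0, 10000) - 0.5) < 1e-3
```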

Properties of integrals Let $$f,g \in \cal{R}[a,b]$$

1. $$\int_a^b f(x) + g(x) dx = \int_a^b f(x) dx + \int_a^b g(x) dx$$
2. $$\int_a^b kf(x) dx = k\int_a^b f(x) dx$$ for $$k \in \mathbb{R}$$
3. if $$f$$ is Riemann integrable, $$f$$ must be bounded
Proof

If $$f:I \to \mathbb{R}$$ is continuous, then $$f$$ is Riemann integrable.

The converse is not true

If $$f$$ is a monotonic function, then it is Riemann integrable