Sequences
Review the definition of a convergent sequence:
A sequence $\{p_n\}$ in a metric space $X$ is said to converge if there is a point $p \in X$ with the following property: For every $\epsilon > 0,$ there is an integer $N$ such that $n \geq N$ implies that $d(p_n, p) < \epsilon.$
If a sequence $\{p_n\}$ converges to $p,$ we say that $p$ is the limit of $\{p_n\},$ denoted as:
$$ \lim_{n \to \infty} p_n = p. $$
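For example (a standard illustrative choice, not tied to any particular space beyond $R$ with the usual metric), the sequence $p_n = 1/n$ converges to $0:$ given $\epsilon > 0,$ choose an integer $N > 1/\epsilon;$ then $n \geq N$ implies
$$ d(p_n, 0) = \left| \frac{1}{n} - 0 \right| = \frac{1}{n} \leq \frac{1}{N} < \epsilon. $$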
The set of all points $p_n$ of a sequence $\{p_n\} (n = 1, 2, 3, \dots)$ is the range of $\{p_n\}.$
The range of a sequence may be finite or it may be infinite.
The sequence $\{p_n\}$ is said to be bounded if its range is bounded.
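For example, in $R$ the sequence $p_n = (-1)^n$ has the finite range $\{-1, 1\}$ and is bounded; $p_n = 1/n$ has an infinite range but is still bounded; and $p_n = n$ has an infinite range and is not bounded.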
In the following theorems, let $\{p_n\}$ be a sequence in a metric space $X.$
$\{p_n\}$ converges to $p \in X$ if and only if every neighborhood of $p$ contains $p_n$ for all but finitely many $n.$
Suppose $\{p_n\}$ converges to $p \in X,$ and let $B_\epsilon(p)$ be a neighborhood of $p,$ where $\epsilon > 0.$ For some integer $N,$ $d(p, p_n) < \epsilon$ whenever $n \geq N.$ Therefore, $p_n \in B_\epsilon(p)$ for all but the finitely many $p_n$ with $n < N.$
Conversely, suppose every neighborhood of $p$ contains all but finitely many of the $p_n.$ Let $\epsilon > 0$ and consider the neighborhood $B_\epsilon(p).$ Since only finitely many $p_n$ lie outside $B_\epsilon(p),$ there is an integer $N$ such that $p_n \in B_\epsilon(p),$ and hence $d(p, p_n) < \epsilon,$ whenever $n \geq N.$ Therefore, $\{p_n\}$ converges to $p.$
$\square$
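To see the theorem in a concrete case, take $p_n = 1/n \to 0$ in $R$ and the neighborhood $B_{1/10}(0):$ the only terms outside it are $p_1, \dots, p_{10},$ finitely many, and the same happens for every neighborhood of $0.$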
If $p \in X, p' \in X$ and $\{p_n\}$ converges to $p$ and $p',$ then $p = p'.$
Suppose, for contradiction, that $p \neq p'.$ Then, $\epsilon = d(p, p') > 0.$ Let $\delta = \epsilon/2$ and consider the balls $B_\delta(p)$ and $B_\delta(p')$ around $p$ and $p',$ respectively; by the triangle inequality they are disjoint. Since $\{p_n\}$ converges to $p,$ only finitely many points of $\{p_n\}$ are not in $B_\delta(p).$ Because the two balls are disjoint, this means only finitely many points of $\{p_n\}$ are in $B_\delta(p'),$ contradicting the assumption that $\{p_n\}$ also converges to $p'.$ Therefore, our assumption that $p \neq p'$ is incorrect, so $p = p'.$
$\square$
If $\{p_n\}$ converges, then $\{p_n\}$ is bounded.
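Note that the converse fails: for example, $p_n = (-1)^n$ in $R$ is bounded, since its range $\{-1, 1\}$ is bounded, but it does not converge (no ball of radius $1$ can contain both $-1$ and $1,$ yet each occurs infinitely often).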
Suppose $\{p_n\}$ converges to $p,$ and let $\epsilon > 0.$ Only finitely many points of $\{p_n\}$ lie outside of $B_\epsilon(p);$ that is, for some integer $N,$ only the points $p_n$ with $n \leq N$ can lie outside of $B_\epsilon(p).$ Let $\delta = \max\{\epsilon, d(p, p_1), d(p, p_2), \dots, d(p, p_N)\}.$ Then, $d(p, p_n) \leq \delta$ for all $n = 1, 2, 3, \dots,$ so the range of $\{p_n\}$ is bounded.
$\square$
If $E \subset X$ and if $p$ is a limit point of $E,$ then there is a sequence $\{p_n\}$ in $E$ such that $p = \lim_{n \to \infty} p_n.$
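For example, take $E = \{1/n : n = 1, 2, 3, \dots\} \subset R.$ The point $0$ is a limit point of $E,$ and the sequence $p_n = 1/n$ lies in $E$ and converges to $0.$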
Since $p$ is a limit point of $E,$ for each $n = 1, 2, 3, \dots$ there is a point $p_n \in E$ such that $d(p, p_n) < 1/n.$ Let $\epsilon > 0,$ and pick $N$ so that $N \epsilon > 1.$ Then, if $n > N,$ we have $d(p, p_n) < 1/n < 1/N < \epsilon,$ so $\lim_{n \to \infty} p_n = p.$
$\square$
Suppose $\{s_n\}$ and $\{t_n\}$ are complex sequences, and $\lim_{n \to \infty} s_n = s, \lim_{n \to \infty} t_n = t.$ Then
(a) $\lim_{n \to \infty} (s_n + t_n) = s + t;$
(b) $\lim_{n \to \infty} c s_n = cs,$ for any number $c;$
(c) $\lim_{n \to \infty} (c + s_n) = c + s,$ for any number $c;$
(d) $\lim_{n \to \infty} s_n t_n = st;$
(e) $\lim_{n \to \infty} \frac{1}{s_n} = \frac{1}{s},$ provided $s_n \neq 0$ for all $n$ and $s \neq 0.$
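As a quick numerical check of (a) and (d), take $s_n = 2 + 1/n \to 2$ and $t_n = 3 - 1/n \to 3.$ Then
$$ s_n + t_n = 5 \to 5 = s + t, \qquad s_n t_n = 6 + \frac{1}{n} - \frac{1}{n^2} \to 6 = st. $$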
For (a), let $\epsilon > 0,$ pick $N_s$ such that $|s_n - s| < \epsilon/2$ when $n \geq N_s,$ and pick $N_t$ such that $|t_n - t| < \epsilon/2$ when $n \geq N_t.$ Let $N = \max\{N_s, N_t\}.$ Then, when $n \geq N,$
$$ \begin{aligned} |(s_n + t_n) - (s + t) | & = | (s_n - s) + (t_n - t)| \\ & \leq |s_n - s| + |t_n - t| \\ & < \epsilon. \end{aligned} $$
For (b), if $c = 0,$ then for any $\epsilon > 0$ we have $|c s_n - c s| = 0 < \epsilon$ for all $n.$ Otherwise, let $\epsilon > 0.$ Since $s_n \to s,$ there is an $N$ such that, when $n \geq N,$ we have
$$ \begin{aligned} |s_n - s| < \epsilon / |c| & \iff |c| |s_n - s| < \epsilon \\ & \iff |cs_n - cs | < \epsilon. \end{aligned} $$
For (c), let $\epsilon > 0.$ For some $N,$ when $n \geq N,$ we have
$$ |(s_n + c) - (s + c)| = |s_n - s| < \epsilon. $$
For (d), first note the identity
$$ s_n t_n - st = (s_n - s)(t_n - t) + s(t_n - t) + t(s_n - s). \tag{1} $$
Now, let $\epsilon > 0.$ Pick $N_s$ such that $|s_n - s| < \sqrt{\epsilon}$ when $n \geq N_s,$ and pick $N_t$ such that $|t_n - t| < \sqrt{\epsilon}$ when $n \geq N_t.$ Let $N = \max\{N_s, N_t\}.$ Then, when $n \geq N,$
$$ |(s_n - s)(t_n - t)| < \epsilon, $$
which means
$$ \lim_{n \to \infty}(s_n - s)(t_n - t) = 0. $$
Applying this, along with the results of (a) and (b), to (1) gives:
$$ \begin{aligned} \lim_{n \to \infty} (s_n t_n - st) & = \lim_{n \to \infty} ((s_n - s)(t_n - t) + s(t_n - t) + t(s_n - s)) \\ & = \lim_{n \to \infty} (s(t_n - t) + t(s_n - s)) \\ & = \lim_{n \to \infty} (s(t_n - t)) + \lim_{n \to \infty} (t(s_n - s)) \\ & = s \cdot 0 + t \cdot 0 \\ & = 0, \end{aligned} $$
and therefore $\lim_{n \to \infty} s_n t_n = st.$
For (e), pick $m$ such that $|s_n - s| < \frac{1}{2}|s|$ when $n \geq m.$ Then, for $n \geq m,$ the triangle inequality gives
$$ |s_n| \geq |s| - |s_n - s| > |s| - \frac{1}{2}|s| = \frac{1}{2}|s|, $$
so that $\frac{1}{2|s_n|}|s| < 1.$
Now, let $\epsilon > 0.$ There is an integer $N > m$ such that $n \geq N$ implies
$$ |s_n - s | < \frac{1}{2}|s|^2 \epsilon. $$
Thus, when $n \geq N,$
$$ \begin{aligned} \left | \frac{1}{s_n} - \frac{1}{s} \right | & = \left | \frac{s_n - s}{s_n s} \right | \\ & = \frac{|s_n - s|}{|s_n s|} \\ & < \frac{1}{2 |s_n||s|}|s|^2 \epsilon \\ & = \frac{1}{2 |s_n|}|s| \epsilon \\ & < \epsilon. \end{aligned} $$
$\square$
(a) Suppose $\vec{x}_n \in R^k\ (n = 1, 2, 3, \dots)$ and $\vec{x}_n = (\alpha_{1,n}, \dots, \alpha_{k,n}).$
Then, $\{\vec{x}_n\}$ converges to $\vec{x} = (\alpha_1, \dots, \alpha_k)$ if and only if
$$ \lim_{n \to \infty} \alpha_{j,n} = \alpha_j, \quad (1 \leq j \leq k). $$
(b) Suppose $\{\vec{x}_n\}, \{\vec{y}_n\}$ are sequences in $R^k,$ $\{\beta_n\}$ is a sequence of real numbers, and $\vec{x}_n \to \vec{x}, \vec{y}_n \to \vec{y}, \beta_n \to \beta.$ Then
$$ \lim_{n \to \infty} (\vec{x}_n + \vec{y}_n) = \vec{x} + \vec{y}, \quad \lim_{n \to \infty} \vec{x}_n \cdot \vec{y}_n = \vec{x} \cdot \vec{y}, \quad \lim_{n \to \infty} \beta_n \vec{x}_n = \beta \vec{x}. $$
For (a), assume $\vec x_{n} \to \vec x.$ Then, from the definition of the norm,
$$ |\alpha_{j,n} - \alpha_j | \leq |\vec{x}_n - \vec{x} |, $$
that is, the distance from $\alpha_{j,n}$ to $\alpha_j$ is always less than or equal to the distance from $\vec{x}_n$ to $\vec{x}.$ Therefore, given $\epsilon > 0,$ there is an integer $N$ such that $n \geq N$ implies $|\vec{x}_n - \vec{x}| < \epsilon,$ and hence $|\alpha_{j,n} - \alpha_j| < \epsilon.$ Therefore, $\lim_{n \to \infty} \alpha_{j, n} = \alpha_j.$
Conversely, assume $\lim_{n \to \infty} \alpha_{j, n} = \alpha_j.$ Let $\epsilon > 0.$ For some integer $N,$ when $n \geq N$ we have
$$ |\alpha_{j,n} - \alpha_{j}| < \frac{\epsilon}{\sqrt{k}}, \quad (1 \leq j \leq k). $$
Therefore, $n \geq N$ implies that
$$ |\vec{x_n} - \vec{x}| = \sqrt{\sum_{j=1}^k |\alpha_{j,n} - \alpha_j|^2} < \epsilon, $$
so $\vec{x}_n \to \vec{x}. $
Part (b) follows from part (a) together with the limit theorems for sums and products of complex sequences proved above.
$\square$
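For example, in $R^2$ the sequence $\vec{x}_n = \left( \frac{1}{n}, 2 - \frac{1}{n} \right)$ converges to $(0, 2),$ since each component converges: $1/n \to 0$ and $2 - 1/n \to 2.$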
Subsequences
Given a sequence $\{p_n\},$ consider a sequence $\{n_i\}$ of positive integers, such that $n_1 < n_2 < n_3 < \cdots.$ Then the sequence $\{p_{n_i}\}$ is called a subsequence of $\{p_n\}.$
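For example, taking $n_i = 2i$ picks out the even-indexed terms: if $p_n = (-1)^n,$ the resulting subsequence $\{p_{n_i}\}$ is the constant sequence $1, 1, 1, \dots.$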
If a subsequence $\{p_{n_i}\}$ of $\{p_n\}$ converges, its limit is called a subsequential limit of $\{p_n\}.$
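For example, $p_n = \sin(n\pi/2)$ is the sequence $1, 0, -1, 0, 1, 0, -1, \dots,$ and its subsequential limits are $1,$ $0,$ and $-1,$ obtained from the subsequences along $n = 1, 5, 9, \dots,$ $n = 2, 4, 6, \dots,$ and $n = 3, 7, 11, \dots,$ respectively.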
A sequence $\{p_n\}$ converges to $p$ if and only if every subsequence of $\{p_n\}$ converges to $p.$
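In particular, the contrapositive gives a useful divergence test: if two subsequences of $\{p_n\}$ converge to different limits, then $\{p_n\}$ does not converge.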
Suppose that $\{p_n\}$ converges to $p,$ and let $\{p_{n_i}\}$ be any subsequence. Let $\epsilon > 0.$ There is an integer $N$ such that $d(p, p_n) < \epsilon$ whenever $n \geq N.$ Since the indices $n_i$ form a strictly increasing sequence of positive integers, $n_i \geq i,$ so $i \geq N$ implies $n_i \geq N$ and hence $d(p, p_{n_i}) < \epsilon.$ Therefore, every subsequence of $\{p_n\}$ converges to $p.$
Conversely, suppose every subsequence of $\{p_n\}$ converges to $p.$ Then, $\{p_n\}$ is a subsequence of itself, so it converges to $p.$
$\square$
If $\{p_n\}$ is a sequence in a compact metric space $X,$ then some subsequence of $\{p_n\}$ converges to a point in $X.$
Let $E$ be the range of $\{p_n\}.$ If $E$ is finite, then at least one point $p$ in $E$ must be repeated infinitely many times in $\{p_n\}.$ If we let $\{n_i\}$ be the indices of the occurrences of $p$ in $\{p_n\}:$
$$ p_{n_1} = p_{n_2} = \cdots = p, $$
then the subsequence $\{p_{n_i}\}$ converges to $p.$
On the other hand, if $E$ is infinite, then $E$ has a limit point $p \in X,$ since every infinite subset of a compact set has a limit point in it. Pick $n_1$ so that $d(p, p_{n_1}) < 1.$ Now, after picking $n_1, \dots, n_{i-1},$ we can pick $n_i > n_{i-1}$ such that $d(p, p_{n_i}) < 1/i,$ so $\{p_{n_i}\}$ converges to $p.$
$\square$
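The compactness hypothesis matters: in $X = R,$ which is not compact, the sequence $p_n = n$ has no convergent subsequence, since every subsequence is unbounded.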
Every bounded sequence in $R^k$ contains a convergent subsequence.
Note that the range of any bounded sequence $\{p_n\}$ in $R^k$ is contained in some $k$-cell, which is compact. Therefore, $\{p_n\}$ is a sequence in a compact metric space, and by the previous theorem it has a convergent subsequence.
$\square$
The subsequential limits of a sequence $\{p_n\}$ in a metric space $X$ form a closed subset of $X.$
Let $E^*$ be the set of all subsequential limits of $\{p_n\}$ and let $q$ be a limit point of $E^*.$ We want to show that $q \in E^*.$
First, note that if the range of $\{p_n\}$ is just $\{q\},$ then $q$ is the only subsequential limit of $\{p_n\}.$ In this case, $E^* = \{q\}$ is a singleton and is closed, as it vacuously contains all of its limit points. So, assume this is not the case.
Choose $n_1$ so that $p_{n_1} \neq q,$ and let $\delta = d(q, p_{n_1}).$ Suppose $n_1, \dots, n_{i-1}$ are chosen. Since $q$ is a limit point of $E^*,$ there is an $x \in E^*$ with $d(q, x) < \frac{\delta}{2^i}.$ Since $x \in E^*$ and is thus the limit of some subsequence of $\{p_n\},$ there is an $n_i > n_{i-1}$ such that $d(x, p_{n_i}) < \frac{\delta}{2^i}.$ Now, via the triangle inequality,
$$ \begin{aligned} d(q, p_{n_i}) & \leq d(q, x) + d(x, p_{n_i}) \\ & < \frac{\delta}{2^i} + \frac{\delta}{2^i} \\ & = \frac{2\delta}{2^i} = \frac{\delta}{2^{i-1}}, \quad i = 1, 2, 3, \dots. \end{aligned} $$
This means that $\{p_{n_i}\}$ converges to $q:$ since $\delta/2^{i-1} \to 0,$ given any $\epsilon > 0$ we have $d(q, p_{n_i}) < \epsilon$ for all sufficiently large $i.$ Therefore, $q$ is a subsequential limit of $\{p_n\},$ so $q \in E^*,$ and $E^*$ is closed.
$\square$
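For example, if $\{p_n\}$ is an enumeration of the rationals in $[0, 1],$ then every point of $[0, 1]$ is a subsequential limit (each real in $[0, 1]$ is approached by infinitely many of the listed rationals), so $E^* = [0, 1],$ which is indeed a closed subset of $R.$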
The theorem above tells us about the long-term behavior of a sequence, even if it doesn't converge. The set of all subsequential limits of $\{p_n\}$ gives us the set of all points that are approached arbitrarily closely infinitely often by $\{p_n\}.$ It's basically the set of points that $\{p_n\}$ likes to hang out around! $E^*$ being closed means that if there is a point in $X$ that the points of $E^*$ get arbitrarily close to, then it is also a point $\{p_n\}$ likes to hang out around.