From 62759ea3f910fae2617d033bf8f878d7574f4edd Mon Sep 17 00:00:00 2001
From: Prefetch
Date: Sun, 7 Nov 2021 19:34:18 +0100
Subject: Expand knowledge base, reorganize measure theory, update gitignore
---
content/know/concept/sigma-algebra/index.pdc | 61 ----------------------------
1 file changed, 61 deletions(-)
diff --git a/content/know/concept/sigma-algebra/index.pdc b/content/know/concept/sigma-algebra/index.pdc
index 96240ff..94e7306 100644
--- a/content/know/concept/sigma-algebra/index.pdc
+++ b/content/know/concept/sigma-algebra/index.pdc
@@ -42,9 +42,6 @@ Likewise, a **sub-$\sigma$-algebra**
is a sub-family of a certain $\mathcal{F}$,
which is a valid $\sigma$-algebra in its own right.
-
-## Notable applications
-
A notable $\sigma$-algebra is the **Borel algebra** $\mathcal{B}(\Omega)$,
which is defined when $\Omega$ is a metric space,
such as the real numbers $\mathbb{R}$.
@@ -54,64 +51,6 @@ and all the subsets of $\mathbb{R}$ obtained by countable sequences
of unions and intersections of those intervals.
The elements of $\mathcal{B}$ are **Borel sets**.
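
For example, every singleton $\{ x \}$ is a Borel set,
since it is a countable intersection of open intervals,
and hence so is every countable set, such as $\mathbb{Q}$:

$$\begin{aligned}
    \{ x \}
    = \bigcap_{n = 1}^{\infty} \Big( x - \frac{1}{n},\: x + \frac{1}{n} \Big)
    \qquad \quad
    \mathbb{Q}
    = \bigcup_{q \in \mathbb{Q}} \{ q \}
\end{aligned}$$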
-
-
-Another example of a $\sigma$-algebra is the **information**
-obtained by observing a [random variable](/know/concept/random-variable/) $X$.
-Let $\sigma(X)$ be the information generated by observing $X$,
-i.e. the events whose occurrence can be deduced from the value of $X$:
-
-$$\begin{aligned}
- \sigma(X)
- = X^{-1}(\mathcal{B}(\mathbb{R}^n))
- = \{ A \in \mathcal{F} : A = X^{-1}(B) \mathrm{\:for\:some\:} B \in \mathcal{B}(\mathbb{R}^n) \}
-\end{aligned}$$
-
-In other words, if the realized value of $X$ is
-found to be in a certain Borel set $B \in \mathcal{B}(\mathbb{R}^n)$,
-then the preimage $X^{-1}(B)$ (i.e. the event yielding this $B$)
-is known to have occurred.
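-
-For example, let $X$ be the number of heads in two coin flips,
-so $\Omega = \{ \mathrm{hh}, \mathrm{ht}, \mathrm{th}, \mathrm{tt} \}$.
-Then $\sigma(X)$ is generated by the preimages
-$X^{-1}(\{ 0 \}) = \{ \mathrm{tt} \}$, $X^{-1}(\{ 1 \}) = \{ \mathrm{ht}, \mathrm{th} \}$
-and $X^{-1}(\{ 2 \}) = \{ \mathrm{hh} \}$:
-observing $X$ reveals whether $\{ \mathrm{ht}, \mathrm{th} \}$ has occurred,
-but cannot distinguish $\mathrm{ht}$ from $\mathrm{th}$,
-so $\{ \mathrm{ht} \} \notin \sigma(X)$.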
-
-Given a $\sigma$-algebra $\mathcal{H}$,
-a random variable $Y$ is said to be *"$\mathcal{H}$-measurable"*
-if $\sigma(Y) \subseteq \mathcal{H}$,
-meaning that $\mathcal{H}$ contains at least
-all information extractable from $Y$.
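-
-For instance, a constant random variable $Y$ generates
-the trivial $\sigma$-algebra $\sigma(Y) = \{ \varnothing, \Omega \}$,
-and is therefore measurable with respect to every $\mathcal{H}$.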
-
-Note that $\mathcal{H}$ can be generated by another random variable $X$,
-i.e. $\mathcal{H} = \sigma(X)$.
-In that case, the **Doob-Dynkin lemma** states
-that $Y$ is $\sigma(X)$-measurable if and only if
-$Y$ can always be computed from $X$,
-i.e. there exists a measurable function $f$ such that
-$Y(\omega) = f(X(\omega))$ for all $\omega \in \Omega$.
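-
-For example, $Y = X^2$ is always $\sigma(X)$-measurable, with $f(x) = x^2$,
-but the converse fails in general:
-if $X$ is a standard normal variable,
-then $X$ is not $\sigma(X^2)$-measurable,
-since the value of $X^2$ does not reveal the sign of $X$.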
-
-
-
-The concept of information can be extended to
-stochastic processes (i.e. time-indexed random variables):
-if $\{ X_t : t \ge 0 \}$ is a stochastic process,
-its **filtration** $\mathcal{F}_t$ contains all
-the information generated by $X_t$ up to the current time $t$:
-
-$$\begin{aligned}
- \mathcal{F}_t
- = \sigma(X_s : 0 \le s \le t)
-\end{aligned}$$
-
-In other words, $\mathcal{F}_t$ is the "accumulated" $\sigma$-algebra
-of all information extractable from the process up to time $t$,
-and hence grows with time: $\mathcal{F}_s \subseteq \mathcal{F}_t$ for $s < t$.
-Given $\mathcal{F}_t$, all values $X_s$ for $s \le t$ can be determined,
-i.e. if you know $\mathcal{F}_t$, then the present and past of the process can be reconstructed.
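-
-For a discrete-time example, let $X_n$ be the outcome of the $n$-th flip of a fair coin.
-Then $\mathcal{F}_2 = \sigma(X_1, X_2)$ contains events like
-$\{ X_1 = \mathrm{heads} \}$ and $\{ X_1 = X_2 \}$,
-but not the future event $\{ X_3 = \mathrm{heads} \}$,
-which only enters the filtration at $n = 3$.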
-
-Given some filtration $\mathcal{H}_t$, a stochastic process $X_t$
-is said to be *"$\mathcal{H}_t$-adapted"*
-if $X_t$'s own filtration satisfies $\sigma(X_s : 0 \le s \le t) \subseteq \mathcal{H}_t$,
-meaning $\mathcal{H}_t$ contains enough information
-to determine the current and past values of $X_t$.
-Clearly, $X_t$ is always adapted to its own filtration.
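-
-For example, the delayed process $Y_t = X_{t/2}$ is $\mathcal{F}_t$-adapted,
-since it only depends on past values of $X_t$,
-but the time-advanced process $Z_t = X_{2t}$ generally is not,
-since it peeks into the future of $X_t$.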
-
## References