author     Prefetch    2022-10-14 23:25:28 +0200
committer  Prefetch    2022-10-14 23:25:28 +0200
commit     6ce0bb9a8f9fd7d169cbb414a9537d68c5290aae (patch)
tree       a0abb6b22f77c0e84ed38277d14662412ce14f39 /source/know/concept/hilbert-space
Initial commit after migration from Hugo
Diffstat (limited to 'source/know/concept/hilbert-space')
-rw-r--r--  source/know/concept/hilbert-space/index.md  196
1 files changed, 196 insertions, 0 deletions
diff --git a/source/know/concept/hilbert-space/index.md b/source/know/concept/hilbert-space/index.md
new file mode 100644
index 0000000..d2b9770
--- /dev/null
+++ b/source/know/concept/hilbert-space/index.md
@@ -0,0 +1,196 @@
+---
+title: "Hilbert space"
+date: 2021-02-22
+categories:
+- Mathematics
+- Quantum mechanics
+layout: "concept"
+---
+
+A **Hilbert space**, also called an **inner product space**, is an
+abstract **vector space** with a notion of length and angle.
+(Strictly speaking, a Hilbert space is a *complete* inner product space,
+but that distinction will not be important here.)
+
+
+## Vector space
+
+An abstract **vector space** $\mathbb{V}$ is a generalization of the
+traditional concept of vectors as "arrows". It consists of a set of
+objects called **vectors** which support the following (familiar)
+operations:
+
++ **Vector addition**: the sum of two vectors $V$ and $W$, denoted $V + W$.
++ **Scalar multiplication**: product of a vector $V$ with a scalar $a$, denoted $a V$.
+
+In addition, for a given $\mathbb{V}$ to qualify as a proper vector
+space, these operations must obey the following axioms:
+
++ **Addition is associative**: $U + (V + W) = (U + V) + W$
++ **Addition is commutative**: $U + V = V + U$
++ **Addition has an identity**: there exists a $\mathbf{0}$ such that $V + \mathbf{0} = V$
++ **Addition has an inverse**: for every $V$ there exists a $-V$ such that $V + (-V) = \mathbf{0}$
++ **Multiplication is associative**: $a (b V) = (a b) V$
++ **Multiplication has an identity**: there exists a $1$ such that $1 V = V$
++ **Multiplication distributes over scalar addition**: $(a + b) V = a V + b V$
++ **Multiplication distributes over vector addition**: $a (U + V) = a U + a V$
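+
+For example, the set of all $N$-tuples of complex numbers $(a_1, ..., a_N)$
+satisfies these axioms under component-wise addition and scalar multiplication,
+and is therefore a vector space:
+
+$$\begin{aligned}
+    (a_1, ..., a_N) + (b_1, ..., b_N) = (a_1 + b_1, ..., a_N + b_N)
+    \qquad
+    c \: (a_1, ..., a_N) = (c \: a_1, ..., c \: a_N)
+\end{aligned}$$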
+
+A set of $N$ vectors $V_1, V_2, ..., V_N$ is **linearly independent** if
+the only way to satisfy the following relation is to set all the scalar coefficients $a_n = 0$:
+
+$$\begin{aligned}
+ \mathbf{0} = \sum_{n = 1}^N a_n V_n
+\end{aligned}$$
+
+In other words, these vectors cannot be expressed in terms of each
+other. Otherwise, they would be **linearly dependent**.
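+
+For instance, in the 2D plane, the "arrows" $V_1 = (1, 2)$ and $V_2 = (2, 4)$
+are linearly dependent, since a nontrivial combination of them
+(here with coefficients $a_1 = 2$ and $a_2 = -1$) yields zero:
+
+$$\begin{aligned}
+    2 V_1 - V_2 = (2, 4) - (2, 4) = \mathbf{0}
+\end{aligned}$$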
+
+A vector space $\mathbb{V}$ has **dimension** $N$ if at most $N$ of
+its vectors can be linearly independent. All other vectors in
+$\mathbb{V}$ can then be written as a **linear combination** of these $N$ **basis vectors**.
+
+Let $\vu{e}_1, ..., \vu{e}_N$ be such a set of basis vectors; then any
+vector $V$ in the same space can be **expanded** in this basis according to
+a unique set of weights $v_n$, known as the **components** of $V$
+in that basis:
+
+$$\begin{aligned}
+ V = \sum_{n = 1}^N v_n \vu{e}_n
+\end{aligned}$$
+
+Using these, the vector space operations can then be implemented as follows:
+
+$$\begin{gathered}
+    V = \sum_{n = 1}^N v_n \vu{e}_n
+    \quad
+    W = \sum_{n = 1}^N w_n \vu{e}_n
+ \\
+ \quad \implies \quad
+ V + W = \sum_{n = 1}^N (v_n + w_n) \vu{e}_n
+ \qquad
+ a V = \sum_{n = 1}^N a v_n \vu{e}_n
+\end{gathered}$$
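+
+As a quick arithmetic example, take $N = 2$ with components
+$(v_1, v_2) = (1, 2)$ and $(w_1, w_2) = (3, -1)$; then:
+
+$$\begin{aligned}
+    V + W = 4 \vu{e}_1 + \vu{e}_2
+    \qquad
+    2 V = 2 \vu{e}_1 + 4 \vu{e}_2
+\end{aligned}$$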
+
+
+## Inner product
+
+A given vector space $\mathbb{V}$ can be promoted to a **Hilbert space**
+or **inner product space** if it supports an operation $\Inprod{U}{V}$
+called the **inner product**, which takes two vectors and returns a
+scalar, and has the following properties:
+
++ **Conjugate symmetry**: $\Inprod{U}{V} = (\Inprod{V}{U})^*$, where ${}^*$ is the complex conjugate.
++ **Positive definiteness**: $\Inprod{V}{V} \ge 0$, and $\Inprod{V}{V} = 0$ if and only if $V = \mathbf{0}$.
++ **Linearity in the second operand**: $\Inprod{U}{(a V + b W)} = a \Inprod{U}{V} + b \Inprod{U}{W}$.
+
+The inner product describes the lengths and angles of vectors, and in
+Euclidean space it is implemented by the dot product.
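+
+Note that combining the first and third properties shows that the inner
+product is *antilinear* in its first operand,
+i.e. scalars are conjugated when pulled out of it:
+
+$$\begin{aligned}
+    \Inprod{(a V + b W)}{U}
+    = \big( \Inprod{U}{(a V + b W)} \big)^*
+    = a^* \Inprod{V}{U} + b^* \Inprod{W}{U}
+\end{aligned}$$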
+
+The **magnitude** or **norm** $|V|$ of a vector $V$ is given by
+$|V| = \sqrt{\Inprod{V}{V}}$, which is a real non-negative number
+representing the length of $V$. A **unit vector** has a norm of 1.
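+
+Any vector $V$ with $|V| \neq 0$ can be **normalized**, i.e. turned into a
+unit vector, by dividing it by its norm. For example, using the dot product
+in the 2D plane, $V = (3, 4)$ has $|V| = 5$, so:
+
+$$\begin{aligned}
+    \vu{V} = \frac{V}{|V|} = \Big( \frac{3}{5}, \frac{4}{5} \Big)
+    \qquad
+    |\vu{V}| = 1
+\end{aligned}$$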
+
+Two vectors $U$ and $V$ are **orthogonal** if their inner product
+$\Inprod{U}{V} = 0$. If in addition to being orthogonal, $|U| = 1$ and
+$|V| = 1$, then $U$ and $V$ are known as **orthonormal** vectors.
+
+Orthonormality is a desirable property for basis vectors, so if a given basis
+is not already orthonormal, it is common to construct a new orthonormal basis
+from it, e.g. using the [Gram-Schmidt method](/know/concept/gram-schmidt-method/).
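+
+As a sketch of the idea, for just two linearly independent vectors $V_1$ and $V_2$:
+first normalize $V_1$, then subtract from $V_2$ its component along $\vu{e}_1$,
+and finally normalize the remainder:
+
+$$\begin{aligned}
+    \vu{e}_1 = \frac{V_1}{|V_1|}
+    \qquad
+    U_2 = V_2 - \Inprod{\vu{e}_1}{V_2} \vu{e}_1
+    \qquad
+    \vu{e}_2 = \frac{U_2}{|U_2|}
+\end{aligned}$$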
+
+In terms of the components in a given basis, the inner product is computed as follows:
+
+$$\begin{gathered}
+ V = \sum_{n = 1}^N v_n \vu{e}_n
+ \quad
+ W = \sum_{n = 1}^N w_n \vu{e}_n
+ \\
+ \quad \implies \quad
+    \Inprod{V}{W} = \sum_{n = 1}^N \sum_{m = 1}^N v_n^* w_m \Inprod{\vu{e}_n}{\vu{e}_m}
+\end{gathered}$$
+
+If the basis vectors $\vu{e}_1, ..., \vu{e}_N$ are already
+orthonormal, this reduces to:
+
+$$\begin{aligned}
+ \Inprod{V}{W} = \sum_{n = 1}^N v_n^* w_n
+\end{aligned}$$
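+
+For example, in an orthonormal basis with $N = 2$, the vector
+$V = (1 + i) \vu{e}_1 + 2 \vu{e}_2$ has the following norm:
+
+$$\begin{aligned}
+    \Inprod{V}{V} = (1 + i)^* (1 + i) + 2^* \cdot 2 = 2 + 4 = 6
+    \quad \implies \quad
+    |V| = \sqrt{6}
+\end{aligned}$$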
+
+As it turns out, for an orthonormal basis, the components $v_n$ are simply
+given by the inner product with $\vu{e}_n$, where $\delta_{nm}$ is the Kronecker delta:
+
+$$\begin{aligned}
+    \Inprod{\vu{e}_n}{V}
+    = \sum_{m = 1}^N v_m \Inprod{\vu{e}_n}{\vu{e}_m}
+    = \sum_{m = 1}^N \delta_{nm} v_m
+    = v_n
+\end{aligned}$$
+
+
+## Infinite dimensions
+
+As the dimensionality $N$ tends to infinity, things may or may not
+change significantly, depending on whether $N$ is **countably** or
+**uncountably** infinite.
+
+In the former case, not much changes: the infinitely many **discrete**
+basis vectors $\vu{e}_n$ can all still be made orthonormal as usual,
+and as before:
+
+$$\begin{aligned}
+ V = \sum_{n = 1}^\infty v_n \vu{e}_n
+\end{aligned}$$
+
+A good example of such a countably infinite basis is the set of
+solution eigenfunctions of a [Sturm-Liouville problem](/know/concept/sturm-liouville-theory/).
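+
+For instance, the eigenfunctions of the simplest such problem,
+$- f'' = \lambda f$ with $f(0) = f(L) = 0$, form a countably infinite
+orthonormal basis for functions on the interval $[0, L]$:
+
+$$\begin{aligned}
+    \vu{e}_n(x) = \sqrt{\frac{2}{L}} \sin\!\Big( \frac{n \pi x}{L} \Big)
+    \qquad
+    n = 1, 2, 3, ...
+\end{aligned}$$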
+
+However, if the dimensionality is uncountably infinite, the basis
+vectors are **continuous** and cannot be labeled by an integer $n$. For example, all
+complex functions $f(x)$ defined for $x \in [a, b]$ which
+satisfy $f(a) = f(b) = 0$ form such a vector space.
+In this case $f$ is expanded by integrating over the label $\xi$ of the
+basis vectors $\vu{e}_\xi$, with the function values $f(\xi) = \Inprod{\xi}{f}$
+acting as the components:
+
+$$\begin{aligned}
+    f = \int_a^b \Inprod{\xi}{f} \: \vu{e}_\xi \dd{\xi}
+\end{aligned}$$
+
+Similarly, the inner product $\Inprod{f}{g}$ must also be redefined as
+follows:
+
+$$\begin{aligned}
+ \Inprod{f}{g} = \int_a^b f^*(x) \: g(x) \dd{x}
+\end{aligned}$$
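+
+To illustrate, with $[a, b] = [0, \pi]$, the functions $f(x) = \sin(x)$ and
+$g(x) = \sin(2 x)$ are orthogonal under this inner product:
+
+$$\begin{aligned}
+    \Inprod{f}{g} = \int_0^\pi \sin(x) \sin(2 x) \dd{x} = 0
+\end{aligned}$$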
+
+The concept of orthonormality must also be weakened. A function $f(x)$
+with finite norm can be normalized as usual, but the basis vectors
+$\vu{e}_\xi$ themselves cannot, since each represents an infinitesimal section of the real line.
+
+The rationale in this case is that the action of the identity operator $\hat{I}$ must
+be preserved, which is given here in [Dirac notation](/know/concept/dirac-notation/):
+
+$$\begin{aligned}
+ \hat{I} = \int_a^b \Ket{\xi} \Bra{\xi} \dd{\xi}
+\end{aligned}$$
+
+Applying the identity operator to $f(x)$ should just give $f(x)$ again:
+
+$$\begin{aligned}
+ f(x) = \Inprod{x}{f} = \matrixel{x}{\hat{I}}{f}
+ = \int_a^b \Inprod{x}{\xi} \Inprod{\xi}{f} \dd{\xi}
+ = \int_a^b \Inprod{x}{\xi} f(\xi) \dd{\xi}
+\end{aligned}$$
+
+Since we want the latter integral to reduce to $f(x)$, it is plain to see that
+$\Inprod{x}{\xi}$ can only be a [Dirac delta function](/know/concept/dirac-delta-function/),
+i.e. $\Inprod{x}{\xi} = \delta(x - \xi)$:
+
+$$\begin{aligned}
+ \int_a^b \Inprod{x}{\xi} f(\xi) \dd{\xi}
+ = \int_a^b \delta(x - \xi) f(\xi) \dd{\xi}
+ = f(x)
+\end{aligned}$$
+
+Consequently, $\Inprod{x}{\xi} = 0$ if $x \neq \xi$, as expected for an
+orthogonal set of vectors, but if $x = \xi$ the inner product
+$\Inprod{x}{\xi}$ is infinite, rather than $1$ as for a discrete orthonormal basis.
+
+Technically, because the basis vectors $\vu{e}_\xi$ cannot be normalized, they
+are not members of a Hilbert space, but rather of a superset called a
+**rigged Hilbert space**. Such vectors have no finite inner product with
+themselves, but do have one with all vectors from the actual Hilbert
+space.
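+
+Indeed, this is exactly what was found above: the "norm" of a basis vector
+diverges, since $\Inprod{\xi}{\xi} = \delta(0) = \infty$, yet its inner product
+with any well-behaved function $f$ in the Hilbert space is finite:
+
+$$\begin{aligned}
+    \Inprod{\xi}{f} = f(\xi)
+\end{aligned}$$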