author    | Prefetch | 2022-10-20 18:25:31 +0200
committer | Prefetch | 2022-10-20 18:25:31 +0200
commit    | 16555851b6514a736c5c9d8e73de7da7fc9b6288 (patch)
tree      | 76b8bfd30f8941d0d85365990bcdbc5d0643cabc /source/know/concept/hilbert-space
parent    | e5b9bce79b68a68ddd2e51daa16d2fea73b84fdb (diff)
Migrate from 'jekyll-katex' to 'kramdown-math-sskatex'
Diffstat (limited to 'source/know/concept/hilbert-space')
-rw-r--r-- | source/know/concept/hilbert-space/index.md | 100
1 file changed, 50 insertions(+), 50 deletions(-)
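The change below is purely mechanical: every inline `$...$` span in the article becomes `$$...$$`, the delimiter form that kramdown treats as math. As a rough illustration (a sketch of mine, not the tooling actually used for this migration), the conversion could be scripted along these lines:

```python
import re

def to_kramdown_math(text: str) -> str:
    """Rewrite $...$ spans as $$...$$ while leaving existing $$...$$ alone."""
    # A "lone" $ is one not adjacent to another $; matching a lazy pair of
    # them wraps each inline span exactly once, so the function is idempotent.
    single = re.compile(r"(?<!\$)\$(?!\$)(.+?)(?<!\$)\$(?!\$)", re.DOTALL)
    return single.sub(r"$$\1$$", text)

print(to_kramdown_math(r"two vectors $V$ and $W$, denoted $V + W$"))
# two vectors $$V$$ and $$W$$, denoted $$V + W$$
```

The lookarounds skip spans that already use `$$...$$`, so display equations survive the pass untouched.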
diff --git a/source/know/concept/hilbert-space/index.md b/source/know/concept/hilbert-space/index.md
index ef55d2b..57926ce 100644
--- a/source/know/concept/hilbert-space/index.md
+++ b/source/know/concept/hilbert-space/index.md
@@ -14,28 +14,28 @@ abstract **vector space** with a notion of length and angle.
 
 ## Vector space
 
-An abstract **vector space** $\mathbb{V}$ is a generalization of the
+An abstract **vector space** $$\mathbb{V}$$ is a generalization of the
 traditional concept of vectors as "arrows". It consists of a set of objects
 called **vectors** which support the following (familiar) operations:
 
-+ **Vector addition**: the sum of two vectors $V$ and $W$, denoted $V + W$.
-+ **Scalar multiplication**: product of a vector $V$ with a scalar $a$, denoted $a V$.
++ **Vector addition**: the sum of two vectors $$V$$ and $$W$$, denoted $$V + W$$.
++ **Scalar multiplication**: product of a vector $$V$$ with a scalar $$a$$, denoted $$a V$$.
 
-In addition, for a given $\mathbb{V}$ to qualify as a proper vector
+In addition, for a given $$\mathbb{V}$$ to qualify as a proper vector
 space, these operations must obey the following axioms:
 
-+ **Addition is associative**: $U + (V + W) = (U + V) + W$
-+ **Addition is commutative**: $U + V = V + U$
-+ **Addition has an identity**: there exists a $\mathbf{0}$ such that $V + 0 = V$
-+ **Addition has an inverse**: for every $V$ there exists $-V$ so that $V + (-V) = 0$
-+ **Multiplication is associative**: $a (b V) = (a b) V$
-+ **Multiplication has an identity**: There exists a $1$ such that $1 V = V$
-+ **Multiplication is distributive over scalars**: $(a + b)V = aV + bV$
-+ **Multiplication is distributive over vectors**: $a (U + V) = a U + a V$
++ **Addition is associative**: $$U + (V + W) = (U + V) + W$$
++ **Addition is commutative**: $$U + V = V + U$$
++ **Addition has an identity**: there exists a $$\mathbf{0}$$ such that $$V + 0 = V$$
++ **Addition has an inverse**: for every $$V$$ there exists $$-V$$ so that $$V + (-V) = 0$$
++ **Multiplication is associative**: $$a (b V) = (a b) V$$
++ **Multiplication has an identity**: There exists a $$1$$ such that $$1 V = V$$
++ **Multiplication is distributive over scalars**: $$(a + b)V = aV + bV$$
++ **Multiplication is distributive over vectors**: $$a (U + V) = a U + a V$$
 
-A set of $N$ vectors $V_1, V_2, ..., V_N$ is **linearly independent** if
-the only way to satisfy the following relation is to set all the scalar coefficients $a_n = 0$:
+A set of $$N$$ vectors $$V_1, V_2, ..., V_N$$ is **linearly independent** if
+the only way to satisfy the following relation is to set all the scalar coefficients $$a_n = 0$$:
 
 $$\begin{aligned}
     \mathbf{0} = \sum_{n = 1}^N a_n V_n
 \end{aligned}$$
@@ -44,13 +44,13 @@
 In other words, these vectors cannot be expressed in terms of each
 other. Otherwise, they would be **linearly dependent**.
 
-A vector space $\mathbb{V}$ has **dimension** $N$ if only up to $N$ of
+A vector space $$\mathbb{V}$$ has **dimension** $$N$$ if only up to $$N$$ of
 its vectors can be linearly indepedent. All other vectors in
-$\mathbb{V}$ can then be written as a **linear combination** of these $N$ **basis vectors**.
+$$\mathbb{V}$$ can then be written as a **linear combination** of these $$N$$ **basis vectors**.
 
-Let $\vu{e}_1, ..., \vu{e}_N$ be the basis vectors, then any
-vector $V$ in the same space can be **expanded** in the basis according to
-the unique weights $v_n$, known as the **components** of $V$
+Let $$\vu{e}_1, ..., \vu{e}_N$$ be the basis vectors, then any
+vector $$V$$ in the same space can be **expanded** in the basis according to
+the unique weights $$v_n$$, known as the **components** of $$V$$
 in that basis:
 
 $$\begin{aligned}
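As an aside (not part of the commit), the linear-independence condition in the hunk above has a direct computational analogue: stack the vectors as columns of a matrix, and the set is independent exactly when that matrix has full column rank. A minimal numpy sketch:

```python
import numpy as np

# V3 = V1 + 2*V2, so a_1 V1 + a_2 V2 + a_3 V3 = 0 has the nonzero
# solution (a_1, a_2, a_3) = (1, 2, -1): the set is linearly dependent.
V1 = np.array([1.0, 0.0, 2.0])
V2 = np.array([0.0, 1.0, 1.0])
V3 = V1 + 2 * V2

M = np.column_stack([V1, V2, V3])

# The set is independent iff the matrix of column vectors has full rank.
print(np.linalg.matrix_rank(M) == M.shape[1])  # False: dependent
```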
@@ -73,25 +73,25 @@ $$\begin{gathered}
 
 ## Inner product
 
-A given vector space $\mathbb{V}$ can be promoted to a **Hilbert space**
-or **inner product space** if it supports an operation $\Inprod{U}{V}$
+A given vector space $$\mathbb{V}$$ can be promoted to a **Hilbert space**
+or **inner product space** if it supports an operation $$\Inprod{U}{V}$$
 called the **inner product**,
 which takes two vectors and returns a scalar, and has the following properties:
 
-+ **Skew symmetry**: $\Inprod{U}{V} = (\Inprod{V}{U})^*$, where ${}^*$ is the complex conjugate.
-+ **Positive semidefiniteness**: $\Inprod{V}{V} \ge 0$, and $\Inprod{V}{V} = 0$ if $V = \mathbf{0}$.
-+ **Linearity in second operand**: $\Inprod{U}{(a V + b W)} = a \Inprod{U}{V} + b \Inprod{U}{W}$.
++ **Skew symmetry**: $$\Inprod{U}{V} = (\Inprod{V}{U})^*$$, where $${}^*$$ is the complex conjugate.
++ **Positive semidefiniteness**: $$\Inprod{V}{V} \ge 0$$, and $$\Inprod{V}{V} = 0$$ if $$V = \mathbf{0}$$.
++ **Linearity in second operand**: $$\Inprod{U}{(a V + b W)} = a \Inprod{U}{V} + b \Inprod{U}{W}$$.
 
 The inner product describes the lengths and angles of vectors,
 and in Euclidean space it is implemented by the dot product.
 
-The **magnitude** or **norm** $|V|$ of a vector $V$ is given by
-$|V| = \sqrt{\Inprod{V}{V}}$ and represents the real positive length of $V$.
+The **magnitude** or **norm** $$|V|$$ of a vector $$V$$ is given by
+$$|V| = \sqrt{\Inprod{V}{V}}$$ and represents the real positive length of $$V$$.
 A **unit vector** has a norm of 1.
 
-Two vectors $U$ and $V$ are **orthogonal** if their inner product
-$\Inprod{U}{V} = 0$. If in addition to being orthogonal, $|U| = 1$ and
-$|V| = 1$, then $U$ and $V$ are known as **orthonormal** vectors.
+Two vectors $$U$$ and $$V$$ are **orthogonal** if their inner product
+$$\Inprod{U}{V} = 0$$. If in addition to being orthogonal, $$|U| = 1$$ and
+$$|V| = 1$$, then $$U$$ and $$V$$ are known as **orthonormal** vectors.
 
 Orthonormality is desirable for basis vectors, so if they are not
 already like that, it is common to manually turn them into a new
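These axioms are easy to poke at numerically (again my illustration, unrelated to the commit): numpy's `np.vdot` conjugates its first argument, which matches the article's convention of linearity in the second operand and antilinearity in the first.

```python
import numpy as np

U = np.array([1 + 1j, 0 + 2j])
V = np.array([2 - 1j, 1 + 0j])

inner_UV = np.vdot(U, V)  # <U|V> = sum of conj(U_n) * V_n

# Skew symmetry: <U|V> equals the complex conjugate of <V|U>.
print(np.isclose(inner_UV, np.conj(np.vdot(V, U))))  # True

# The norm |V| = sqrt(<V|V>) is real and nonnegative.
print(np.isclose(np.sqrt(np.vdot(V, V).real), np.linalg.norm(V)))  # True
```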
@@ -108,15 +108,15 @@ $$\begin{gathered}
     \Inprod{V}{W} = \sum_{n = 1}^N \sum_{m = 1}^N v_n^* w_m \Inprod{\vu{e}_n}{\vu{e}_j}
 \end{gathered}$$
 
-If the basis vectors $\vu{e}_1, ..., \vu{e}_N$ are already
+If the basis vectors $$\vu{e}_1, ..., \vu{e}_N$$ are already
 orthonormal, this reduces to:
 
 $$\begin{aligned}
     \Inprod{V}{W} = \sum_{n = 1}^N v_n^* w_n
 \end{aligned}$$
 
-As it turns out, the components $v_n$ are given by the inner product
-with $\vu{e}_n$, where $\delta_{nm}$ is the Kronecker delta:
+As it turns out, the components $$v_n$$ are given by the inner product
+with $$\vu{e}_n$$, where $$\delta_{nm}$$ is the Kronecker delta:
 
 $$\begin{aligned}
     \Inprod{\vu{e}_n}{V} = \sum_{m = 1}^N \delta_{nm} v_m = v_n
@@ -125,12 +125,12 @@ $$\begin{aligned}
 
 ## Infinite dimensions
 
-As the dimensionality $N$ tends to infinity, things may or may not
-change significantly, depending on whether $N$ is **countably** or
+As the dimensionality $$N$$ tends to infinity, things may or may not
+change significantly, depending on whether $$N$$ is **countably** or
 **uncountably** infinite.
 
 In the former case, not much changes: the infinitely many **discrete**
-basis vectors $\vu{e}_n$ can all still be made orthonormal as usual,
+basis vectors $$\vu{e}_n$$ can all still be made orthonormal as usual,
 and as before:
 
 $$\begin{aligned}
@@ -141,16 +141,16 @@ A good example of such a countably-infinitely-dimensional basis are the
 solution eigenfunctions of a [Sturm-Liouville problem](/know/concept/sturm-liouville-theory/).
 
 However, if the dimensionality is uncountably infinite, the basis
-vectors are **continuous** and cannot be labeled by $n$. For example, all
-complex functions $f(x)$ defined for $x \in [a, b]$ which
-satisfy $f(a) = f(b) = 0$ form such a vector space.
-In this case $f(x)$ is expanded as follows, where $x$ is a basis vector:
+vectors are **continuous** and cannot be labeled by $$n$$. For example, all
+complex functions $$f(x)$$ defined for $$x \in [a, b]$$ which
+satisfy $$f(a) = f(b) = 0$$ form such a vector space.
+In this case $$f(x)$$ is expanded as follows, where $$x$$ is a basis vector:
 
 $$\begin{aligned}
     f(x) = \int_a^b \Inprod{x}{f} \dd{x}
 \end{aligned}$$
 
-Similarly, the inner product $\Inprod{f}{g}$ must also be redefined as
+Similarly, the inner product $$\Inprod{f}{g}$$ must also be redefined as
 follows:
 
 $$\begin{aligned}
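Two steps in the hunks above invite a quick numerical check (my sketch, not the article's code): the "manual" orthonormalization that the earlier context lines allude to is typically the Gram-Schmidt procedure, and once the basis is orthonormal, each component is recovered as the inner product of the corresponding basis vector with V.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent sequence of vectors."""
    basis = []
    for v in vectors:
        # Remove the projection of v onto every earlier basis vector...
        for e in basis:
            v = v - np.vdot(e, v) * e
        # ...then normalize the remainder to unit length.
        basis.append(v / np.sqrt(np.vdot(v, v).real))
    return basis

e1, e2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])

# With an orthonormal basis, components follow from inner products:
V = 3 * e1 + 2 * e2
print(np.isclose(np.vdot(e1, V), 3.0))  # True
print(np.isclose(np.vdot(e2, V), 2.0))  # True
```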
@@ -158,17 +158,17 @@
 \end{aligned}$$
 
 The concept of orthonormality must be also weakened. A finite function
-$f(x)$ can be normalized as usual, but the basis vectors $x$ themselves
+$$f(x)$$ can be normalized as usual, but the basis vectors $$x$$ themselves
 cannot, since each represents an infinitesimal section of the real line.
 
-The rationale in this case is that action of the identity operator $\hat{I}$ must
+The rationale in this case is that action of the identity operator $$\hat{I}$$ must
 be preserved, which is given here in [Dirac notation](/know/concept/dirac-notation/):
 
 $$\begin{aligned}
     \hat{I} = \int_a^b \Ket{\xi} \Bra{\xi} \dd{\xi}
 \end{aligned}$$
 
-Applying the identity operator to $f(x)$ should just give $f(x)$ again:
+Applying the identity operator to $$f(x)$$ should just give $$f(x)$$ again:
 
 $$\begin{aligned}
     f(x) = \Inprod{x}{f} = \matrixel{x}{\hat{I}}{f}
@@ -176,9 +176,9 @@
     = \int_a^b \Inprod{x}{\xi} f(\xi) \dd{\xi}
 \end{aligned}$$
 
-Since we want the latter integral to reduce to $f(x)$, it is plain to see that
-$\Inprod{x}{\xi}$ can only be a [Dirac delta function](/know/concept/dirac-delta-function/),
-i.e $\Inprod{x}{\xi} = \delta(x - \xi)$:
+Since we want the latter integral to reduce to $$f(x)$$, it is plain to see that
+$$\Inprod{x}{\xi}$$ can only be a [Dirac delta function](/know/concept/dirac-delta-function/),
+i.e $$\Inprod{x}{\xi} = \delta(x - \xi)$$:
 
 $$\begin{aligned}
     \int_a^b \Inprod{x}{\xi} f(\xi) \dd{\xi}
@@ -186,11 +186,11 @@
     = f(x)
 \end{aligned}$$
 
-Consequently, $\Inprod{x}{\xi} = 0$ if $x \neq \xi$ as expected for an
-orthogonal set of vectors, but if $x = \xi$ the inner product
-$\Inprod{x}{\xi}$ is infinite, unlike earlier.
+Consequently, $$\Inprod{x}{\xi} = 0$$ if $$x \neq \xi$$ as expected for an
+orthogonal set of vectors, but if $$x = \xi$$ the inner product
+$$\Inprod{x}{\xi}$$ is infinite, unlike earlier.
 
-Technically, because the basis vectors $x$ cannot be normalized, they
+Technically, because the basis vectors $$x$$ cannot be normalized, they
 are not members of a Hilbert space, but rather of a superset called a
 **rigged Hilbert space**. Such vectors have no finite inner product with
 themselves, but do have one with all vectors from the actual Hilbert
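To see why the continuous basis vectors resist normalization, it helps to discretize the interval (a toy illustration of the hunks above, not from the article): on a grid with spacing dx, the delta normalization of the basis becomes a Kronecker delta divided by dx, whose diagonal entries grow without bound as the grid is refined.

```python
import numpy as np

a, b, N = 0.0, 1.0, 1000
x, dx = np.linspace(a, b, N, retstep=True)

f = np.sin(np.pi * x)      # satisfies f(a) = f(b) = 0
g = np.sin(2 * np.pi * x)

# <f|g> = integral of conj(f(x)) g(x) dx, approximated by a Riemann sum;
# these two functions are orthogonal on [0, 1], so the result is ~0.
print(np.sum(np.conj(f) * g) * dx)

# On the grid, delta(x - xi) becomes the identity matrix divided by dx,
# so the "self inner product" <x|x> ~ 1/dx diverges as dx -> 0.
delta = np.eye(N) / dx
print(np.allclose(delta @ f * dx, f))  # True: the identity is preserved
```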