-rw-r--r--  content/know/_index.md                                 |   9
-rw-r--r--  content/know/concept/calculus-of-variations/index.pdc  | 236
-rw-r--r--  content/know/concept/ehrenfests-theorem/index.pdc      | 137
-rw-r--r--  content/know/concept/heisenberg-picture/index.pdc      | 117
-rw-r--r--  content/know/concept/hilbert-space/index.pdc           |   2
5 files changed, 494 insertions, 7 deletions
diff --git a/content/know/_index.md b/content/know/_index.md
index 41aeef6..8993ee1 100644
--- a/content/know/_index.md
+++ b/content/know/_index.md
@@ -16,14 +16,11 @@ trying to make sense of obscure, poorly explained concepts.
To help me remember what I learn, I write notes in LaTeX.
This knowledge base is based on my notes, and
-is freely available to anyone who might need it.
+is freely available to anyone who needs it.
I hope it helps you in your studies or work.
-Currently there's isn't much here yet,
-but I have over 200 pages of LaTeX waiting to be converted.
-Keep in mind that I'm only human,
-so there are probably mistakes in my work.
-I take no responsibility for any injuries incurred as a consequence.
+Keep in mind that I'm only human, so there are probably mistakes here.
+I take no responsibility for any resulting injuries.
If you're doing something important,
you should check things yourself!
diff --git a/content/know/concept/calculus-of-variations/index.pdc b/content/know/concept/calculus-of-variations/index.pdc
new file mode 100644
index 0000000..fb043e0
--- /dev/null
+++ b/content/know/concept/calculus-of-variations/index.pdc
@@ -0,0 +1,236 @@
+---
+title: "Calculus of variations"
+firstLetter: "C"
+publishDate: 2021-02-24
+categories:
+- Mathematics
+- Physics
+
+date: 2021-02-24T18:50:06+01:00
+draft: false
+markup: pandoc
+---
+
+# Calculus of variations
+
+The **calculus of variations** lays the mathematical groundwork
+for Lagrangian mechanics.
+
+Consider a **functional** $J$, which maps a function $f(x)$ to a scalar value
+by integrating the so-called **Lagrangian** $L$,
+an expression involving $x$, $f$ and the derivative $f'$:
+
+$$\begin{aligned}
+ J[f] = \int_{x_0}^{x_1} L(f, f', x) \dd{x}
+\end{aligned}$$
+
+If $J$ in some way measures the physical "cost" (e.g. energy) of
+the path $f(x)$ taken by a physical system,
+the **principle of least action** states that the actual path $f$ will be
+an extremum (in practice usually a minimum) of $J[f]$,
+so for example the expended energy will be minimized.
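+
+A simple example of such a functional is the arc length of a curve $f(x)$
+between two fixed points, whose minimization gives the shortest path:
+
+$$\begin{aligned}
+ J[f] = \int_{x_0}^{x_1} \sqrt{1 + (f')^2} \dd{x}
+\end{aligned}$$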
+
+If $f(x, \alpha\!=\!0)$ is the optimal route, then a slightly
+different (and therefore worse) path between the same two points can be expressed
+using the parameter $\alpha$:
+
+$$\begin{aligned}
+ f(x, \alpha) = f(x, 0) + \alpha \eta(x)
+ \qquad \mathrm{or} \qquad
+ \delta f = \alpha \eta(x)
+\end{aligned}$$
+
+Where $\eta(x)$ is an arbitrary differentiable deviation.
+Since $f(x, \alpha)$ must start and end at the same points as $f(x, 0)$,
+we have the boundary conditions:
+
+$$\begin{aligned}
+ \eta(x_0) = \eta(x_1) = 0
+\end{aligned}$$
+
+Given $L$, the goal is to find an equation for the optimal path $f(x,0)$.
+Just like when finding the minimum of a real function,
+the minimum $f$ of a functional $J[f]$ is a stationary point
+with respect to the deviation weight $\alpha$,
+a condition often written as $\delta J = 0$.
+In the following, the integration limits have been omitted:
+
+$$\begin{aligned}
+ 0
+ &= \delta J
+ = \pdv{J}{\alpha} \Big|_{\alpha = 0}
+ = \int \pdv{L}{\alpha} \dd{x}
+ = \int \pdv{L}{f} \pdv{f}{\alpha} + \pdv{L}{f'} \pdv{f'}{\alpha} \dd{x}
+ \\
+ &= \int \pdv{L}{f} \eta + \pdv{L}{f'} \eta' \dd{x}
+ = \Big[ \pdv{L}{f'} \eta \Big]_{x_0}^{x_1} + \int \pdv{L}{f} \eta - \frac{d}{dx} \Big( \pdv{L}{f'} \Big) \eta \dd{x}
+\end{aligned}$$
+
+The boundary term from partial integration vanishes due to the boundary
+conditions for $\eta(x)$. We are thus left with:
+
+$$\begin{aligned}
+ 0
+ = \int \eta \bigg( \pdv{L}{f} - \dv{x} \Big( \pdv{L}{f'} \Big) \bigg) \dd{x}
+\end{aligned}$$
+
+This must hold for all possible deviations $\eta$, which is only possible
+if the parenthesized expression is zero everywhere:
+
+$$\begin{aligned}
+ \boxed{
+ 0 = \pdv{L}{f} - \dv{x} \Big( \pdv{L}{f'} \Big)
+ }
+\end{aligned}$$
+
+This is known as the **Euler-Lagrange equation** of the Lagrangian $L$,
+and its solutions represent the optimal paths $f(x, 0)$.
+
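+For example, take $x$ to be time and $f(x)$ the position of a particle
+of mass $m$ in a potential $V(f)$,
+described by the classical Lagrangian $L = \frac{1}{2} m (f')^2 - V(f)$.
+The Euler-Lagrange equation then reduces to Newton's second law:
+
+$$\begin{aligned}
+ 0 = \pdv{L}{f} - \dv{x} \Big( \pdv{L}{f'} \Big) = - \dv{V}{f} - m f''
+\end{aligned}$$
+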
+
+## Multiple functions
+
+Suppose that the Lagrangian $L$ depends on multiple independent functions
+$f_1, f_2, ..., f_N$:
+
+$$\begin{aligned}
+ J[f_1, ..., f_N] = \int_{x_0}^{x_1} L(f_1, ..., f_N, f_1', ..., f_N', x) \dd{x}
+\end{aligned}$$
+
+In this case, every $f_n(x)$ has its own deviation $\eta_n(x)$,
+satisfying $\eta_n(x_0) = \eta_n(x_1) = 0$:
+
+$$\begin{aligned}
+ f_n(x, \alpha) = f_n(x, 0) + \alpha \eta_n(x)
+\end{aligned}$$
+
+The derivation procedure is identical to the case $N = 1$ from earlier:
+
+$$\begin{aligned}
+ 0
+ &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
+ = \int \pdv{L}{\alpha} \dd{x}
+ = \int \sum_{n} \Big( \pdv{L}{f_n} \pdv{f_n}{\alpha} + \pdv{L}{f_n'} \pdv{f_n'}{\alpha} \Big) \dd{x}
+ \\
+ &= \int \sum_{n} \Big( \pdv{L}{f_n} \eta_n + \pdv{L}{f_n'} \eta_n' \Big) \dd{x}
+ \\
+ &= \Big[ \sum_{n} \pdv{L}{f_n'} \eta_n \Big]_{x_0}^{x_1}
+ + \int \sum_{n} \eta_n \bigg( \pdv{L}{f_n} - \frac{d}{dx} \Big( \pdv{L}{f_n'} \Big) \bigg) \dd{x}
+\end{aligned}$$
+
+Once again, $\eta_n(x)$ is arbitrary and disappears at the boundaries,
+so we end up with $N$ equations of the same form as for a single function:
+
+$$\begin{aligned}
+ \boxed{
+ 0 = \pdv{L}{f_1} - \dv{x} \Big( \pdv{L}{f_1'} \Big)
+ \quad \cdots \quad
+ 0 = \pdv{L}{f_N} - \dv{x} \Big( \pdv{L}{f_N'} \Big)
+ }
+\end{aligned}$$
+
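+For example, for a particle moving in a plane, with $x$ as time,
+coordinates $(f_1, f_2)$, and Lagrangian
+$L = \frac{1}{2} m \big( (f_1')^2 + (f_2')^2 \big) - V(f_1, f_2)$,
+these equations give one Newtonian equation of motion per coordinate:
+
+$$\begin{aligned}
+ m f_1'' = - \pdv{V}{f_1}
+ \qquad \quad
+ m f_2'' = - \pdv{V}{f_2}
+\end{aligned}$$
+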
+
+## Higher-order derivatives
+
+Suppose that the Lagrangian $L$ depends on the higher-order derivatives of $f(x)$:
+
+$$\begin{aligned}
+ J[f] = \int_{x_0}^{x_1} L(f, f', f'', ..., f^{(N)}, x) \dd{x}
+\end{aligned}$$
+
+Once again, the derivation procedure is the same as before:
+
+$$\begin{aligned}
+ 0
+ &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
+ = \int \pdv{L}{\alpha} \dd{x}
+ = \int \pdv{L}{f} \pdv{f}{\alpha} + \sum_{n} \pdv{L}{f^{(n)}} \pdv{f^{(n)}}{\alpha} \dd{x}
+ \\
+ &= \int \pdv{L}{f} \eta + \sum_{n} \pdv{L}{f^{(n)}} \eta^{(n)} \dd{x}
+\end{aligned}$$
+
+The goal is to turn each $\eta^{(n)}(x)$ into $\eta(x)$, so we need to
+partially integrate the $n$th term of the sum $n$ times. In this case,
+we will need some additional boundary conditions for $\eta(x)$:
+
+$$\begin{aligned}
+ \eta'(x_0) = \eta'(x_1) = 0
+ \qquad \cdots \qquad
+ \eta^{(N-1)}(x_0) = \eta^{(N-1)}(x_1) = 0
+\end{aligned}$$
+
+This eliminates the boundary terms from partial integration, leaving:
+
+$$\begin{aligned}
+ 0
+ &= \int \eta \bigg( \pdv{L}{f} + \sum_{n} (-1)^n \dv[n]{x} \Big( \pdv{L}{f^{(n)}} \Big) \bigg) \dd{x}
+\end{aligned}$$
+
+Once again, because $\eta(x)$ is arbitrary, the Euler-Lagrange equation becomes:
+
+$$\begin{aligned}
+ \boxed{
+ 0 = \pdv{L}{f} + \sum_{n} (-1)^n \dv[n]{x} \Big( \pdv{L}{f^{(n)}} \Big)
+ }
+\end{aligned}$$
+
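+A classic application is the static Euler-Bernoulli equation
+for the bending of an elastic beam:
+for a deflection $f(x)$ under a load $q(x)$, with flexural rigidity $EI$,
+the potential energy density $L = \frac{1}{2} E I (f'')^2 - q f$ leads to:
+
+$$\begin{aligned}
+ 0 = \pdv{L}{f} + \dv[2]{x} \Big( \pdv{L}{f''} \Big) = - q + E I f''''
+\end{aligned}$$
+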
+
+## Multiple coordinates
+
+Suppose now that $f$ is a function of multiple variables.
+For brevity, we only consider two variables $x$ and $y$,
+but the results generalize effortlessly to any number of variables.
+The Lagrangian now also depends on the partial derivatives of $f(x, y)$:
+
+$$\begin{aligned}
+ J[f] = \iint_{(x_0, y_0)}^{(x_1, y_1)} L(f, f_x, f_y, x, y) \dd{x} \dd{y}
+\end{aligned}$$
+
+The arbitrary deviation $\eta$ is then also a function of multiple variables:
+
+$$\begin{aligned}
+ f(x, y; \alpha) = f(x, y; 0) + \alpha \eta(x, y)
+\end{aligned}$$
+
+The derivation procedure starts in the exact same way as before:
+
+$$\begin{aligned}
+ 0
+ &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
+ = \iint \pdv{L}{\alpha} \dd{x} \dd{y}
+ \\
+ &= \iint \pdv{L}{f} \pdv{f}{\alpha} + \pdv{L}{f_x} \pdv{f_x}{\alpha} + \pdv{L}{f_y} \pdv{f_y}{\alpha} \dd{x} \dd{y}
+ \\
+ &= \iint \pdv{L}{f} \eta + \pdv{L}{f_x} \eta_x + \pdv{L}{f_y} \eta_y \dd{x} \dd{y}
+\end{aligned}$$
+
+We partially integrate for both $\eta_x$ and $\eta_y$, yielding:
+
+$$\begin{aligned}
+ 0
+ &= \int \Big[ \pdv{L}{f_x} \eta \Big]_{x_0}^{x_1} \dd{y} + \int \Big[ \pdv{L}{f_y} \eta \Big]_{y_0}^{y_1} \dd{x}
+ \\
+ &\quad + \iint \eta \bigg( \pdv{L}{f} - \dv{x} \Big( \pdv{L}{f_x} \Big) - \dv{y} \Big( \pdv{L}{f_y} \Big) \bigg) \dd{x} \dd{y}
+\end{aligned}$$
+
+But now, to eliminate these boundary terms, we need extra conditions for $\eta$:
+
+$$\begin{aligned}
+ \forall y: \eta(x_0, y) = \eta(x_1, y) = 0
+ \qquad
+ \forall x: \eta(x, y_0) = \eta(x, y_1) = 0
+\end{aligned}$$
+
+In other words, the deviation $\eta$ must vanish on the entire boundary of the integration region (the "box").
+Again relying on the fact that $\eta$ is arbitrary, the Euler-Lagrange
+equation is:
+
+$$\begin{aligned}
+ 0 = \pdv{L}{f} - \dv{x} \Big( \pdv{L}{f_x} \Big) - \dv{y} \Big( \pdv{L}{f_y} \Big)
+\end{aligned}$$
+
+This generalizes nicely to functions of even more variables $x_1, x_2, ..., x_N$:
+
+$$\begin{aligned}
+ \boxed{
+ 0 = \pdv{L}{f} - \sum_{n} \dv{x_n} \Big( \pdv{L}{f_{x_n}} \Big)
+ }
+\end{aligned}$$
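+
+For example, taking the second coordinate $y$ to be time $t$,
+a vibrating string with transverse displacement $f(x, t)$,
+mass density $\mu$ and tension $T$ has the Lagrangian density
+$L = \frac{1}{2} \mu (f_t)^2 - \frac{1}{2} T (f_x)^2$,
+for which the above equation yields the wave equation:
+
+$$\begin{aligned}
+ 0 = \pdv{L}{f} - \dv{x} \Big( \pdv{L}{f_x} \Big) - \dv{t} \Big( \pdv{L}{f_t} \Big)
+ = T f_{xx} - \mu f_{tt}
+\end{aligned}$$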
diff --git a/content/know/concept/ehrenfests-theorem/index.pdc b/content/know/concept/ehrenfests-theorem/index.pdc
new file mode 100644
index 0000000..bdcb908
--- /dev/null
+++ b/content/know/concept/ehrenfests-theorem/index.pdc
@@ -0,0 +1,137 @@
+---
+title: "Ehrenfest's theorem"
+firstLetter: "E"
+publishDate: 2021-02-24
+categories:
+- Quantum mechanics
+- Physics
+
+date: 2021-02-24T14:53:13+01:00
+draft: false
+markup: pandoc
+---
+
+# Ehrenfest's theorem
+
+In quantum mechanics, **Ehrenfest's theorem** gives a general expression for the
+time evolution of an observable's expectation value $\expval*{\hat{L}}$.
+
+The time-dependent Schrödinger equation is as follows,
+where prime denotes differentiation with respect to time $t$:
+
+$$\begin{aligned}
+ \ket{\psi'} = \frac{1}{i \hbar} \hat{H} \ket{\psi}
+ \qquad
+ \bra{\psi'} = - \frac{1}{i \hbar} \bra{\psi} \hat{H}
+\end{aligned}$$
+
+Given an observable operator $\hat{L}$ and a state $\ket{\psi}$,
+the time-derivative of the expectation value $\expval*{\hat{L}}$ is as follows
+(due to the product rule of differentiation):
+
+$$\begin{aligned}
+ \dv{\expval*{\hat{L}}}{t}
+ &= \matrixel{\psi}{\hat{L}}{\psi'} + \matrixel{\psi'}{\hat{L}}{\psi} + \matrixel{\psi}{\hat{L}'}{\psi}
+ \\
+ &= \frac{1}{i \hbar} \matrixel{\psi}{\hat{L}\hat{H}}{\psi}
+ - \frac{1}{i \hbar} \matrixel{\psi}{\hat{H}\hat{L}}{\psi}
+ + \expval{\dv{\hat{L}}{t}}
+\end{aligned}$$
+
+The first two terms on the right can be rewritten using a commutator,
+yielding the general form of Ehrenfest's theorem:
+
+$$\begin{aligned}
+ \boxed{
+ \dv{\expval*{\hat{L}}}{t}
+ = \frac{1}{i \hbar} \expval{[\hat{L}, \hat{H}]} + \expval{\dv{\hat{L}}{t}}
+ }
+\end{aligned}$$
+
+In practice, since most operators do not explicitly depend on time,
+the last term often vanishes.
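+
+For example, if $\hat{L}$ does not explicitly depend on time
+and commutes with the Hamiltonian,
+then its expectation value is a conserved quantity:
+
+$$\begin{aligned}
+ \dv{\expval*{\hat{L}}}{t}
+ = \frac{1}{i \hbar} \expval{[\hat{L}, \hat{H}]} + \expval{\dv{\hat{L}}{t}}
+ = 0
+\end{aligned}$$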
+
+As an interesting side note, in the [Heisenberg picture](/know/concept/heisenberg-picture/),
+this relation follows directly from the operator equation of motion,
+simply by sandwiching it between $\bra{\psi}$ and $\ket{\psi}$.
+
+Two observables of particular interest are the position $\hat{X}$ and momentum $\hat{P}$.
+Assuming the Hamiltonian $\hat{H} = \hat{P}^2 / (2m) + V(\hat{X})$,
+applying the above theorem to $\hat{X}$ yields the following,
+which we reduce using the fact that $\hat{X}$ commutes
+with the potential $V(\hat{X})$, since the latter is a function of the former:
+
+$$\begin{aligned}
+ \dv{\expval*{\hat{X}}}{t}
+ &= \frac{1}{i \hbar} \expval{[\hat{X}, \hat{H}]}
+ = \frac{1}{2 i \hbar m} \expval{[\hat{X}, \hat{P}^2] + 2 m [\hat{X}, V(\hat{X})]}
+ = \frac{1}{2 i \hbar m} \expval{[\hat{X}, \hat{P}^2]}
+ \\
+ &= \frac{1}{2 i \hbar m} \expval{\hat{P} [\hat{X}, \hat{P}] + [\hat{X}, \hat{P}] \hat{P}}
+ = \frac{2 i \hbar}{2 i \hbar m} \expval*{\hat{P}}
+ = \frac{\expval*{\hat{P}}}{m}
+\end{aligned}$$
+
+This is the first part of the "original" form of Ehrenfest's theorem,
+which is reminiscent of classical Newtonian mechanics:
+
+$$\begin{gathered}
+ \boxed{
+ \dv{\expval*{\hat{X}}}{t} = \frac{\expval*{\hat{P}}}{m}
+ }
+\end{gathered}$$
+
+Next, applying the general formula to the expected momentum $\expval*{\hat{P}}$
+gives us:
+
+$$\begin{aligned}
+ \dv{\expval*{\hat{P}}}{t}
+ &= \frac{1}{i \hbar} \expval{[\hat{P}, \hat{H}]}
+ = \frac{1}{2 i \hbar m} \expval{[\hat{P}, \hat{P}^2] + 2 m [\hat{P}, V(\hat{X})]}
+ = \frac{1}{i \hbar} \expval{[\hat{P}, V(\hat{X})]}
+\end{aligned}$$
+
+To find the commutator, we go to the $\hat{X}$-basis and use a test
+function $f(x)$:
+
+$$\begin{aligned}
+ \comm{- i \hbar \dv{x}}{V(x)} \: f(x)
+ &= - i \hbar \frac{dV}{dx} f(x) - i \hbar V(x) \frac{df}{dx} + i \hbar V(x) \frac{df}{dx}
+ = - i \hbar \frac{dV}{dx} f(x)
+\end{aligned}$$
+
+By inserting this result back into the previous equation, we find the following:
+
+$$\begin{aligned}
+ \dv{\expval*{\hat{P}}}{t}
+ &= - \frac{i \hbar}{i \hbar} \expval{\frac{d V}{d \hat{X}}}
+ = - \expval{\frac{d V}{d \hat{X}}}
+\end{aligned}$$
+
+This is the second part of Ehrenfest's theorem,
+which is also similar to Newtonian mechanics:
+
+$$\begin{gathered}
+ \boxed{
+ \dv{\expval*{\hat{P}}}{t} = - \expval{\pdv{V}{\hat{X}}}
+ }
+\end{gathered}$$
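+
+For example, for a harmonic oscillator with $V(\hat{X}) = \frac{1}{2} m \omega^2 \hat{X}^2$,
+the derivative of $V$ is linear in $\hat{X}$,
+so the expectation values obey the classical equation of motion exactly:
+
+$$\begin{aligned}
+ \dv[2]{\expval*{\hat{X}}}{t}
+ = \frac{1}{m} \dv{\expval*{\hat{P}}}{t}
+ = - \frac{1}{m} \expval{m \omega^2 \hat{X}}
+ = - \omega^2 \expval*{\hat{X}}
+\end{aligned}$$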
+
+There is an important consequence of Ehrenfest's original theorems
+for the symbolic derivatives of the Hamiltonian $\hat{H}$
+with respect to $\hat{X}$ and $\hat{P}$:
+
+$$\begin{gathered}
+ \boxed{
+ \expval{\pdv{\hat{H}}{\hat{P}}}
+ = \dv{\expval*{\hat{X}}}{t}
+ }
+ \qquad \quad
+ \boxed{
+ - \expval{\pdv{\hat{H}}{\hat{X}}}
+ = \dv{\expval*{\hat{P}}}{t}
+ }
+\end{gathered}$$
+
+These are easy to prove yourself,
+and are analogous to Hamilton's canonical equations.
diff --git a/content/know/concept/heisenberg-picture/index.pdc b/content/know/concept/heisenberg-picture/index.pdc
new file mode 100644
index 0000000..2dd4118
--- /dev/null
+++ b/content/know/concept/heisenberg-picture/index.pdc
@@ -0,0 +1,117 @@
+---
+title: "Heisenberg picture"
+firstLetter: "H"
+publishDate: 2021-02-24
+categories:
+- Quantum mechanics
+- Physics
+
+date: 2021-02-24T16:46:26+01:00
+draft: false
+markup: pandoc
+---
+
+# Heisenberg picture
+
+The **Heisenberg picture** is an alternative formulation of quantum
+mechanics, equivalent to the traditionally taught Schrödinger picture.
+
+In the Schrödinger picture, the operators (observables) are fixed
+(as long as they do not explicitly depend on time), while the state
+$\ket{\psi_S(t)}$ evolves according to the Schrödinger equation.
+For a time-independent Hamiltonian $\hat{H}$, this evolution can be written
+using the unitary time-evolution operator $\hat{U}(t) = \exp{} (- i t \hat{H} / \hbar)$ like so:
+
+$$\begin{aligned}
+ \ket{\psi_S(t)} = \hat{U}(t) \ket{\psi_S(0)}
+\end{aligned}$$
+
+In contrast, the Heisenberg picture reverses the roles:
+the states $\ket{\psi_H}$ are invariant,
+and instead the operators vary with time.
+An advantage of this is that the basis states remain the same.
+
+Given a Schrödinger-picture state $\ket{\psi_S(t)}$ and an operator
+$\hat{L}_S(t)$, which may or may not depend on time, both can be
+converted to the Heisenberg picture by the following change of basis:
+
+$$\begin{aligned}
+ \boxed{
+ \ket{\psi_H} = \ket{\psi_S(0)}
+ \qquad
+ \hat{L}_H(t) = \hat{U}^\dagger(t) \hat{L}_S(t) \hat{U}(t)
+ }
+\end{aligned}$$
+
+Since $\hat{U}(t)$ is unitary, the expectation value of a given operator is unchanged:
+
+$$\begin{aligned}
+ \expval*{\hat{L}_H}
+ &= \matrixel{\psi_H}{\hat{L}_H(t)}{\psi_H}
+ = \matrixel{\psi_S(0)}{\hat{U}^\dagger(t) \: \hat{L}_S(t) \: \hat{U}(t)}{\psi_S(0)}
+ \\
+ &= \matrixel*{\hat{U}(t) \psi_S(0)}{\hat{L}_S(t)}{\hat{U}(t) \psi_S(0)}
+ = \matrixel{\psi_S(t)}{\hat{L}_S}{\psi_S(t)}
+ = \expval*{\hat{L}_S}
+\end{aligned}$$
+
+The Schrödinger and Heisenberg pictures therefore respectively
+correspond to active and passive transformations by $\hat{U}(t)$
+in [Hilbert space](/know/concept/hilbert-space/).
+The two formulations are thus entirely equivalent,
+and can be derived from one another,
+as will be shown shortly.
+
+In the Heisenberg picture, the states are constant,
+so the time-dependent Schrödinger equation is not directly useful.
+Instead, we will use it to derive a new equation of motion for $\hat{L}_H(t)$.
+The key is that the time-evolution operator $\hat{U}(t)$ itself obeys the Schrödinger equation:
+
+$$\begin{aligned}
+ \dv{t} \hat{U}(t) = - \frac{i}{\hbar} \hat{H}_S(t) \hat{U}(t)
+\end{aligned}$$
+
+Where $\hat{H}_S(t)$ may depend on time. We differentiate the definition of
+$\hat{L}_H(t)$ and insert the above relation (and its Hermitian adjoint)
+where necessary:
+
+$$\begin{aligned}
+ \dv{\hat{L}_H}{t}
+ &= \dv{\hat{U}^\dagger}{t} \hat{L}_S \hat{U}
+ + \hat{U}^\dagger \hat{L}_S \dv{\hat{U}}{t}
+ + \hat{U}^\dagger \dv{\hat{L}_S}{t} \hat{U}
+ \\
+ &= \frac{i}{\hbar} \hat{U}^\dagger \hat{H}_S (\hat{U} \hat{U}^\dagger) \hat{L}_S \hat{U}
+ - \frac{i}{\hbar} \hat{U}^\dagger \hat{L}_S (\hat{U} \hat{U}^\dagger) \hat{H}_S \hat{U}
+ + \Big( \dv{\hat{L}_S}{t} \Big)_H
+ \\
+ &= \frac{i}{\hbar} \hat{H}_H \hat{L}_H
+ - \frac{i}{\hbar} \hat{L}_H \hat{H}_H
+ + \Big( \dv{\hat{L}_S}{t} \Big)_H
+ = \frac{i}{\hbar} [\hat{H}_H, \hat{L}_H] + \Big( \dv{\hat{L}_S}{t} \Big)_H
+\end{aligned}$$
+
+We thus get the equation of motion for operators in the Heisenberg picture:
+
+$$\begin{aligned}
+ \boxed{
+ \dv{t} \hat{L}_H(t) = \frac{i}{\hbar} [\hat{H}_H(t), \hat{L}_H(t)] + \Big( \dv{t} \hat{L}_S(t) \Big)_H
+ }
+\end{aligned}$$
+
+This equation is closer to classical mechanics than the Schrödinger picture:
+for a Hamiltonian $\hat{H} = \hat{P}^2 / (2m) + V(\hat{X})$, inserting the position $\hat{X}$
+and momentum $\hat{P} = - i \hbar \: d/d\hat{X}$ gives the following Newton-style equations:
+
+$$\begin{aligned}
+ \dv{\hat{X}}{t}
+ &= \frac{i}{\hbar} [\hat{H}, \hat{X}]
+ = \frac{\hat{P}}{m}
+ \\
+ \dv{\hat{P}}{t}
+ &= \frac{i}{\hbar} [\hat{H}, \hat{P}]
+ = - \dv{V(\hat{X})}{\hat{X}}
+\end{aligned}$$
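+
+For example, for a free particle with $\hat{H} = \hat{P}^2 / (2m)$,
+the momentum $\hat{P}$ commutes with $\hat{H}$ and is therefore constant in time,
+so the Heisenberg-picture position operator simply evolves linearly:
+
+$$\begin{aligned}
+ \hat{X}(t) = \hat{X}(0) + \frac{t}{m} \hat{P}(0)
+\end{aligned}$$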
+
+For a proof, see [Ehrenfest's theorem](/know/concept/ehrenfests-theorem/),
+which is closely related to the Heisenberg picture.
diff --git a/content/know/concept/hilbert-space/index.pdc b/content/know/concept/hilbert-space/index.pdc
index 1faf08a..c557fb7 100644
--- a/content/know/concept/hilbert-space/index.pdc
+++ b/content/know/concept/hilbert-space/index.pdc
@@ -13,7 +13,7 @@ markup: pandoc
# Hilbert space
-A **Hilbert space**, also known as an **inner product space**, is an
+A **Hilbert space**, also called an **inner product space**, is an
abstract **vector space** with a notion of length and angle.