Diffstat (limited to 'content/know')
-rw-r--r--  content/know/concept/calculus-of-variations/index.pdc   |  40
-rw-r--r--  content/know/concept/heaviside-step-function/index.pdc  |  91
-rw-r--r--  content/know/concept/holomorphic-function/index.pdc     | 232
-rw-r--r--  content/know/concept/kramers-kronig-relations/index.pdc | 133
-rw-r--r--  content/know/concept/schwartz-distribution/index.pdc    | 119
5 files changed, 595 insertions, 20 deletions
diff --git a/content/know/concept/calculus-of-variations/index.pdc b/content/know/concept/calculus-of-variations/index.pdc
index fb043e0..c5280e5 100644
--- a/content/know/concept/calculus-of-variations/index.pdc
+++ b/content/know/concept/calculus-of-variations/index.pdc
@@ -29,18 +29,18 @@ the path $f(x)$ taken by a physical system,
the **principle of least action** states that $f$ will be a minimum of $J[f]$,
so for example the expended energy will be minimized.
-If $f(x, \alpha\!=\!0)$ is the optimal route, then a slightly
+If $f(x, \varepsilon\!=\!0)$ is the optimal route, then a slightly
different (and therefore worse) path between the same two points can be expressed
-using the parameter $\alpha$:
+using the parameter $\varepsilon$:
$$\begin{aligned}
- f(x, \alpha) = f(x, 0) + \alpha \eta(x)
+ f(x, \varepsilon) = f(x, 0) + \varepsilon \eta(x)
\qquad \mathrm{or} \qquad
- \delta f = \alpha \eta(x)
+ \delta f = \varepsilon \eta(x)
\end{aligned}$$
Where $\eta(x)$ is an arbitrary differentiable deviation.
-Since $f(x, \alpha)$ must start and end in the same points as $f(x,0)$,
+Since $f(x, \varepsilon)$ must start and end in the same points as $f(x,0)$,
we have the boundary conditions:
$$\begin{aligned}
@@ -50,16 +50,16 @@ $$\begin{aligned}
Given $L$, the goal is to find an equation for the optimal path $f(x,0)$.
Just like when finding the minimum of a real function,
the minimum $f$ of a functional $J[f]$ is a stationary point
-with respect to the deviation weight $\alpha$,
+with respect to the deviation weight $\varepsilon$,
a condition often written as $\delta J = 0$.
In the following, the integration limits have been omitted:
$$\begin{aligned}
0
&= \delta J
- = \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \int \pdv{L}{\alpha} \dd{x}
- = \int \pdv{L}{f} \pdv{f}{\alpha} + \pdv{L}{f'} \pdv{f'}{\alpha} \dd{x}
+ = \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \int \pdv{L}{\varepsilon} \dd{x}
+ = \int \pdv{L}{f} \pdv{f}{\varepsilon} + \pdv{L}{f'} \pdv{f'}{\varepsilon} \dd{x}
\\
&= \int \pdv{L}{f} \eta + \pdv{L}{f'} \eta' \dd{x}
= \Big[ \pdv{L}{f'} \eta \Big]_{x_0}^{x_1} + \int \pdv{L}{f} \eta - \frac{d}{dx} \Big( \pdv{L}{f'} \Big) \eta \dd{x}
@@ -99,16 +99,16 @@ In this case, every $f_n(x)$ has its own deviation $\eta_n(x)$,
satisfying $\eta_n(x_0) = \eta_n(x_1) = 0$:
$$\begin{aligned}
- f_n(x, \alpha) = f_n(x, 0) + \alpha \eta_n(x)
+ f_n(x, \varepsilon) = f_n(x, 0) + \varepsilon \eta_n(x)
\end{aligned}$$
The derivation procedure is identical to the case $N = 1$ from earlier:
$$\begin{aligned}
0
- &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \int \pdv{L}{\alpha} \dd{x}
- = \int \sum_{n} \Big( \pdv{L}{f_n} \pdv{f_n}{\alpha} + \pdv{L}{f_n'} \pdv{f_n'}{\alpha} \Big) \dd{x}
+ &= \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \int \pdv{L}{\varepsilon} \dd{x}
+ = \int \sum_{n} \Big( \pdv{L}{f_n} \pdv{f_n}{\varepsilon} + \pdv{L}{f_n'} \pdv{f_n'}{\varepsilon} \Big) \dd{x}
\\
&= \int \sum_{n} \Big( \pdv{L}{f_n} \eta_n + \pdv{L}{f_n'} \eta_n' \Big) \dd{x}
\\
@@ -140,9 +140,9 @@ Once again, the derivation procedure is the same as before:
$$\begin{aligned}
0
- &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \int \pdv{L}{\alpha} \dd{x}
- = \int \pdv{L}{f} \pdv{f}{\alpha} + \sum_{n} \pdv{L}{f^{(n)}} \pdv{f^{(n)}}{\alpha} \dd{x}
+ &= \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \int \pdv{L}{\varepsilon} \dd{x}
+ = \int \pdv{L}{f} \pdv{f}{\varepsilon} + \sum_{n} \pdv{L}{f^{(n)}} \pdv{f^{(n)}}{\varepsilon} \dd{x}
\\
&= \int \pdv{L}{f} \eta + \sum_{n} \pdv{L}{f^{(n)}} \eta^{(n)} \dd{x}
\end{aligned}$$
@@ -187,17 +187,17 @@ $$\begin{aligned}
The arbitrary deviation $\eta$ is then also a function of multiple variables:
$$\begin{aligned}
- f(x, y; \alpha) = f(x, y; 0) + \alpha \eta(x, y)
+ f(x, y; \varepsilon) = f(x, y; 0) + \varepsilon \eta(x, y)
\end{aligned}$$
The derivation procedure starts in the exact same way as before:
$$\begin{aligned}
0
- &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \iint \pdv{L}{\alpha} \dd{x} \dd{y}
+ &= \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \iint \pdv{L}{\varepsilon} \dd{x} \dd{y}
\\
- &= \iint \pdv{L}{f} \pdv{f}{\alpha} + \pdv{L}{f_x} \pdv{f_x}{\alpha} + \pdv{L}{f_y} \pdv{f_y}{\alpha} \dd{x} \dd{y}
+ &= \iint \pdv{L}{f} \pdv{f}{\varepsilon} + \pdv{L}{f_x} \pdv{f_x}{\varepsilon} + \pdv{L}{f_y} \pdv{f_y}{\varepsilon} \dd{x} \dd{y}
\\
&= \iint \pdv{L}{f} \eta + \pdv{L}{f_x} \eta_x + \pdv{L}{f_y} \eta_y \dd{x} \dd{y}
\end{aligned}$$
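As a quick sanity check of the Euler-Lagrange machinery (an editorial sketch, not part of these notes; assumes Python with sympy, whose `euler_equations` helper automates the derivation above), the arc-length Lagrangian $L = \sqrt{1 + f'^2}$ should yield $f'' = 0$, i.e. straight lines:

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

x = sp.symbols('x')
f = sp.Function('f')

# Arc-length Lagrangian: its minimizer between two points is a straight line,
# so the Euler-Lagrange equation should reduce to f''(x) = 0.
L = sp.sqrt(1 + f(x).diff(x)**2)
eq = euler_equations(L, f(x), x)[0]

# Solving the Euler-Lagrange equation for f'' confirms f''(x) = 0:
print(sp.solve(eq, f(x).diff(x, 2)))  # [0]
```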
diff --git a/content/know/concept/heaviside-step-function/index.pdc b/content/know/concept/heaviside-step-function/index.pdc
new file mode 100644
index 0000000..0471acf
--- /dev/null
+++ b/content/know/concept/heaviside-step-function/index.pdc
@@ -0,0 +1,91 @@
+---
+title: "Heaviside step function"
+firstLetter: "H"
+publishDate: 2021-02-25
+categories:
+- Mathematics
+- Physics
+
+date: 2021-02-25T11:28:02+01:00
+draft: false
+markup: pandoc
+---
+
+# Heaviside step function
+
+The **Heaviside step function** $\Theta(t)$
+is a discontinuous function used for enforcing causality
+or for representing a signal switched on at $t = 0$.
+It is defined as:
+
+$$\begin{aligned}
+ \boxed{
+ \Theta(t) =
+ \begin{cases}
+ 0 & \mathrm{if}\: t < 0 \\
+            1 & \mathrm{if}\: t > 0
+ \end{cases}
+ }
+\end{aligned}$$
+
+The value of $\Theta(t \!=\! 0)$ varies between definitions;
+common choices are $0$, $1$ and $1/2$.
+In practice, this rarely matters, and some authors even
+change their definition on the fly for convenience.
+For physicists, $\Theta(0) = 1$ is generally best, such that:
+
+$$\begin{aligned}
+ \boxed{
+        \forall n > 0: \Theta^n(t) = \Theta(t)
+ }
+\end{aligned}$$
+
+Unsurprisingly, the first-order derivative of $\Theta(t)$ is
+the [Dirac delta function](/know/concept/dirac-delta-function/):
+
+$$\begin{aligned}
+ \boxed{
+ \Theta'(t) = \delta(t)
+ }
+\end{aligned}$$
+
+The [Fourier transform](/know/concept/fourier-transform/)
+of $\Theta(t)$ is noteworthy.
+In this case, it is easiest to use $\Theta(0) = 1/2$,
+such that the Heaviside step function can be expressed
+using the signum function $\mathrm{sgn}(t)$:
+
+$$\begin{aligned}
+ \Theta(t) = \frac{1}{2} + \frac{\mathrm{sgn}(t)}{2}
+\end{aligned}$$
+
+We then take the Fourier transform,
+where $A$ and $s$ are constants from its definition:
+
+$$\begin{aligned}
+ \tilde{\Theta}(\omega)
+ = \hat{\mathcal{F}}\{\Theta(t)\}
+ = \frac{A}{2} \Big( \int_{-\infty}^\infty \exp(i s \omega t) \dd{t} + \int_{-\infty}^\infty \mathrm{sgn}(t) \exp(i s \omega t) \dd{t} \Big)
+\end{aligned}$$
+
+The first term is proportional to the Dirac delta function.
+The second integral is problematic, so we take the Cauchy principal value $\pv{}$
+and look up the integral:
+
+$$\begin{aligned}
+ \tilde{\Theta}(\omega)
+ &= A \pi \delta(s \omega) + \frac{A}{2} \pv{\int_{-\infty}^\infty \mathrm{sgn}(t) \exp(i s \omega t) \dd{t}}
+ = \frac{A}{|s|} \pi \delta(\omega) + i \frac{A}{s} \pv{\frac{1}{\omega}}
+\end{aligned}$$
+
+The use of $\pv{}$ without an integral is an abuse of notation,
+and means that this result only makes sense when wrapped in an integral.
+Formally, $\pv{\frac{1}{\omega}}$ is a [Schwartz distribution](/know/concept/schwartz-distribution/).
+We thus have:
+
+$$\begin{aligned}
+ \boxed{
+ \tilde{\Theta}(\omega)
+ = \frac{A}{|s|} \Big( \pi \delta(\omega) + i \: \mathrm{sgn}(s) \pv{\frac{1}{\omega}} \Big)
+ }
+\end{aligned}$$
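The claims $\Theta'(t) = \delta(t)$ and that $\delta$ paired with a test function picks out its value at zero can be sanity-checked symbolically (an editorial sketch, not part of these notes; assumes Python with sympy):

```python
import sympy as sp

t = sp.symbols('t', real=True)

# The derivative of the step function is the delta function:
assert sp.diff(sp.Heaviside(t), t) == sp.DiracDelta(t)

# Paired with a test function phi(t) = exp(-t^2), delta picks out phi(0) = 1:
phi = sp.exp(-t**2)
pairing = sp.integrate(sp.DiracDelta(t) * phi, (t, -sp.oo, sp.oo))
print(pairing)  # 1
```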
diff --git a/content/know/concept/holomorphic-function/index.pdc b/content/know/concept/holomorphic-function/index.pdc
new file mode 100644
index 0000000..3e7a91e
--- /dev/null
+++ b/content/know/concept/holomorphic-function/index.pdc
@@ -0,0 +1,232 @@
+---
+title: "Holomorphic function"
+firstLetter: "H"
+publishDate: 2021-02-25
+categories:
+- Mathematics
+
+date: 2021-02-25T14:40:45+01:00
+draft: false
+markup: pandoc
+---
+
+# Holomorphic function
+
+In complex analysis, a complex function $f(z)$ of a complex variable $z$
+is called **holomorphic** or **analytic** if it is complex differentiable in the
+neighbourhood of every point of its domain.
+This is a very strong condition.
+
+As a result, holomorphic functions are infinitely differentiable and
+equal their Taylor expansion at every point. In physicists' terms,
+they are extremely "well-behaved" throughout their domain.
+
+More formally, a given function $f(z)$ is holomorphic in a certain region
+if the following limit exists for all $z$ in that region,
+and for all directions of $\Delta z$:
+
+$$\begin{aligned}
+ \boxed{
+ f'(z) = \lim_{\Delta z \to 0} \frac{f(z + \Delta z) - f(z)}{\Delta z}
+ }
+\end{aligned}$$
+
+We decompose $f$ into the real functions $u$ and $v$ of real variables $x$ and $y$:
+
+$$\begin{aligned}
+ f(z) = f(x + i y) = u(x, y) + i v(x, y)
+\end{aligned}$$
+
+Since we are free to choose the direction of $\Delta z$, we choose $\Delta x$ and $\Delta y$:
+
+$$\begin{aligned}
+ f'(z)
+ &= \lim_{\Delta x \to 0} \frac{f(z + \Delta x) - f(z)}{\Delta x}
+ = \pdv{u}{x} + i \pdv{v}{x}
+ \\
+ &= \lim_{\Delta y \to 0} \frac{f(z + i \Delta y) - f(z)}{i \Delta y}
+ = \pdv{v}{y} - i \pdv{u}{y}
+\end{aligned}$$
+
+For $f(z)$ to be holomorphic, these two results must be equivalent.
+Because $u$ and $v$ are real by definition,
+we thus arrive at the **Cauchy-Riemann equations**:
+
+$$\begin{aligned}
+ \boxed{
+ \pdv{u}{x} = \pdv{v}{y}
+ \qquad
+ \pdv{v}{x} = - \pdv{u}{y}
+ }
+\end{aligned}$$
+
+Therefore, a given function $f(z)$ is holomorphic if and only if its real
+and imaginary parts satisfy these equations. This gives an idea of how
+strict the criteria are to qualify as holomorphic.
+
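As an illustration (an editorial sketch, not part of these notes; assumes Python with sympy), the Cauchy-Riemann equations can be verified for $f(z) = z^2$, whose real and imaginary parts are $u = x^2 - y^2$ and $v = 2xy$:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
z = x + sp.I * y

# f(z) = z^2 = (x^2 - y^2) + i(2xy): split into real and imaginary parts.
f = sp.expand(z**2)
u, v = sp.re(f), sp.im(f)

# Check both Cauchy-Riemann equations.
print(sp.diff(u, x) == sp.diff(v, y))    # True:  u_x = v_y = 2x
print(sp.diff(v, x) == -sp.diff(u, y))   # True:  v_x = -u_y = 2y
```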
+
+## Integration formulas
+
+Holomorphic functions satisfy **Cauchy's integral theorem**, which states
+that the integral of $f(z)$ over any closed curve $C$ in the complex plane is zero,
+provided that $f(z)$ is holomorphic for all $z$ in the area enclosed by $C$:
+
+$$\begin{aligned}
+ \boxed{
+ \oint_C f(z) \dd{z} = 0
+ }
+\end{aligned}$$
+
+*__Proof__*.
+*Just like before, we decompose $f(z)$ into its real and imaginary parts:*
+
+$$\begin{aligned}
+ \oint_C f(z) \:dz
+ &= \oint_C (u + i v) \dd{(x + i y)}
+ = \oint_C (u + i v) \:(\dd{x} + i \dd{y})
+ \\
+ &= \oint_C u \dd{x} - v \dd{y} + i \oint_C v \dd{x} + u \dd{y}
+\end{aligned}$$
+
+*Using Green's theorem, we integrate over the area $A$ enclosed by $C$:*
+
+$$\begin{aligned}
+ \oint_C f(z) \:dz
+ &= - \iint_A \pdv{v}{x} + \pdv{u}{y} \dd{x} \dd{y} + i \iint_A \pdv{u}{x} - \pdv{v}{y} \dd{x} \dd{y}
+\end{aligned}$$
+
+*Since $f(z)$ is holomorphic, $u$ and $v$ satisfy the Cauchy-Riemann
+equations, such that the integrands disappear and the final result is zero.*
+*__Q.E.D.__*
+
+An interesting consequence is **Cauchy's integral formula**, which
+states that the value of $f(z)$ at an arbitrary point $z_0$ is
+determined by its values on an arbitrary contour $C$ around $z_0$:
+
+$$\begin{aligned}
+ \boxed{
+ f(z_0) = \frac{1}{2 \pi i} \oint_C \frac{f(z)}{z - z_0} \dd{z}
+ }
+\end{aligned}$$
+
+*__Proof__*.
+*Thanks to the integral theorem, we know that the shape and size
+of $C$ is irrelevant. Therefore we choose it to be a circle with radius $r$,
+such that the integration variable becomes $z = z_0 + r e^{i \theta}$. Then
+we integrate by substitution:*
+
+$$\begin{aligned}
+ \frac{1}{2 \pi i} \oint_C \frac{f(z)}{z - z_0} \dd{z}
+ &= \frac{1}{2 \pi i} \int_0^{2 \pi} f(z) \frac{i r e^{i \theta}}{r e^{i \theta}} \dd{\theta}
+ = \frac{1}{2 \pi} \int_0^{2 \pi} f(z_0 + r e^{i \theta}) \dd{\theta}
+\end{aligned}$$
+
+*We may choose an arbitrarily small radius $r$, such that the contour approaches $z_0$:*
+
+$$\begin{aligned}
+ \lim_{r \to 0}\:\: \frac{1}{2 \pi} \int_0^{2 \pi} f(z_0 + r e^{i \theta}) \dd{\theta}
+ &= \frac{f(z_0)}{2 \pi} \int_0^{2 \pi} \dd{\theta}
+ = f(z_0)
+\end{aligned}$$
+
+*__Q.E.D.__*
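The integral formula also lends itself to a numerical check: parametrizing a circle around $z_0$ as in the proof and summing should recover $f(z_0)$. A sketch (not part of these notes; assumes Python with numpy, and takes $f = \exp$ as an example):

```python
import numpy as np

def cauchy_integral(f, z0, r=1.0, n=2000):
    # (1/2 pi i) * contour integral of f(z)/(z - z0) over a circle of
    # radius r around z0, via the trapezoid rule (spectrally accurate
    # for smooth periodic integrands).
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    z = z0 + r * np.exp(1j * theta)
    dz = 1j * r * np.exp(1j * theta) * (2.0 * np.pi / n)
    return np.sum(f(z) / (z - z0) * dz) / (2j * np.pi)

z0 = 0.3 + 0.4j
print(abs(cauchy_integral(np.exp, z0) - np.exp(z0)))  # ~ machine precision
```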
+
+Similarly, **Cauchy's differentiation formula**,
+or **Cauchy's integral formula for derivatives**
+gives all derivatives of a holomorphic function as follows,
+and also guarantees their existence:
+
+$$\begin{aligned}
+ \boxed{
+ f^{(n)}(z_0)
+ = \frac{n!}{2 \pi i} \oint_C \frac{f(z)}{(z - z_0)^{n + 1}} \dd{z}
+ }
+\end{aligned}$$
+
+*__Proof__*.
+*By definition, the first derivative $f'(z)$ of a
+holomorphic function $f(z)$ exists and is given by:*
+
+$$\begin{aligned}
+ f'(z_0)
+ = \lim_{z \to z_0} \frac{f(z) - f(z_0)}{z - z_0}
+\end{aligned}$$
+
+*We evaluate the numerator using Cauchy's integral theorem as follows:*
+
+$$\begin{aligned}
+ f'(z_0)
+ &= \lim_{z \to z_0} \frac{1}{z - z_0}
+ \bigg( \frac{1}{2 \pi i} \oint_C \frac{f(\zeta)}{\zeta - z} \dd{\zeta} - \frac{1}{2 \pi i} \oint_C \frac{f(\zeta)}{\zeta - z_0} \dd{\zeta} \bigg)
+ \\
+ &= \frac{1}{2 \pi i} \lim_{z \to z_0} \frac{1}{z - z_0}
+ \oint_C \frac{f(\zeta)}{\zeta - z} - \frac{f(\zeta)}{\zeta - z_0} \dd{\zeta}
+ \\
+ &= \frac{1}{2 \pi i} \lim_{z \to z_0} \frac{1}{z - z_0}
+ \oint_C \frac{f(\zeta) (z - z_0)}{(\zeta - z)(\zeta - z_0)} \dd{\zeta}
+\end{aligned}$$
+
+*This contour integral converges uniformly, so we may apply the limit on the inside:*
+
+$$\begin{aligned}
+ f'(z_0)
+ &= \frac{1}{2 \pi i} \oint_C \Big( \lim_{z \to z_0} \frac{f(\zeta)}{(\zeta - z)(\zeta - z_0)} \Big) \dd{\zeta}
+ = \frac{1}{2 \pi i} \oint_C \frac{f(\zeta)}{(\zeta - z_0)^2} \dd{\zeta}
+\end{aligned}$$
+
+*Since the second-order derivative $f''(z)$ is simply the derivative of $f'(z)$,
+this proof works inductively for all higher orders $n$.*
+*__Q.E.D.__*
+
+
+## Residue theorem
+
+A function $f(z)$ is **meromorphic** if it is holomorphic except in
+a finite number of **simple poles**, which are points $z_p$ where
+$f(z_p)$ diverges, but where the product $(z - z_p) f(z)$ is non-zero and
+still holomorphic close to $z_p$.
+
+The **residue** $R_p$ of a simple pole $z_p$ is defined as follows, and
+represents the rate at which $f(z)$ diverges close to $z_p$:
+
+$$\begin{aligned}
+ \boxed{
+ R_p = \lim_{z \to z_p} (z - z_p) f(z)
+ }
+\end{aligned}$$
+
+**Cauchy's residue theorem** generalizes Cauchy's integral theorem
+to meromorphic functions, and states that the integral over a closed contour $C$
+depends on the simple poles $z_p$ it encloses:
+
+$$\begin{aligned}
+ \boxed{
+ \oint_C f(z) \dd{z} = i 2 \pi \sum_{p} R_p
+ }
+\end{aligned}$$
+
+*__Proof__*. *From the definition of a meromorphic function,
+we know that we can decompose $f(z)$ as follows,
+where $h(z)$ is holomorphic and the sum runs over all simple poles $z_p$:*
+
+$$\begin{aligned}
+ f(z) = h(z) + \sum_{p} \frac{R_p}{z - z_p}
+\end{aligned}$$
+
+*We integrate this over a contour $C$ which contains all poles, and apply
+both Cauchy's integral theorem and Cauchy's integral formula to get:*
+
+$$\begin{aligned}
+ \oint_C f(z) \dd{z}
+ &= \oint_C h(z) \dd{z} + \sum_{p} R_p \oint_C \frac{1}{z - z_p} \dd{z}
+ = \sum_{p} R_p \: 2 \pi i
+\end{aligned}$$
+
+*__Q.E.D.__*
+
+This theorem might not seem very useful,
+but in fact, thanks to some clever mathematical magic,
+it allows us to evaluate many integrals along the real axis,
+most notably [Fourier transforms](/know/concept/fourier-transform/).
+It can also be used to derive the Kramers-Kronig relations.
+
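As a numerical sanity check (an editorial sketch, not from these notes; assumes Python with numpy), take the meromorphic $f(z) = 1/(z^2 + 1)$: its pole at $z = i$ has residue $1/(2i)$, so a contour around that pole should give $2 \pi i \cdot 1/(2i) = \pi$:

```python
import numpy as np

# Contour integral of f(z) = 1/(z^2 + 1) over a circle of radius 0.5
# around z = i, which encloses only that pole.
n = 4000
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
z = 1j + 0.5 * np.exp(1j * theta)
dz = 1j * 0.5 * np.exp(1j * theta) * (2.0 * np.pi / n)
integral = np.sum(dz / (z**2 + 1.0))

# Residue theorem: 2 pi i * R with R = 1/(2i), so the result should be pi.
print(integral)  # ~ pi (with negligible imaginary part)
```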
diff --git a/content/know/concept/kramers-kronig-relations/index.pdc b/content/know/concept/kramers-kronig-relations/index.pdc
new file mode 100644
index 0000000..1c2977e
--- /dev/null
+++ b/content/know/concept/kramers-kronig-relations/index.pdc
@@ -0,0 +1,133 @@
+---
+title: "Kramers-Kronig relations"
+firstLetter: "K"
+publishDate: 2021-02-25
+categories:
+- Mathematics
+- Physics
+
+date: 2021-02-25T15:20:24+01:00
+draft: false
+markup: pandoc
+---
+
+# Kramers-Kronig relations
+
+Let $\chi(t)$ be a complex function describing
+the response of a system to an impulse $f(t)$ starting at $t = 0$.
+The **Kramers-Kronig relations** connect the real and imaginary parts of $\chi(t)$,
+such that one can be reconstructed from the other.
+Suppose we can only measure $\chi_r(t)$ or $\chi_i(t)$:
+
+$$\begin{aligned}
+ \chi(t) = \chi_r(t) + i \chi_i(t)
+\end{aligned}$$
+
+Assuming that the system was at rest until $t = 0$,
+the response $\chi(t)$ cannot depend on anything from $t < 0$,
+since the known impulse $f(t)$ had not started yet.
+This principle is called **causality**, and to enforce it,
+we use the [Heaviside step function](/know/concept/heaviside-step-function/)
+$\Theta(t)$ to create a **causality test** for $\chi(t)$:
+
+$$\begin{aligned}
+ \chi(t) = \chi(t) \: \Theta(t)
+\end{aligned}$$
+
+If we [Fourier transform](/know/concept/fourier-transform/) this equation,
+then it will become a convolution in the frequency domain
+thanks to the [convolution theorem](/know/concept/convolution-theorem/),
+where $A$, $B$ and $s$ are constants from the FT definition:
+
+$$\begin{aligned}
+ \tilde{\chi}(\omega)
+ %= \hat{\mathcal{F}}\{\chi_c(t) \: \Theta(t)\}
+ = (\tilde{\chi} * \tilde{\Theta})(\omega)
+ = B \int_{-\infty}^\infty \tilde{\chi}(\omega') \: \tilde{\Theta}(\omega - \omega') \dd{\omega'}
+\end{aligned}$$
+
+We look up the FT of the step function $\tilde{\Theta}(\omega)$,
+which involves the signum function $\mathrm{sgn}(t)$,
+the [Dirac delta function](/know/concept/dirac-delta-function/) $\delta$,
+and the Cauchy principal value $\pv{}$.
+We arrive at:
+
+$$\begin{aligned}
+ \tilde{\chi}(\omega)
+ &= \frac{A B}{|s|} \: \pv{\int_{-\infty}^\infty \tilde{\chi}(\omega')
+    \Big( \pi \delta(\omega - \omega') + i \:\mathrm{sgn}(s) \frac{1}{\omega - \omega'} \Big) \dd{\omega'}}
+ \\
+ &= \Big( \frac{1}{2} \frac{2 \pi A B}{|s|} \Big) \tilde{\chi}(\omega)
+ + i \Big( \frac{\mathrm{sgn}(s)}{2 \pi} \frac{2 \pi A B}{|s|} \Big)
+ \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}(\omega')}{\omega - \omega'} \dd{\omega'}}
+\end{aligned}$$
+
+From the definition of the Fourier transform we know that $2 \pi A B / |s| = 1$:
+
+$$\begin{aligned}
+ \tilde{\chi}(\omega)
+ &= \frac{1}{2} \tilde{\chi}(\omega)
+ + \mathrm{sgn}(s) \frac{i}{2 \pi} \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}(\omega')}{\omega - \omega'} \dd{\omega'}}
+\end{aligned}$$
+
+We isolate this equation for $\tilde{\chi}(\omega)$
+to get the final version of the causality test:
+
+$$\begin{aligned}
+ \boxed{
+ \tilde{\chi}(\omega)
+ = - \mathrm{sgn}(s) \frac{i}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}(\omega')}{\omega - \omega'} \dd{\omega'}}
+ }
+\end{aligned}$$
+
+By inserting $\tilde{\chi}(\omega) = \tilde{\chi}_r(\omega) + i \tilde{\chi}_i(\omega)$
+and splitting the equation into real and imaginary parts,
+we get the Kramers-Kronig relations:
+
+$$\begin{aligned}
+ \boxed{
+ \begin{aligned}
+ \tilde{\chi}_r(\omega)
+ &= \mathrm{sgn}(s) \frac{1}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}_i(\omega')}{\omega' - \omega} \dd{\omega'}}
+ \\
+ \tilde{\chi}_i(\omega)
+ &= - \mathrm{sgn}(s) \frac{1}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}_r(\omega')}{\omega' - \omega} \dd{\omega'}}
+ \end{aligned}
+ }
+\end{aligned}$$
+
+If the time-domain response function $\chi(t)$ is real
+(so far we have assumed it to be complex),
+then we can take advantage of the fact that
+the FT of a real function satisfies
+$\tilde{\chi}(-\omega) = \tilde{\chi}^*(\omega)$, i.e. $\tilde{\chi}_r(\omega)$
+is even and $\tilde{\chi}_i(\omega)$ is odd. We multiply the fractions by
+$(\omega' + \omega)$ above and below:
+
+$$\begin{aligned}
+ \tilde{\chi}_r(\omega)
+ &= \mathrm{sgn}(s) \bigg( \frac{1}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\omega' \tilde{\chi}_i(\omega')}{{\omega'}^2 - \omega^2} \dd{\omega'}}
+ + \frac{\omega}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}_i(\omega')}{{\omega'}^2 - \omega^2} \dd{\omega'}} \bigg)
+ \\
+ \tilde{\chi}_i(\omega)
+ &= - \mathrm{sgn}(s) \bigg( \frac{1}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\omega' \tilde{\chi}_r(\omega')}{{\omega'}^2 - \omega^2} \dd{\omega'}}
+ + \frac{\omega}{\pi} \: \pv{\int_{-\infty}^\infty \frac{\tilde{\chi}_r(\omega')}{{\omega'}^2 - \omega^2} \dd{\omega'}} \bigg)
+\end{aligned}$$
+
+For $\tilde{\chi}_r(\omega)$, the second integrand is odd, so we can drop it.
+Similarly, for $\tilde{\chi}_i(\omega)$, the first integrand is odd.
+We therefore find the following variant of the Kramers-Kronig relations:
+
+$$\begin{aligned}
+ \boxed{
+ \begin{aligned}
+ \tilde{\chi}_r(\omega)
+ &= \mathrm{sgn}(s) \frac{2}{\pi} \: \pv{\int_0^\infty \frac{\omega' \tilde{\chi}_i(\omega')}{{\omega'}^2 - \omega^2} \dd{\omega'}}
+ \\
+ \tilde{\chi}_i(\omega)
+ &= - \mathrm{sgn}(s) \frac{2 \omega}{\pi} \: \pv{\int_0^\infty \frac{\tilde{\chi}_r(\omega')}{{\omega'}^2 - \omega^2} \dd{\omega'}}
+ \end{aligned}
+ }
+\end{aligned}$$
+
+To reiterate: this version is only valid if $\chi(t)$ is real in the time domain.
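The relations can be verified numerically (an editorial sketch, not part of these notes; assumes Python with numpy and the convention $s = +1$, i.e. $\tilde{\chi}(\omega) = \int \chi(t) e^{i \omega t} \dd{t}$). For the damped oscillator $\chi(t) = \Theta(t) e^{-\gamma t} \sin(\Omega t)$, the transform is $\Omega / (\omega_0^2 - \omega^2 - 2 i \gamma \omega)$ with $\omega_0^2 = \Omega^2 + \gamma^2$, so both sides of the first relation are known in closed form:

```python
import numpy as np

# Damped-oscillator response: chi_r and chi_i of its Fourier transform,
# with D = (w0^2 - w^2)^2 + 4 g^2 w^2.
gamma, w0 = 0.2, 2.0
Omega = np.sqrt(w0**2 - gamma**2)
D = lambda w: (w0**2 - w**2)**2 + 4 * gamma**2 * w**2
chi_r = lambda w: Omega * (w0**2 - w**2) / D(w)
chi_i = lambda w: 2 * Omega * gamma * w / D(w)

# Principal value on a uniform grid: place w on the grid and drop the
# singular sample; the symmetric neighbours then cancel, as in pv.
h, W, w = 0.01, 500.0, 1.0
wp = np.arange(-W, W + h / 2, h)
keep = np.abs(wp - w) > h / 2
kk = np.sum(chi_i(wp[keep]) / (wp[keep] - w)) * h / np.pi

print(kk, chi_r(w))   # both ~ 0.65
```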
diff --git a/content/know/concept/schwartz-distribution/index.pdc b/content/know/concept/schwartz-distribution/index.pdc
new file mode 100644
index 0000000..2d9f9df
--- /dev/null
+++ b/content/know/concept/schwartz-distribution/index.pdc
@@ -0,0 +1,119 @@
+---
+title: "Schwartz distribution"
+firstLetter: "S"
+publishDate: 2021-02-25
+categories:
+- Mathematics
+
+date: 2021-02-25T13:47:16+01:00
+draft: false
+markup: pandoc
+---
+
+# Schwartz distribution
+
+A **Schwartz distribution**, also known as a **generalized function**,
+is a generalization of a function,
+allowing us to work with otherwise pathological definitions.
+
+Notable examples of distributions are
+the [Dirac delta function](/know/concept/dirac-delta-function/)
+and the [Heaviside step function](/know/concept/heaviside-step-function/),
+whose unusual properties are justified by this generalization.
+
+We define the **Schwartz space** $\mathcal{S}$ of functions,
+whose members are often called **test functions**.
+Every such $\phi(x) \in \mathcal{S}$ must satisfy
+the following constraint for any $p, q \in \mathbb{N}$:
+
+$$\begin{aligned}
+    \sup_{x \in \mathbb{R}} \big| x^p \phi^{(q)}(x) \big| < \infty
+\end{aligned}$$
+
+In other words, a test function and its derivatives
+decay faster than any polynomial.
+Furthermore, all test functions must be infinitely differentiable.
+These are quite strict requirements.
+
+The **space of distributions** $\mathcal{S}'$ (note the prime)
+is then said to consist of *functionals* $f[\phi]$
+which map a test function $\phi$ from $\mathcal{S}$,
+to a number from $\mathbb{C}$,
+which is often written as $\braket{f}{\phi}$.
+This notation looks like the inner product of
+a [Hilbert space](/know/concept/hilbert-space/),
+for good reason: any well-behaved function $f(x)$ can be embedded
+into $\mathcal{S}'$ by defining the corresponding functional $f[\phi]$ as follows:
+
+$$\begin{aligned}
+ f[\phi]
+ = \braket{f}{\phi}
+ = \int_{-\infty}^\infty f(x) \: \phi(x) \dd{x}
+\end{aligned}$$
+
+Not all functionals qualify for $\mathcal{S}'$:
+they also need to be linear in $\phi$, and **continuous**,
+which in this context means: if a sequence $\phi_n$
+converges to $\phi$, then $\braket{f}{\phi_n}$
+converges to $\braket{f}{\phi}$ for all $f$.
+
+The power of this generalization is that $f(x)$ does not need to be well-behaved:
+for example, the Dirac delta function can also be used,
+whose definition is nonsensical *outside* of an integral,
+but perfectly reasonable *inside* one.
+By treating it as a distribution,
+we gain the ability to sanely define e.g. its derivatives.
+
+Using the example of embedding a well-behaved function $f(x)$ into $\mathcal{S}'$,
+we can work out what the derivative of a distribution is:
+
+$$\begin{aligned}
+ \braket{f'}{\phi}
+ = \int_{-\infty}^\infty f'(x) \: \phi(x) \dd{x}
+ = \Big[ f(x) \: \phi(x) \Big]_{-\infty}^\infty - \int_{-\infty}^\infty f(x) \: \phi'(x) \dd{x}
+\end{aligned}$$
+
+The test function removes the boundary term, yielding the result
+$- \braket{f}{\phi'}$. Although this was an example for a specific $f(x)$,
+we use it to define the derivative of any distribution:
+
+$$\begin{aligned}
+ \boxed{
+ \braket{f'}{\phi} = - \braket{f}{\phi'}
+ }
+\end{aligned}$$
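This defining relation can be checked numerically for $f = \Theta$ (an editorial sketch, not part of these notes; assumes Python with numpy and scipy): then $\braket{\Theta'}{\phi} = - \braket{\Theta}{\phi'} = - \int_0^\infty \phi'(x) \dd{x} = \phi(0)$, which is exactly the pairing $\braket{\delta}{\phi}$:

```python
import numpy as np
from scipy.integrate import quad

# Gaussian test function (a Schwartz function) and its derivative.
phi = lambda x: np.exp(-x**2)
dphi = lambda x: -2 * x * np.exp(-x**2)

# <Theta', phi> = -<Theta, phi'> = -integral of phi'(x) over (0, inf),
# which should equal phi(0) = <delta, phi>.
pairing, _ = quad(lambda x: -dphi(x), 0, np.inf)
print(pairing, phi(0.0))  # both ~ 1.0
```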
+
+Using the same trick, we can find the
+[Fourier transform](/know/concept/fourier-transform/) (FT)
+of a generalized function.
+We define the FT as follows,
+but be prepared for some switching of the names $k$ and $x$:
+
+$$\begin{aligned}
+ \tilde{\phi}(x)
+ = \int_{-\infty}^\infty \phi(k) \exp(- i k x) \dd{k}
+\end{aligned}$$
+
+The FT of a Schwartz distribution $f$ then turns out to be as follows:
+
+$$\begin{aligned}
+ \braket*{\tilde{f}}{\phi}
+ &= \int_{-\infty}^\infty \tilde{f}(k) \: \phi(k) \dd{k}
+ = \iint_{-\infty}^\infty f(x) \exp(- i k x) \: \phi(k) \dd{x} \dd{k}
+ \\
+ &= \int_{-\infty}^\infty f(x) \: \tilde{\phi}(x) \dd{x}
+ = \braket*{f}{\tilde{\phi}}
+\end{aligned}$$
+
+Note that the ordinary FT $\tilde{f}(k) = \hat{\mathcal{F}}\{f(x)\}$ is
+already a 1:1 mapping of test functions $\phi \leftrightarrow \tilde{\phi}$.
+As it turns out,
+in this generalization it is also a 1:1 mapping of distributions in $\mathcal{S}'$,
+defined as:
+
+$$\begin{aligned}
+ \boxed{
+ \braket*{\tilde{f}}{\phi}
+ = \braket*{f}{\tilde{\phi}}
+ }
+\end{aligned}$$