| author | Prefetch | 2022-01-08 14:09:13 +0100 |
|---|---|---|
| committer | Prefetch | 2022-01-08 14:13:44 +0100 |
| commit | 7c2d27ca89c5b096694b950c766e50df2dc87001 (patch) | |
| tree | ed9ee3c02fe746350b9e0714f4648a554ade52b0 /content/know/concept/ito-process | |
| parent | 63966407338ed0bdb061ddfd67f8940c2ccb51d2 (diff) | |
Minor fixes, rename "Ion Sound Wave" and "Ito Process"
Diffstat (limited to 'content/know/concept/ito-process')
-rw-r--r-- | content/know/concept/ito-process/index.pdc | 367 |
1 files changed, 367 insertions, 0 deletions
diff --git a/content/know/concept/ito-process/index.pdc b/content/know/concept/ito-process/index.pdc
new file mode 100644
index 0000000..d27a2fb
--- /dev/null
+++ b/content/know/concept/ito-process/index.pdc
@@ -0,0 +1,367 @@
---
title: "Itō process"
firstLetter: "I"
publishDate: 2021-11-06
categories:
- Mathematics
- Stochastic analysis

date: 2021-11-06T14:34:00+01:00
draft: false
markup: pandoc
---

# Itō process

Given two [stochastic processes](/know/concept/stochastic-process/)
$F_t$ and $G_t$, consider the following random variable $X_t$,
where $B_t$ is the [Wiener process](/know/concept/wiener-process/),
i.e. Brownian motion:

$$\begin{aligned}
    X_t
    = X_0 + \int_0^t F_s \dd{s} + \int_0^t G_s \dd{B_s}
\end{aligned}$$

Where the latter is an [Itō integral](/know/concept/ito-integral/),
assuming $G_t$ is Itō-integrable.
We call $X_t$ an **Itō process** if $F_t$ is locally integrable,
and the initial condition $X_0$ is known,
i.e. $X_0$ is $\mathcal{F}_0$-measurable,
where $\mathcal{F}_t$ is the filtration
to which $F_t$, $G_t$ and $B_t$ are adapted.
The above definition of $X_t$ is often abbreviated as follows,
where $X_0$ is implicit:

$$\begin{aligned}
    \dd{X_t}
    = F_t \dd{t} + G_t \dd{B_t}
\end{aligned}$$

Typically, $F_t$ is referred to as the **drift** of $X_t$,
and $G_t$ as its **intensity**.
Because the Itō integral of $G_t$ is a
[martingale](/know/concept/martingale/),
it does not contribute to the mean of $X_t$:

$$\begin{aligned}
    \mathbf{E}[X_t]
    = \mathbf{E}[X_0] + \int_0^t \mathbf{E}[F_s] \dd{s}
\end{aligned}$$

Now, consider the following **Itō stochastic differential equation** (SDE),
where $\xi_t = \dv*{B_t}{t}$ is white noise,
informally treated as the $t$-derivative of $B_t$:

$$\begin{aligned}
    \dv{X_t}{t}
    = f(X_t, t) + g(X_t, t) \: \xi_t
\end{aligned}$$

An Itō process $X_t$ is said to satisfy this equation
if $f(X_t, t) = F_t$ and $g(X_t, t) = G_t$,
in which case $X_t$ is also called an **Itō diffusion**.
All Itō diffusions are [Markov processes](/know/concept/markov-process/),
since only the current value of $X_t$ determines the future,
and $B_t$ is also a Markov process.
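As a quick numerical illustration of these definitions, the sketch below simulates the abbreviated SDE $\dd{X_t} = F_t \dd{t} + G_t \dd{B_t}$ with the simplest explicit scheme (Euler–Maruyama), for the arbitrary time-independent choices $f(x) = -x$ and $g(x) = 1/2$, and checks that the Itō-integral term indeed does not shift the mean. NumPy, the helper name `euler_maruyama`, and all parameter values are assumptions made for this example, not part of the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(f, g, x0, t_max, dt, n_paths):
    """Simulate dX_t = f(X_t) dt + g(X_t) dB_t for many sample paths."""
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(int(t_max / dt)):
        db = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # dB_t ~ N(0, dt)
        x += f(x) * dt + g(x) * db
    return x

# Hypothetical drift and intensity: f(x) = -x, g(x) = 1/2.
f = lambda x: -x
g = lambda x: 0.5 * np.ones_like(x)

x_T = euler_maruyama(f, g, x0=1.0, t_max=2.0, dt=1e-3, n_paths=100_000)

# The Itō integral of G_t is a martingale, so E[X_t] should obey
# dm/dt = E[f(X_t)] = -m (exact here because f is linear), i.e. m(t) = e^{-t}.
print(np.mean(x_T))   # ~ 0.135, up to Monte Carlo and discretization error
print(np.exp(-2.0))   # 0.1353...
```

Because $f$ is linear here, the sample mean should land near $e^{-2}$ regardless of the noise intensity, which is exactly the statement that the Wiener term only adds variance, not drift.
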
## Itō's lemma

Classically, given $y \equiv h(x(t), t)$,
the chain rule of differentiation states that:

$$\begin{aligned}
    \dd{y}
    = \pdv{h}{t} \dd{t} + \pdv{h}{x} \dd{x}
\end{aligned}$$

However, for a stochastic process $Y_t \equiv h(X_t, t)$,
where $X_t$ is an Itō process,
the chain rule is modified to the following,
known as **Itō's lemma**:

$$\begin{aligned}
    \boxed{
        \dd{Y_t}
        = \bigg( \pdv{h}{t} + \pdv{h}{x} F_t + \frac{1}{2} \pdv[2]{h}{x} G_t^2 \bigg) \dd{t} + \pdv{h}{x} G_t \dd{B_t}
    }
\end{aligned}$$

<div class="accordion">
<input type="checkbox" id="proof-lemma"/>
<label for="proof-lemma">Proof</label>
<div class="hidden">
<label for="proof-lemma">Proof.</label>
We start by applying the classical chain rule,
but we go to second order in $x$.
This is also valid classically,
but there we would neglect all higher-order infinitesimals:

$$\begin{aligned}
    \dd{Y_t}
    = \pdv{h}{t} \dd{t} + \pdv{h}{x} \dd{X_t} + \frac{1}{2} \pdv[2]{h}{x} \dd{X_t}^2
\end{aligned}$$

But here we cannot neglect $\dd{X_t}^2$.
We insert the definition of an Itō process:

$$\begin{aligned}
    \dd{Y_t}
    &= \pdv{h}{t} \dd{t} + \pdv{h}{x} \Big( F_t \dd{t} + G_t \dd{B_t} \Big)
    + \frac{1}{2} \pdv[2]{h}{x} \Big( F_t \dd{t} + G_t \dd{B_t} \Big)^2
    \\
    &= \pdv{h}{t} \dd{t} + \pdv{h}{x} \Big( F_t \dd{t} + G_t \dd{B_t} \Big)
    + \frac{1}{2} \pdv[2]{h}{x} \Big( F_t^2 \dd{t}^2 + 2 F_t G_t \dd{t} \dd{B_t} + G_t^2 \dd{B_t}^2 \Big)
\end{aligned}$$

In the limit of small $\dd{t}$, we can neglect $\dd{t}^2$,
and as it turns out, $\dd{t} \dd{B_t}$ too:

$$\begin{aligned}
    \dd{t} \dd{B_t}
    &= (B_{t + \dd{t}} - B_t) \dd{t}
    \sim \dd{t} \mathcal{N}(0, \dd{t})
    \sim \mathcal{N}(0, \dd{t}^3)
    \longrightarrow 0
\end{aligned}$$

However, due to the scaling property of $B_t$,
we cannot ignore $\dd{B_t}^2$, which has order $\dd{t}$:

$$\begin{aligned}
    \dd{B_t}^2
    &= (B_{t + \dd{t}} - B_t)^2
    \sim \big( \mathcal{N}(0, \dd{t}) \big)^2
    \sim \chi^2_1(\dd{t})
    \longrightarrow \dd{t}
\end{aligned}$$

Where $\chi_1^2(\dd{t})$ is the generalized chi-squared distribution
with one term of variance $\dd{t}$.
Substituting $\dd{B_t}^2 \to \dd{t}$ and discarding the vanishing terms
then yields Itō's lemma as stated above.
</div>
</div>
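To see the extra $\frac{1}{2} \pdv[2]{h}{x} G_t^2$ term at work, the rough sketch below (assuming NumPy; the parameters $\mu = 0.1$ and $\sigma = 0.4$ are arbitrary, and the example is not taken from the text) simulates geometric Brownian motion, $\dd{X_t} = \mu X_t \dd{t} + \sigma X_t \dd{B_t}$, and applies $h(x) = \ln(x)$. Itō's lemma predicts $\dd{Y_t} = (\mu - \tfrac{1}{2} \sigma^2) \dd{t} + \sigma \dd{B_t}$, so $\mathbf{E}[\ln X_T]$ should grow at the rate $\mu - \tfrac{1}{2} \sigma^2$, not $\mu$:

```python
import numpy as np

rng = np.random.default_rng(1)

mu, sigma = 0.1, 0.4          # arbitrary drift and intensity parameters
t_max, dt = 1.0, 1e-3
n_paths = 100_000

x = np.ones(n_paths)          # X_0 = 1, so Y_0 = ln(X_0) = 0
for _ in range(int(t_max / dt)):
    db = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    x += mu * x * dt + sigma * x * db          # dX = mu X dt + sigma X dB

# Itō's lemma for Y = ln(X): dY = (mu - sigma^2/2) dt + sigma dB,
# so E[Y_T] = (mu - sigma^2/2) T rather than the naive mu T.
print(np.mean(np.log(x)))                      # ~ 0.02, up to sampling error
print((mu - 0.5 * sigma**2) * t_max)           # 0.02
print(mu * t_max)                              # 0.10, the classical chain rule's guess
```

The gap between $0.02$ and $0.10$ is much larger than the Monte Carlo error here, which makes the correction term easy to spot.
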
The most important application of Itō's lemma
is to perform coordinate transformations,
to make the solution of a given Itō SDE easier.


## Coordinate transformations

The simplest coordinate transformation is a scaling of the time axis.
Defining $s \equiv \alpha t$, the goal is to end up
with another Itō process on the rescaled axis.
We know how to scale $B_t$: by setting $W_s \equiv \sqrt{\alpha} B_{s / \alpha}$,
we obtain a valid Wiener process in $s$.
Let $Y_s \equiv X_t$ be the new variable on the rescaled axis, then:

$$\begin{aligned}
    \dd{Y_s}
    = \dd{X_t}
    &= f(X_t) \dd{t} + g(X_t) \dd{B_t}
    \\
    &= \frac{1}{\alpha} f(Y_s) \dd{s} + \frac{1}{\sqrt{\alpha}} g(Y_s) \dd{W_s}
\end{aligned}$$

Since $W_s$ is a valid Wiener process,
and the drift and intensity are merely rescaled,
this is still an Itō process.

To solve SDEs analytically, it is usually best
to have additive noise, i.e. $g = 1$.
This can be achieved using the **Lamperti transform**:
define $Y_t \equiv h(X_t)$, where $h$ is given by:

$$\begin{aligned}
    \boxed{
        h(x)
        = \int_{x_0}^x \frac{1}{g(y)} \dd{y}
    }
\end{aligned}$$

Then, using Itō's lemma, it is straightforward
to show that the intensity becomes $1$.
Note that the lower integration limit $x_0$ does not enter:

$$\begin{aligned}
    \dd{Y_t}
    &= \bigg( f(X_t) \: h'(X_t) + \frac{1}{2} g^2(X_t) \: h''(X_t) \bigg) \dd{t} + g(X_t) \: h'(X_t) \dd{B_t}
    \\
    &= \bigg( \frac{f(X_t)}{g(X_t)} - \frac{1}{2} g^2(X_t) \frac{g'(X_t)}{g^2(X_t)} \bigg) \dd{t} + \frac{g(X_t)}{g(X_t)} \dd{B_t}
    \\
    &= \bigg( \frac{f(X_t)}{g(X_t)} - \frac{1}{2} g'(X_t) \bigg) \dd{t} + \dd{B_t}
\end{aligned}$$

Similarly, we can eliminate the drift, i.e. obtain $f = 0$,
thereby making the Itō process a martingale.
This is done by defining $Y_t \equiv h(X_t)$, with $h(x)$ given by:

$$\begin{aligned}
    \boxed{
        h(x)
        = \int_{x_0}^x \exp\!\bigg( \!-\!\! \int_{x_1}^y \frac{2 f(z)}{g^2(z)} \dd{z} \bigg) \dd{y}
    }
\end{aligned}$$

The goal is to make the parenthesized first term (see above)
of Itō's lemma disappear, which this $h(x)$ does indeed do.
Note that $x_0$ and $x_1$ do not enter:

$$\begin{aligned}
    0
    &= f(x) \: h'(x) + \frac{1}{2} g^2(x) \: h''(x)
    \\
    &= \Big( f(x) - \frac{1}{2} g^2(x) \frac{2 f(x)}{g^2(x)} \Big) \exp\!\bigg( \!-\!\! \int_{x_1}^x \frac{2 f(y)}{g^2(y)} \dd{y} \bigg)
\end{aligned}$$
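Continuing the geometric-Brownian-motion example from before (again with arbitrary parameters and assuming NumPy; none of this is from the original text), the Lamperti transform for $g(x) = \sigma x$ is $h(x) = \ln(x) / \sigma$ up to an irrelevant constant, and the result above predicts $\dd{Y_t} = (\mu / \sigma - \tfrac{1}{2} \sigma) \dd{t} + \dd{B_t}$. A sketch that checks the unit intensity by verifying $\mathrm{Var}[Y_T] \approx T$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical SDE: dX = mu X dt + sigma X dB (geometric Brownian motion).
mu, sigma = 0.1, 0.4
t_max, dt, n_paths = 1.0, 1e-3, 100_000

x = np.ones(n_paths)
for _ in range(int(t_max / dt)):
    db = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    x += mu * x * dt + sigma * x * db

# Lamperti transform h(x) = ln(x) / sigma, so Y should obey
# dY = (mu/sigma - sigma/2) dt + dB: constant drift, unit intensity.
y = np.log(x) / sigma
print(np.mean(y), (mu / sigma - sigma / 2) * t_max)  # ~ 0.05 vs 0.05
print(np.var(y), t_max)                              # ~ 1.0 vs 1.0 (unit intensity)
```

Because the transformed drift is constant in this particular case, $Y_T$ is simply Gaussian with variance $T$; for a nonlinear $g$ the same recipe applies, but the drift $f/g - \tfrac{1}{2} g'$ is generally state-dependent.
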
## Existence and uniqueness

It is worth knowing under what conditions a solution to a given SDE exists,
in the sense that it is finite on the entire time axis.
Suppose the drift $f$ and intensity $g$ satisfy these inequalities,
for some known constant $K$ and for all $x$:

$$\begin{aligned}
    x f(x) \le K (1 + x^2)
    \qquad \quad
    g^2(x) \le K (1 + x^2)
\end{aligned}$$

When this is satisfied, we can find the following upper bound
on an Itō process $X_t$,
which clearly implies that $X_t$ is finite for all $t$:

$$\begin{aligned}
    \boxed{
        \mathbf{E}[X_t^2]
        \le \big(X_0^2 + 3 K t\big) \exp\!\big(3 K t\big)
    }
\end{aligned}$$

<div class="accordion">
<input type="checkbox" id="proof-existence"/>
<label for="proof-existence">Proof</label>
<div class="hidden">
<label for="proof-existence">Proof.</label>
If we define $Y_t \equiv X_t^2$,
then Itō's lemma tells us that the following holds:

$$\begin{aligned}
    \dd{Y_t}
    = \big( 2 X_t \: f(X_t) + g^2(X_t) \big) \dd{t} + 2 X_t \: g(X_t) \dd{B_t}
\end{aligned}$$

Integrating and taking the expectation value
removes the Wiener term, leaving:

$$\begin{aligned}
    \mathbf{E}[Y_t]
    = Y_0 + \mathbf{E}\! \int_0^t 2 X_s f(X_s) + g^2(X_s) \dd{s}
\end{aligned}$$

Given that $K (1 \!+\! x^2)$ is an upper bound of $x f(x)$ and $g^2(x)$,
we get an inequality:

$$\begin{aligned}
    \mathbf{E}[Y_t]
    &\le Y_0 + \mathbf{E}\! \int_0^t 2 K (1 \!+\! X_s^2) + K (1 \!+\! X_s^2) \dd{s}
    \\
    &\le Y_0 + \int_0^t 3 K (1 + \mathbf{E}[Y_s]) \dd{s}
    \\
    &\le Y_0 + 3 K t + \int_0^t 3 K \big( \mathbf{E}[Y_s] \big) \dd{s}
\end{aligned}$$

We then apply the
[Grönwall-Bellman inequality](/know/concept/gronwall-bellman-inequality/),
noting that $(Y_0 \!+\! 3 K t)$ does not decrease with time, leading us to:

$$\begin{aligned}
    \mathbf{E}[Y_t]
    &\le (Y_0 + 3 K t) \exp\!\bigg( \int_0^t 3 K \dd{s} \bigg)
    \\
    &\le (Y_0 + 3 K t) \exp\!\big(3 K t\big)
\end{aligned}$$
</div>
</div>

If a solution exists, it is also worth knowing whether it is unique.
Suppose that $f$ and $g$ satisfy the following inequalities,
for some constant $K$ and for all $x$ and $y$:

$$\begin{aligned}
    \big| f(x) - f(y) \big| \le K \big| x - y \big|
    \qquad \quad
    \big| g(x) - g(y) \big| \le K \big| x - y \big|
\end{aligned}$$

Let $X_t$ and $Y_t$ both be solutions to a given SDE,
but the initial conditions need not be the same,
such that the difference is initially $X_0 \!-\! Y_0$.
Then the difference $X_t \!-\! Y_t$ is bounded by:

$$\begin{aligned}
    \boxed{
        \mathbf{E}\big[ (X_t - Y_t)^2 \big]
        \le (X_0 - Y_0)^2 \exp\!\Big( \big(2 K \!+\! K^2 \big) t \Big)
    }
\end{aligned}$$

<div class="accordion">
<input type="checkbox" id="proof-uniqueness"/>
<label for="proof-uniqueness">Proof</label>
<div class="hidden">
<label for="proof-uniqueness">Proof.</label>
We define $D_t \equiv X_t \!-\! Y_t$ and $Z_t \equiv D_t^2 \ge 0$,
together with $F_t \equiv f(X_t) \!-\! f(Y_t)$ and $G_t \equiv g(X_t) \!-\! g(Y_t)$,
such that Itō's lemma states:

$$\begin{aligned}
    \dd{Z_t}
    = \big( 2 D_t F_t + G_t^2 \big) \dd{t} + 2 D_t G_t \dd{B_t}
\end{aligned}$$

Integrating and taking the expectation value
removes the Wiener term, leaving:

$$\begin{aligned}
    \mathbf{E}[Z_t]
    = Z_0 + \mathbf{E}\! \int_0^t 2 D_s F_s + G_s^2 \dd{s}
\end{aligned}$$

Since $D_s F_s \le |D_s| |F_s|$,
and the Lipschitz conditions given above imply
$|F_s| \le K |D_s|$ and $|G_s| \le K |D_s|$, we get:

$$\begin{aligned}
    \mathbf{E}[Z_t]
    &\le Z_0 + \mathbf{E}\! \int_0^t 2 K D_s^2 + K^2 D_s^2 \dd{s}
    \\
    &\le Z_0 + \int_0^t (2 K \!+\! K^2) \: \mathbf{E}[Z_s] \dd{s}
\end{aligned}$$

Where we have also used that $|D_s|^2 = D_s^2$, since $D_s$ is real.
We then apply the
[Grönwall-Bellman inequality](/know/concept/gronwall-bellman-inequality/),
recognizing that $Z_0$ does not decrease with time (since it is constant):

$$\begin{aligned}
    \mathbf{E}[Z_t]
    &\le Z_0 \exp\!\bigg( \int_0^t 2 K \!+\! K^2 \dd{s} \bigg)
    \\
    &\le Z_0 \exp\!\Big( \big( 2 K \!+\! K^2 \big) t \Big)
\end{aligned}$$
</div>
</div>

Using these properties, it can then be shown
that if all of the above conditions are satisfied,
then the SDE has a unique solution,
which is $\mathcal{F}_t$-adapted, continuous, and exists for all times.



## References
1.  U.H. Thygesen,
    *Lecture notes on diffusions and stochastic differential equations*,
    2021, Polyteknisk Kompendie.