From 9d9693af6fb94ef4404a3c2399cb38842e5ca822 Mon Sep 17 00:00:00 2001
From: Prefetch
Date: Fri, 12 May 2023 21:19:19 +0200
Subject: Improve knowledge base

---
 source/know/concept/gram-schmidt-method/index.md | 59 +++++++++++++++++-------
 1 file changed, 43 insertions(+), 16 deletions(-)

diff --git a/source/know/concept/gram-schmidt-method/index.md b/source/know/concept/gram-schmidt-method/index.md
index 70ad512..a62522e 100644
--- a/source/know/concept/gram-schmidt-method/index.md
+++ b/source/know/concept/gram-schmidt-method/index.md
@@ -8,39 +8,66 @@ categories:
 layout: "concept"
 ---
 
-Given a set of linearly independent non-orthonormal vectors
-$$\ket{V_1}, \ket{V_2}, ...$$ from a [Hilbert space](/know/concept/hilbert-space/),
-the **Gram-Schmidt method**
-turns them into an orthonormal set $$\ket{n_1}, \ket{n_2}, ...$$ as follows:
+Given a set of $$N$$ linearly independent vectors $$\ket{V_1}, \ket{V_2}, ...$$
+from a [Hilbert space](/know/concept/hilbert-space/),
+the **Gram-Schmidt method** is an algorithm that turns them
+into an orthonormal set $$\ket{n_1}, \ket{n_2}, ...$$ as follows
+(in [Dirac notation](/know/concept/dirac-notation/)):
 
 1. Take the first vector $$\ket{V_1}$$ and normalize it to get $$\ket{n_1}$$:
 
    $$\begin{aligned}
-       \ket{n_1} = \frac{\ket{V_1}}{\sqrt{\inprod{V_1}{V_1}}}
+       \ket{n_1}
+       = \frac{\ket{V_1}}{\sqrt{\inprod{V_1}{V_1}}}
    \end{aligned}$$
 
-2. Begin loop. Take the next non-orthonormal vector $$\ket{V_j}$$, and
+2. Begin loop. Take the next input vector $$\ket{V_j}$$, and
    subtract from it its projection onto every already-processed vector:
 
    $$\begin{aligned}
-       \ket{n_j'} = \ket{V_j} - \ket{n_1} \inprod{n_1}{V_j} - \ket{n_2} \inprod{n_2}{V_j} - ... - \ket{n_{j-1}} \inprod{n_{j-1}}{V_{j-1}}
+       \ket{g_j}
+       = \ket{V_j} - \ket{n_1} \inprod{n_1}{V_j} - \ket{n_2} \inprod{n_2}{V_j} - ... - \ket{n_{j-1}} \inprod{n_{j-1}}{V_j}
    \end{aligned}$$
 
-   This leaves only the part of $$\ket{V_j}$$ which is orthogonal to
-   $$\ket{n_1}$$, $$\ket{n_2}$$, etc. This why the input vectors must be
-   linearly independent; otherwise $$\Ket{n_j'}$$ may become zero at some
-   point.
+   This leaves only the part of $$\ket{V_j}$$
+   that is orthogonal to all previous $$\ket{n_k}$$.
+   This is why the input vectors must be linearly independent;
+   otherwise $$\ket{g_j}$$ could become zero.
 
-3. Normalize the resulting ortho*gonal* vector $$\ket{n_j'}$$ to make it
+   On a computer, the resulting $$\ket{g_j}$$ will
+   not be perfectly orthogonal due to rounding errors.
+   The formulation of step #2 given above is especially sensitive to this.
+   A better approach is:
+
+   $$\begin{aligned}
+       \ket{g_j^{(1)}}
+       &= \ket{V_j} - \ket{n_1} \inprod{n_1}{V_j}
+       \\
+       \ket{g_j^{(2)}}
+       &= \ket{g_j^{(1)}} - \ket{n_2} \inprod{n_2}{g_j^{(1)}}
+       \\
+       \vdots
+       \\
+       \ket{g_j}
+       = \ket{g_j^{(j-1)}}
+       &= \ket{g_j^{(j-2)}} - \ket{n_{j-1}} \inprod{n_{j-1}}{g_j^{(j-2)}}
+   \end{aligned}$$
+
+   In other words, instead of projecting $$\ket{V_j}$$ directly onto each $$\ket{n_k}$$,
+   we project only the part of $$\ket{V_j}$$ that has already been made orthogonal
+   to all previous $$\ket{n_m}$$ with $$m < k$$.
+   This is known as the **modified Gram-Schmidt method**.
+
+3. Normalize the resulting ortho*gonal* vector $$\ket{g_j}$$ to make it
    ortho*normal*:
 
    $$\begin{aligned}
-       \ket{n_j} = \frac{\ket{n_j'}}{\sqrt{\inprod{n_j'}{n_j'}}}
+       \ket{n_j}
+       = \frac{\ket{g_j}}{\sqrt{\inprod{g_j}{g_j}}}
    \end{aligned}$$
 
-4. Loop back to step 2, taking the next vector $$\ket{V_{j+1}}$$.
-
-If you are unfamiliar with this notation, take a look at [Dirac notation](/know/concept/dirac-notation/).
+4. Loop back to step 2, taking the next vector $$\ket{V_{j+1}}$$,
+   until all $$N$$ have been processed.
-- 
cgit v1.2.3
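As a concrete illustration (not part of the patch itself), the modified Gram-Schmidt loop described above can be sketched in Python with NumPy. The function name `modified_gram_schmidt` is my own; real-valued vectors are assumed for simplicity, so the inner product needs no complex conjugation:

```python
import numpy as np

def modified_gram_schmidt(V):
    """Orthonormalize the columns of V using modified Gram-Schmidt.

    V: (dim, N) array whose columns are linearly independent vectors.
    Returns a (dim, N) array Q with orthonormal columns.
    """
    V = np.asarray(V, dtype=float)
    dim, N = V.shape
    Q = np.zeros((dim, N))
    for j in range(N):
        # Step 2: start from V_j and subtract projections one at a time,
        # each from the already partially-orthogonalized g (the "modified"
        # part: project g, not the original V_j, onto each n_k).
        g = V[:, j].copy()
        for k in range(j):
            # For complex vectors, this would need np.conj(Q[:, k]).
            g -= Q[:, k] * (Q[:, k] @ g)
        # A (near-)zero norm means the inputs were linearly dependent.
        norm = np.linalg.norm(g)
        if norm < 1e-12:
            raise ValueError("input vectors are linearly dependent")
        # Step 3: normalize the orthogonal g to make it orthonormal.
        Q[:, j] = g / norm
    return Q
```

After running, `Q.T @ Q` should be (numerically close to) the identity matrix, which is a convenient way to check the result.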