path: root/content/know/concept/calculus-of-variations/index.pdc
author     Prefetch    2021-02-25 16:14:20 +0100
committer  Prefetch    2021-02-25 16:14:20 +0100
commit     c705ac1d7dc74709835a8c48fae4a7dd70dc5c49 (patch)
tree       5f618f86bd96649c796a5ccf4bf1542046a02561 /content/know/concept/calculus-of-variations/index.pdc
parent     c157ad913aa9f975ea8c137e24175d134486f462 (diff)
Expand knowledge base
Diffstat (limited to 'content/know/concept/calculus-of-variations/index.pdc')
-rw-r--r--  content/know/concept/calculus-of-variations/index.pdc  40
1 file changed, 20 insertions(+), 20 deletions(-)
diff --git a/content/know/concept/calculus-of-variations/index.pdc b/content/know/concept/calculus-of-variations/index.pdc
index fb043e0..c5280e5 100644
--- a/content/know/concept/calculus-of-variations/index.pdc
+++ b/content/know/concept/calculus-of-variations/index.pdc
@@ -29,18 +29,18 @@ the path $f(x)$ taken by a physical system,
the **principle of least action** states that $f$ will be a minimum of $J[f]$,
so for example the expended energy will be minimized.
-If $f(x, \alpha\!=\!0)$ is the optimal route, then a slightly
+If $f(x, \varepsilon\!=\!0)$ is the optimal route, then a slightly
different (and therefore worse) path between the same two points can be expressed
-using the parameter $\alpha$:
+using the parameter $\varepsilon$:
$$\begin{aligned}
- f(x, \alpha) = f(x, 0) + \alpha \eta(x)
+ f(x, \varepsilon) = f(x, 0) + \varepsilon \eta(x)
\qquad \mathrm{or} \qquad
- \delta f = \alpha \eta(x)
+ \delta f = \varepsilon \eta(x)
\end{aligned}$$
Where $\eta(x)$ is an arbitrary differentiable deviation.
-Since $f(x, \alpha)$ must start and end at the same points as $f(x,0)$,
+Since $f(x, \varepsilon)$ must start and end at the same points as $f(x,0)$,
we have the boundary conditions:
$$\begin{aligned}
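As a concrete illustration of such a deviation (an editorial example, not part of the patch): any smooth function vanishing at both endpoints is admissible, for instance

$$\begin{aligned}
    \eta(x) = \sin\!\Big( \pi \frac{x - x_0}{x_1 - x_0} \Big)
    \qquad \implies \qquad
    \eta(x_0) = \eta(x_1) = 0
\end{aligned}$$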
@@ -50,16 +50,16 @@ $$\begin{aligned}
Given $L$, the goal is to find an equation for the optimal path $f(x,0)$.
Just like when finding the minimum of a real function,
the minimum $f$ of a functional $J[f]$ is a stationary point
-with respect to the deviation weight $\alpha$,
+with respect to the deviation weight $\varepsilon$,
a condition often written as $\delta J = 0$.
In the following, the integration limits have been omitted:
$$\begin{aligned}
0
&= \delta J
- = \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \int \pdv{L}{\alpha} \dd{x}
- = \int \pdv{L}{f} \pdv{f}{\alpha} + \pdv{L}{f'} \pdv{f'}{\alpha} \dd{x}
+ = \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \int \pdv{L}{\varepsilon} \dd{x}
+ = \int \pdv{L}{f} \pdv{f}{\varepsilon} + \pdv{L}{f'} \pdv{f'}{\varepsilon} \dd{x}
\\
&= \int \pdv{L}{f} \eta + \pdv{L}{f'} \eta' \dd{x}
= \Big[ \pdv{L}{f'} \eta \Big]_{x_0}^{x_1} + \int \pdv{L}{f} \eta - \frac{d}{dx} \Big( \pdv{L}{f'} \Big) \eta \dd{x}
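The boundary term vanishes because $\eta(x_0) = \eta(x_1) = 0$, and since $\eta$ is otherwise arbitrary, the remaining integrand must be zero, which yields the Euler-Lagrange equation. As a quick sanity check (a standard textbook example, not part of the patch), the arc-length Lagrangian $L = \sqrt{1 + (f')^2}$ gives:

$$\begin{aligned}
    0
    = \pdv{L}{f} - \frac{d}{dx} \Big( \pdv{L}{f'} \Big)
    = - \frac{d}{dx} \bigg( \frac{f'}{\sqrt{1 + (f')^2}} \bigg)
    = - \frac{f''}{\big( 1 + (f')^2 \big)^{3/2}}
\end{aligned}$$

So $f'' = 0$: the optimal path is a straight line, as expected for the shortest route between two points.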
@@ -99,16 +99,16 @@ In this case, every $f_n(x)$ has its own deviation $\eta_n(x)$,
satisfying $\eta_n(x_0) = \eta_n(x_1) = 0$:
$$\begin{aligned}
- f_n(x, \alpha) = f_n(x, 0) + \alpha \eta_n(x)
+ f_n(x, \varepsilon) = f_n(x, 0) + \varepsilon \eta_n(x)
\end{aligned}$$
The derivation procedure is identical to the case $N = 1$ from earlier:
$$\begin{aligned}
0
- &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \int \pdv{L}{\alpha} \dd{x}
- = \int \sum_{n} \Big( \pdv{L}{f_n} \pdv{f_n}{\alpha} + \pdv{L}{f_n'} \pdv{f_n'}{\alpha} \Big) \dd{x}
+ &= \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \int \pdv{L}{\varepsilon} \dd{x}
+ = \int \sum_{n} \Big( \pdv{L}{f_n} \pdv{f_n}{\varepsilon} + \pdv{L}{f_n'} \pdv{f_n'}{\varepsilon} \Big) \dd{x}
\\
&= \int \sum_{n} \Big( \pdv{L}{f_n} \eta_n + \pdv{L}{f_n'} \eta_n' \Big) \dd{x}
\\
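Integrating each $\eta_n'$ term by parts as before gives one Euler-Lagrange equation per function $f_n$. As an illustration (an editorial example, not part of the patch), for $N = 2$ with $L = \frac{1}{2} \big( (f_1')^2 + (f_2')^2 \big)$:

$$\begin{aligned}
    \pdv{L}{f_n} - \frac{d}{dx} \Big( \pdv{L}{f_n'} \Big) = 0
    \qquad \implies \qquad
    f_1'' = 0
    \qquad \mathrm{and} \qquad
    f_2'' = 0
\end{aligned}$$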
@@ -140,9 +140,9 @@ Once again, the derivation procedure is the same as before:
$$\begin{aligned}
0
- &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \int \pdv{L}{\alpha} \dd{x}
- = \int \pdv{L}{f} \pdv{f}{\alpha} + \sum_{n} \pdv{L}{f^{(n)}} \pdv{f^{(n)}}{\alpha} \dd{x}
+ &= \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \int \pdv{L}{\varepsilon} \dd{x}
+ = \int \pdv{L}{f} \pdv{f}{\varepsilon} + \sum_{n} \pdv{L}{f^{(n)}} \pdv{f^{(n)}}{\varepsilon} \dd{x}
\\
&= \int \pdv{L}{f} \eta + \sum_{n} \pdv{L}{f^{(n)}} \eta^{(n)} \dd{x}
\end{aligned}$$
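Each $\eta^{(n)}$ term can be integrated by parts $n$ times, moving all derivatives onto $\pdv{L}{f^{(n)}}$ at the cost of a factor $(-1)^n$, with the boundary terms vanishing under the analogous endpoint conditions on $\eta$ and its derivatives. For reference, the standard higher-order result (not shown in this hunk) is:

$$\begin{aligned}
    0 = \pdv{L}{f} + \sum_{n} (-1)^n \frac{d^n}{dx^n} \Big( \pdv{L}{f^{(n)}} \Big)
\end{aligned}$$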
@@ -187,17 +187,17 @@ $$\begin{aligned}
The arbitrary deviation $\eta$ is then also a function of multiple variables:
$$\begin{aligned}
- f(x, y; \alpha) = f(x, y; 0) + \alpha \eta(x, y)
+ f(x, y; \varepsilon) = f(x, y; 0) + \varepsilon \eta(x, y)
\end{aligned}$$
The derivation procedure starts in the exact same way as before:
$$\begin{aligned}
0
- &= \pdv{J}{\alpha} \Big|_{\alpha = 0}
- = \iint \pdv{L}{\alpha} \dd{x} \dd{y}
+ &= \pdv{J}{\varepsilon} \Big|_{\varepsilon = 0}
+ = \iint \pdv{L}{\varepsilon} \dd{x} \dd{y}
\\
- &= \iint \pdv{L}{f} \pdv{f}{\alpha} + \pdv{L}{f_x} \pdv{f_x}{\alpha} + \pdv{L}{f_y} \pdv{f_y}{\alpha} \dd{x} \dd{y}
+ &= \iint \pdv{L}{f} \pdv{f}{\varepsilon} + \pdv{L}{f_x} \pdv{f_x}{\varepsilon} + \pdv{L}{f_y} \pdv{f_y}{\varepsilon} \dd{x} \dd{y}
\\
&= \iint \pdv{L}{f} \eta + \pdv{L}{f_x} \eta_x + \pdv{L}{f_y} \eta_y \dd{x} \dd{y}
\end{aligned}$$
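Integrating the $\eta_x$ and $\eta_y$ terms by parts in $x$ and $y$ respectively, with $\eta$ vanishing on the boundary of the integration region, leads to the two-dimensional Euler-Lagrange equation. As a quick check (an editorial example, not part of the patch), the Lagrangian $L = \frac{1}{2} (f_x^2 + f_y^2)$ yields Laplace's equation:

$$\begin{aligned}
    0
    = \pdv{L}{f} - \frac{\partial}{\partial x} \Big( \pdv{L}{f_x} \Big) - \frac{\partial}{\partial y} \Big( \pdv{L}{f_y} \Big)
    = - f_{xx} - f_{yy}
\end{aligned}$$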