--- title: "Orthogonal curvilinear coordinates" sort_title: "Orthogonal curvilinear coordinates" date: 2023-05-29 # Originally 2021-03-03, major rewrite categories: - Mathematics - Physics layout: "concept" --- In a 3D coordinate system, the isosurface of a coordinate (i.e. the surface where that coordinate is constant while the others vary) is known as a **coordinate surface**, and the intersection line of two coordinates surfaces is called a **coordinate line**. A **curvilinear** coordinate system is one where at least one of the coordinate surfaces is curved: e.g. in cylindrical coordinates, the coordinate line of $$r$$ and $$z$$ is a circle. Here we limit ourselves to **orthogonal** systems, where the coordinate surfaces are always perpendicular. Examples of such orthogonal curvilinear systems are [spherical coordinates](/know/concept/spherical-coordinates/), [polar cylindrical coordinates](/know/concept/polar-cylindrical-coordinates/), [parabolic cylindrical coordinates](/know/concept/parabolic-cylindrical-coordinates/), and (trivially) [Cartesian coordinates](/know/concept/cartesian-coordinates/). ## Scale factors and basis vectors Given such a system with coordinates $$(c_1, c_2, c_3)$$. Their definition lets us convert all positions to classic Cartesian coordinates $$(x, y, z)$$ using functions $$x$$, $$y$$ and $$z$$: $$\begin{aligned} x &= x(c_1, c_2, c_3) \\ y &= y(c_1, c_2, c_3) \\ z &= z(c_1, c_2, c_3) \end{aligned}$$ A useful attribute of a coordinate system is its **line element** $$\dd{\vb{\ell}}$$, which represents the differential element of a line in any direction. Let $$\vu{e}_x$$, $$\vu{e}_y$$ and $$\vu{e}_z$$ be the Cartesian basis unit vectors: $$\begin{aligned} \dd{\vb{\ell}} \equiv \vu{e}_x \dd{x} + \: \vu{e}_y \dd{y} + \: \vu{e}_z \dd{z} \end{aligned}$$ The Cartesian differential elements can be rewritten in $$(c_1, c_2, c_3)$$ with the chain rule: $$\begin{aligned} \dd{\vb{\ell}} = \quad &\vu{e}_x \bigg( \pdv{x}{c_1} \dd{c_1} + \: \pdv{x}{c_2} \dd{c_2} + \: \pdv{x}{c_3} \dd{c_3} \!\bigg) \\ + \: &\vu{e}_y \bigg( \pdv{y}{c_1} \dd{c_1} + \: \pdv{y}{c_2} \dd{c_2} + \: \pdv{y}{c_3} \dd{c_3} \!\bigg) \\ + \: &\vu{e}_z \bigg( \pdv{z}{c_1} \dd{c_1} + \: \pdv{z}{c_2} \dd{c_2} + \: \pdv{z}{c_3} \dd{c_3} \!\bigg) \\ = \quad &\bigg( \pdv{x}{c_1} \vu{e}_x + \pdv{y}{c_1} \vu{e}_y + \pdv{z}{c_1} \vu{e}_z \bigg) \dd{c_1} \\ + &\bigg( \pdv{x}{c_2} \vu{e}_x + \pdv{y}{c_2} \vu{e}_y + \pdv{z}{c_2} \vu{e}_z \bigg) \dd{c_2} \\ + &\bigg( \pdv{x}{c_3} \vu{e}_x + \pdv{y}{c_3} \vu{e}_y + \pdv{z}{c_3} \vu{e}_z \bigg) \dd{c_3} \end{aligned}$$ From this we define the **scale factors** $$h_1$$, $$h_2$$ and $$h_3$$ and **local basis vectors** $$\vu{e}_1$$, $$\vu{e}_2$$ and $$\vu{e}_3$$: $$\begin{aligned} \boxed{ \begin{aligned} h_1 \vu{e}_1 &\equiv \pdv{x}{c_1} \vu{e}_x + \pdv{y}{c_1} \vu{e}_y + \pdv{z}{c_1} \vu{e}_z \\ h_2 \vu{e}_2 &\equiv \pdv{x}{c_2} \vu{e}_x + \pdv{y}{c_2} \vu{e}_y + \pdv{z}{c_2} \vu{e}_z \\ h_3 \vu{e}_3 &\equiv \pdv{x}{c_3} \vu{e}_x + \pdv{y}{c_3} \vu{e}_y + \pdv{z}{c_3} \vu{e}_z \end{aligned} } \end{aligned}$$ Where $$\vu{e}_1$$, $$\vu{e}_2$$ and $$\vu{e}_3$$ are normalized, and orthogonal for any orthogonal curvilinear system. They are called *local* basis vectors because they generally depend on $$(c_1, c_2, c_3)$$, i.e. their directions vary from position to position. 
Their definitions can also be inverted:

$$\begin{aligned}
\boxed{
\begin{aligned}
\vu{e}_x &\equiv \pdv{c_1}{x} h_1 \vu{e}_1 + \pdv{c_2}{x} h_2 \vu{e}_2 + \pdv{c_3}{x} h_3 \vu{e}_3
\\
\vu{e}_y &\equiv \pdv{c_1}{y} h_1 \vu{e}_1 + \pdv{c_2}{y} h_2 \vu{e}_2 + \pdv{c_3}{y} h_3 \vu{e}_3
\\
\vu{e}_z &\equiv \pdv{c_1}{z} h_1 \vu{e}_1 + \pdv{c_2}{z} h_2 \vu{e}_2 + \pdv{c_3}{z} h_3 \vu{e}_3
\end{aligned}
}
\end{aligned}$$

In the following subsections, we use the scale factors $$h_1$$, $$h_2$$ and $$h_3$$ to derive general formulae for converting vector calculus from Cartesian coordinates to $$(c_1, c_2, c_3)$$.

## Differential elements

The point of the scale factors $$h_1$$, $$h_2$$ and $$h_3$$, as can be seen from their derivation, is to correct for "distortions" of the coordinates compared to the Cartesian system, such that the line element $$\dd{\vb{\ell}}$$ retains its length. As was already established above:

$$\begin{aligned}
\boxed{
\dd{\vb{\ell}} = \vu{e}_1 h_1 \dd{c_1} + \: \vu{e}_2 h_2 \dd{c_2} + \: \vu{e}_3 h_3 \dd{c_3}
}
\end{aligned}$$

These terms are the differentials along each of the local basis vectors. Let us now introduce the following notation, e.g. for $$c_1$$:

$$\begin{aligned}
\dd{}_1\!\vb{x} \equiv \pdv{\vb{x}}{c_1} \dd{c_1} = \Big( \pdv{x}{c_1} \vu{e}_x + \pdv{y}{c_1} \vu{e}_y + \pdv{z}{c_1} \vu{e}_z \Big) \dd{c_1} = \vu{e}_1 h_1 \dd{c_1}
\end{aligned}$$

And likewise we define $$\dd{}_2\!\vb{x}$$ and $$\dd{}_3\!\vb{x}$$. All differential elements (as found in e.g. integrals) can be expressed in terms of $$\dd{}_1\!\vb{x}$$, $$\dd{}_2\!\vb{x}$$ and $$\dd{}_3\!\vb{x}$$. The differential normal vector element $$\dd{\vb{S}}$$ in a surface integral is hence given by:

$$\begin{aligned}
\dd{\vb{S}}
&= \dd{}_1\!\vb{x} \cross \dd{}_2\!\vb{x} + \dd{}_2\!\vb{x} \cross \dd{}_3\!\vb{x} + \dd{}_3\!\vb{x} \cross \dd{}_1\!\vb{x}
\\
&= (\vu{e}_1 \cross \vu{e}_2) \: h_1 h_2 \dd{c_1} \dd{c_2} + \: (\vu{e}_2 \cross \vu{e}_3) \: h_2 h_3 \dd{c_2} \dd{c_3} + \: (\vu{e}_3 \cross \vu{e}_1) \: h_1 h_3 \dd{c_1} \dd{c_3}
\end{aligned}$$

In an orthonormal basis we have $$\vu{e}_1 \cross \vu{e}_2 = \vu{e}_3$$, $$\vu{e}_2 \cross \vu{e}_3 = \vu{e}_1$$ and $$\vu{e}_3 \cross \vu{e}_1 = \vu{e}_2$$, so:

$$\begin{aligned}
\boxed{
\dd{\vb{S}} = \vu{e}_1 \: h_2 h_3 \dd{c_2} \dd{c_3} + \: \vu{e}_2 \: h_1 h_3 \dd{c_1} \dd{c_3} + \: \vu{e}_3 \: h_1 h_2 \dd{c_1} \dd{c_2}
}
\end{aligned}$$

Next, the differential volume $$\dd{V}$$ must also be corrected by the scale factors:

$$\begin{aligned}
\dd{V} = \dd{}_1\!\vb{x} \cross \dd{}_2\!\vb{x} \cdot \dd{}_3\!\vb{x} = (\vu{e}_1 \cross \vu{e}_2 \cdot \vu{e}_3) \: h_1 h_2 h_3 \dd{c_1} \dd{c_2} \dd{c_3}
\end{aligned}$$

Once again $$\vu{e}_1 \cross \vu{e}_2 = \vu{e}_3$$, so the vectors disappear from the expression, leaving:

$$\begin{aligned}
\boxed{
\dd{V} = h_1 h_2 h_3 \dd{c_1} \dd{c_2} \dd{c_3}
}
\end{aligned}$$

## Basis vector derivatives

Orthonormality tells us that $$\vu{e}_j \cdot \vu{e}_j = 1$$ for $$j = 1,2,3$$. Differentiating with respect to $$c_k$$:

$$\begin{aligned}
\pdv{}{c_k} (\vu{e}_j \cdot \vu{e}_j) = 2 \pdv{\vu{e}_j}{c_k} \cdot \vu{e}_j = \pdv{}{c_k} 1 = 0
\end{aligned}$$

This means that the $$c_k$$-derivative of $$\vu{e}_j$$ will always be orthogonal to $$\vu{e}_j$$, for all $$j$$ and $$k$$.
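For instance, in the cylindrical example from earlier, $$\vu{e}_r = \cos\varphi \: \vu{e}_x + \sin\varphi \: \vu{e}_y$$ and $$\vu{e}_\varphi = - \sin\varphi \: \vu{e}_x + \cos\varphi \: \vu{e}_y$$ depend on the position's $$\varphi$$-value, and differentiating them illustrates this orthogonality:

$$\begin{aligned}
\pdv{\vu{e}_r}{\varphi}
= - \sin\varphi \: \vu{e}_x + \cos\varphi \: \vu{e}_y
= \vu{e}_\varphi
\qquad \qquad
\pdv{\vu{e}_\varphi}{\varphi}
= - \cos\varphi \: \vu{e}_x - \sin\varphi \: \vu{e}_y
= - \vu{e}_r
\end{aligned}$$

All their other derivatives are zero in that particular system.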
Indeed, the general expression for the derivative of a local basis vector is: $$\begin{aligned} \boxed{ \pdv{\vu{e}_j}{c_k} = \frac{1}{h_j} \pdv{h_k}{c_j} \vu{e}_k - \delta_{jk} \sum_{l} \frac{1}{h_l} \pdv{h_j}{c_l} \vu{e}_l } \end{aligned}$$ Where $$\delta_{jk}$$ is the Kronecker delta. For example, if $$j = 1$$, writing this out gives: $$\begin{aligned} \pdv{\vu{e}_1}{c_1} &= - \frac{1}{h_2} \pdv{h_1}{c_2} \vu{e}_2 - \frac{1}{h_3} \pdv{h_1}{c_3} \vu{e}_3 \\ \pdv{\vu{e}_1}{c_2} &= \frac{1}{h_1} \pdv{h_2}{c_1} \vu{e}_2 \\ \pdv{\vu{e}_1}{c_3} &= \frac{1}{h_1} \pdv{h_3}{c_1} \vu{e}_3 \end{aligned}$$ {% include proof/start.html id="proof-deriv-basis" -%} In this proof we set $$j = 1$$ and $$k = 2$$ for clarity, but the approach is valid for any $$j \neq k$$. We know the definitions of $$h_1 \vu{e}_1$$ and $$h_2 \vu{e}_2$$, and that differentiations can be reordered: $$\begin{aligned} \pdv{}{c_2} (h_1 \vu{e}_1) &= \pdv{}{c_2} \pdv{}{c_1} \big( x \vu{e}_x + y \vu{e}_y + z \vu{e}_z \big) = \pdv{}{c_1} (h_2 \vu{e}_2) \end{aligned}$$ Expanding this according to the product rule of differentiation: $$\begin{aligned} \pdv{h_1}{c_2} \vu{e}_1 + h_1 \pdv{\vu{e}_1}{c_2} = \pdv{h_2}{c_1} \vu{e}_2 + h_2 \pdv{\vu{e}_2}{c_1} \end{aligned}$$ We rearrange this in two different ways. Indeed, these two equations are identical: $$\begin{aligned} h_1 \pdv{\vu{e}_1}{c_2} &= \pdv{h_2}{c_1} \vu{e}_2 + \Big( h_2 \pdv{\vu{e}_2}{c_1} - \pdv{h_1}{c_2} \vu{e}_1 \Big) \\ h_2 \pdv{\vu{e}_2}{c_1} &= \pdv{h_1}{c_2} \vu{e}_1 + \Big( h_1 \pdv{\vu{e}_1}{c_2} - \pdv{h_2}{c_1} \vu{e}_2 \Big) \end{aligned}$$ Recall that all derivatives of $$\vu{e}_j$$ are orthogonal to $$\vu{e}_j$$. Therefore, the first equation's right-hand side must be orthogonal to $$\vu{e}_1$$, and the second's to $$\vu{e}_2$$. We deduce that the parenthesized expressions are proportional to $$\vu{e}_3$$, and call the proportionality factors $$\lambda_{123}$$ and $$\lambda_{213}$$: $$\begin{aligned} h_1 \pdv{\vu{e}_1}{c_2} &= \pdv{h_2}{c_1} \vu{e}_2 + \lambda_{213} \vu{e}_3 \\ h_2 \pdv{\vu{e}_2}{c_1} &= \pdv{h_1}{c_2} \vu{e}_1 + \lambda_{123} \vu{e}_3 \end{aligned}$$ Since these equations are identical, by comparing the definition of $$\lambda_{123}$$ to the other side of the equation, we see that $$\lambda_{123} = \lambda_{213}$$: $$\begin{aligned} \lambda_{123} \vu{e}_3 &= h_1 \pdv{\vu{e}_1}{c_2} - \pdv{h_2}{c_1} \vu{e}_2 = h_2 \pdv{\vu{e}_2}{c_1} - \pdv{h_1}{c_2} \vu{e}_1 = \lambda_{213} \vu{e}_3 \end{aligned}$$ In general, $$\lambda_{jkl} = \lambda_{kjl}$$ for $$j \neq k \neq l$$. Next, we dot-multiply $$\lambda_{123}$$'s equation by $$\vu{e}_3$$, using that $$\vu{e}_2 \cdot \vu{e}_3 = 0$$ and consequently $$\ipdv{(\vu{e}_2 \cdot \vu{e}_3)}{c_1} = 0$$: $$\begin{aligned} \lambda_{123} &= h_2 \pdv{\vu{e}_2}{c_1} \cdot \vu{e}_3 - \pdv{h_1}{c_2} \vu{e}_1 \cdot \vu{e}_3 = h_2 \pdv{\vu{e}_2}{c_1} \cdot \vu{e}_3 = - h_2 \frac{h_3}{h_3} \pdv{\vu{e}_3}{c_1} \cdot \vu{e}_2 = - \frac{h_2}{h_3} \lambda_{132} \end{aligned}$$ In general, $$\lambda_{jkl} = - h_k \lambda_{jlk} / h_l$$ for $$j \neq k \neq l$$. Combining this fact with $$\lambda_{jkl} = \lambda_{kjl}$$ gives: $$\begin{aligned} \lambda_{jkl} = - \frac{h_k}{h_l} \lambda_{jlk} = - \frac{h_k}{h_l} \lambda_{ljk} = \frac{h_k}{h_l} \frac{h_j}{h_k} \lambda_{lkj} = \frac{h_j}{h_l} \lambda_{klj} = - \frac{h_j}{h_l} \frac{h_l}{h_j} \lambda_{kjl} = - \lambda_{jkl} \end{aligned}$$ But $$\lambda_{jkl} = -\lambda_{jkl}$$ is only possible if $$\lambda_{jkl}$$ is zero. 
Thus $$\lambda_{123}$$'s equation reduces to: $$\begin{aligned} h_2 \pdv{\vu{e}_2}{c_1} &= \pdv{h_1}{c_2} \vu{e}_1 \qquad \implies \qquad \pdv{\vu{e}_2}{c_1} = \frac{1}{h_2} \pdv{h_1}{c_2} \vu{e}_1 \end{aligned}$$ This gives us the general expression for $$\ipdv{\vu{e}_j}{c_k}$$ when $$j \neq k$$, but what about $$j = k$$? Well, from orthogonality we know: $$\begin{aligned} 0 = \vu{e}_2 \cdot \vu{e}_1 = \pdv{}{c_1} (\vu{e}_2 \cdot \vu{e}_1) = \pdv{\vu{e}_2}{c_1} \cdot \vu{e}_1 + \vu{e}_2 \cdot \pdv{\vu{e}_1}{c_1} \end{aligned}$$ We just calculated one of those terms, so this equation gives us the other: $$\begin{aligned} \vu{e}_2 \cdot \pdv{\vu{e}_1}{c_1} = - \pdv{\vu{e}_2}{c_1} \cdot \vu{e}_1 = - \frac{1}{h_2} \pdv{h_1}{c_2} \end{aligned}$$ Now we have the $$\vu{e}_2$$-component of $$\ipdv{\vu{e}_1}{c_1}$$, and can find the $$\vu{e}_3$$-component in the same way: $$\begin{aligned} \vu{e}_3 \cdot \pdv{\vu{e}_1}{c_1} = - \pdv{\vu{e}_3}{c_1} \cdot \vu{e}_1 = - \frac{1}{h_3} \pdv{h_1}{c_3} \end{aligned}$$ Adding up the $$\vu{e}_2$$- and $$\vu{e}_3$$-components gives the desired formula. There is no $$\vu{e}_1$$-component because $$\ipdv{\vu{e}_1}{c_1}$$ must be orthogonal to $$\vu{e}_1$$. {% include proof/end.html id="proof-deriv-basis" -%} ## Gradient of a scalar The gradient $$\nabla f$$ of a scalar field $$f$$ has the following components in $$(c_1, c_2, c_3)$$: $$\begin{aligned} \boxed{ (\nabla f)_j = \frac{1}{h_j} \pdv{f}{c_j} } \end{aligned}$$ When this index notation is written out in full, the gradient $$\nabla f$$ becomes: $$\begin{aligned} \nabla f = \frac{1}{h_1} \pdv{f}{c_1} \vu{e}_1 + \frac{1}{h_2} \pdv{f}{c_2} \vu{e}_2 + \frac{1}{h_3} \pdv{f}{c_3} \vu{e}_3 \end{aligned}$$ {% include proof/start.html id="proof-grad-scalar" -%} For any unit vector $$\vu{u}$$, we can project $$\nabla f$$ onto it to get the component of $$\nabla f$$ along $$\vu{u}$$. Let us choose $$\vu{u} = \vu{e}_1$$, then such a projection gives: $$\begin{aligned} \nabla f \cdot \vu{e}_1 &= \bigg( \pdv{f}{x} \vu{e}_x + \pdv{f}{y} \vu{e}_y + \pdv{f}{z} \vu{e}_z \bigg) \cdot \frac{1}{h_1} \bigg( \pdv{x}{c_1} \vu{e}_x + \pdv{y}{c_1} \vu{e}_y + \pdv{z}{c_1} \vu{e}_z \bigg) \\ &= \frac{1}{h_1} \bigg( \pdv{f}{x} \pdv{x}{c_1} + \pdv{f}{y} \pdv{y}{c_1} + \pdv{f}{z} \pdv{z}{c_1} \bigg) \\ &= \frac{1}{h_1} \pdv{f}{c_1} \end{aligned}$$ And we can do the same for $$\vu{e}_2$$ and $$\vu{e}_3$$, yielding analogous results: $$\begin{aligned} \nabla f \cdot \vu{e}_2 = \frac{1}{h_2} \pdv{f}{c_2} \qquad \qquad \nabla f \cdot \vu{e}_3 = \frac{1}{h_3} \pdv{f}{c_3} \end{aligned}$$ Finally, to express $$\nabla f$$ in the new coordinate system $$(c_1, c_2, c_3)$$, we simply combine these projections for all the basis vectors: $$\begin{aligned} \nabla f = (\nabla f \cdot \vu{e}_1) \vu{e}_1 + (\nabla f \cdot \vu{e}_2) \vu{e}_2 + (\nabla f \cdot \vu{e}_3) \vu{e}_3 \end{aligned}$$ {% include proof/end.html id="proof-grad-scalar" %} ## Divergence of a vector The divergence of a vector field $$\vb{V} = V_1 \vu{e}_1 + V_2 \vu{e}_2 + V_3 \vu{e}_3$$ is given in $$(c_1, c_2, c_3)$$ by: $$\begin{aligned} \boxed{ \nabla \cdot \vb{V} = \sum_{j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j} \bigg) } \end{aligned}$$ Where $$H \equiv h_1 h_2 h_3$$. 
When this index notation is written out in full, it becomes: $$\begin{aligned} \nabla \cdot \vb{V} = \frac{1}{h_1 h_2 h_3} \bigg( \pdv{}{c_1} (h_2 h_3 V_1) + \pdv{}{c_2} (h_1 h_3 V_2) + \pdv{}{c_3} (h_1 h_2 V_3) \bigg) \end{aligned}$$ {% include proof/start.html id="proof-div-vector-1" label="Proof 1" -%} From our earlier calculation of $$\nabla f$$, we know how to express the del $$\nabla$$ in $$(c_1, c_2, c_3)$$. Now we simply take the dot product of $$\nabla$$ and $$\vb{V}$$: $$\begin{aligned} \nabla \cdot \vb{V} &= \bigg( \vu{e}_1 \frac{1}{h_1} \pdv{}{c_1} + \vu{e}_2 \frac{1}{h_2} \pdv{}{c_2} + \vu{e}_3 \frac{1}{h_3} \pdv{}{c_3} \bigg) \cdot \bigg( V_1 \vu{e}_1 + V_2 \vu{e}_2 + V_3 \vu{e}_3 \bigg) \\ &= \bigg( \sum_{j} \vu{e}_j \frac{1}{h_j} \pdv{}{c_j} \bigg) \cdot \bigg( \sum_{k} V_k \vu{e}_k \bigg) \\ &= \sum_{jk} \vu{e}_j \cdot \frac{1}{h_j} \pdv{}{c_j} (V_k \vu{e}_k) \\ &= \sum_{jk} (\vu{e}_j \cdot \vu{e}_k) \frac{1}{h_j} \pdv{V_k}{c_j} + \sum_{jk} \Big( \vu{e}_j \cdot \pdv{\vu{e}_k}{c_j} \Big) \frac{V_k}{h_j} \end{aligned}$$ Substituting our expression for the derivatives of the local basis vectors, we find: $$\begin{aligned} \nabla \cdot \vb{V} &= \sum_{jk} (\vu{e}_j \cdot \vu{e}_k) \frac{1}{h_j} \pdv{V_k}{c_j} + \sum_{jk} \vu{e}_j \cdot \bigg( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{l} \frac{1}{h_l} \pdv{h_k}{c_l} \vu{e}_l \bigg) \frac{V_k}{h_j} \\ &= \sum_{jk} (\vu{e}_j \cdot \vu{e}_k) \frac{1}{h_j} \pdv{V_k}{c_j} + \sum_{jk} (\vu{e}_j \cdot \vu{e}_j) \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} - \sum_{jl} (\vu{e}_j \cdot \vu{e}_l) \frac{V_j}{h_j h_l} \pdv{h_j}{c_l} \\ &= \sum_{j} \frac{1}{h_j} \pdv{V_j}{c_j} + \sum_{jk} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} - \sum_{j} \frac{V_j}{h_j h_j} \pdv{h_j}{c_j} \\ &= \sum_{j} \frac{1}{h_j} \pdv{V_j}{c_j} + \sum_{j} \sum_{k \neq j} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} \\ \end{aligned}$$ Where we noticed that the latter two terms cancel out if $$k = j$$. Now, to proceed, it is easiest to just write out the index notation: $$\begin{aligned} \nabla \cdot \vb{V} &= \sum_{j} \frac{1}{h_j} \pdv{V_j}{c_j} + \sum_{j} \sum_{k \neq j} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} \\ &= \quad \frac{1}{h_1} \pdv{V_1}{c_1} + \frac{V_1}{h_1 h_2} \pdv{h_2}{c_1} + \frac{V_1}{h_1 h_3} \pdv{h_3}{c_1} \\ &\quad\: + \frac{V_2}{h_1 h_2} \pdv{h_1}{c_2} + \frac{1}{h_2} \pdv{V_2}{c_2} + \frac{V_2}{h_2 h_3} \pdv{h_3}{c_2} \\ &\quad\: + \frac{V_3}{h_1 h_3} \pdv{h_1}{c_3} + \frac{V_3}{h_2 h_3} \pdv{h_2}{c_3} + \frac{1}{h_3} \pdv{V_3}{c_3} \\ &= \frac{1}{h_1 h_2 h_3} \bigg( h_2 h_3 \pdv{V_1}{c_1} + h_3 V_1 \pdv{h_2}{c_1} + h_2 V_1 \pdv{h_3}{c_1} \\ &\qquad\qquad + h_3 V_2 \pdv{h_1}{c_2} + h_1 h_3 \pdv{V_2}{c_2} + h_1 V_2 \pdv{h_3}{c_2} \\ &\qquad\qquad + h_2 V_3 \pdv{h_1}{c_3} + h_1 V_3 \pdv{h_2}{c_3} + h_1 h_2 \pdv{V_3}{c_3} \bigg) \end{aligned}$$ Which can clearly be rewritten with the product rule, leading to the desired formula. 
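Explicitly, for the first line inside the parentheses (the other two lines regroup analogously for $$c_2$$ and $$c_3$$):

$$\begin{aligned}
h_2 h_3 \pdv{V_1}{c_1} + h_3 V_1 \pdv{h_2}{c_1} + h_2 V_1 \pdv{h_3}{c_1}
= \pdv{}{c_1} \big( h_2 h_3 V_1 \big)
\end{aligned}$$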
{% include proof/end.html id="proof-div-vector-1" label="Proof 1" %} Boas gives an alternative proof, which is shorter but more specialized: {% include proof/start.html id="proof-div-vector-2" label="Proof 2" -%} We take the divergence of the $$c_1$$-component of $$\vb{V}$$ and expand it: $$\begin{aligned} \nabla \cdot (V_1 \vu{e}_1) &= \nabla \cdot \bigg( \Big( h_2 h_3 V_1 \Big) \Big( \frac{\vu{e}_1}{h_2 h_3} \Big) \bigg) \\ &= \nabla (h_2 h_3 V_1) \cdot \Big( \frac{\vu{e}_1}{h_2 h_3} \Big) + (h_2 h_3 V_1) \bigg( \nabla \cdot \Big( \frac{\vu{e}_1}{h_2 h_3} \Big) \bigg) \end{aligned}$$ The latter term is zero, because in any orthogonal basis $$\vu{e}_2 \cross \vu{e}_3 = \vu{e}_1$$, and according to our gradient formula we have $$\nabla c_2 = \vu{e}_2 / h_2$$ etc., so: $$\begin{aligned} \nabla \cdot \bigg( \frac{\vu{e}_1}{h_2 h_3} \bigg) &= \nabla \cdot \bigg( \frac{\vu{e}_2}{h_2} \cross \frac{\vu{e}_3}{h_3} \bigg) \\ &= \nabla \cdot \big( \nabla c_2 \cross \nabla c_3 \big) \\ &= \nabla c_3 \cdot (\nabla \cross \nabla c_2) - \nabla c_2 \cdot (\nabla \cross \nabla c_3) \\ &= 0 \end{aligned}$$ Where we used a vector identity and the fact that the curl of a gradient must vanish. We are thus left with the former term, to which we apply our gradient formula again, where only the $$\vu{e}_1$$-term survives due to the dot product and orthogonality: $$\begin{aligned} \nabla \cdot (V_1 \vu{e}_1) &= \nabla (h_2 h_3 V_1) \cdot \frac{\vu{e}_1}{h_2 h_3} \\ &= \frac{1}{h_1 h_2 h_3} \pdv{}{c_1} (h_2 h_3 V_1) \end{aligned}$$ We then repeat this procedure for $$\vb{V}$$'s other components, and simply add up the results to get the desired formula. {% include proof/end.html id="proof-div-vector-2" label="Proof 2" %} ## Curl of a vector The curl of a vector field $$\vb{V}$$ has the following components in $$(c_1, c_2, c_3)$$, where $$\varepsilon_{j k l}$$ is the *Levi-Civita symbol* in 3D: $$\begin{aligned} \boxed{ \begin{aligned} (\nabla \cross \vb{V})_j = \sum_{k l} \frac{\varepsilon_{j k l}}{h_k h_l} \pdv{}{c_k} (h_l V_l) \end{aligned} } \end{aligned}$$ When this index notation is written out in full, the curl $$\nabla \cross \vb{V}$$ becomes: $$\begin{aligned} \nabla \times \vb{V} = \quad\: &\frac{1}{h_2 h_3} \bigg( \pdv{(h_3 V_3)}{c_2} - \pdv{(h_2 V_2)}{c_3} \bigg) \vu{e}_1 \\ + \: &\frac{1}{h_1 h_3} \bigg( \pdv{(h_1 V_1)}{c_3} - \pdv{(h_3 V_3)}{c_1} \bigg) \vu{e}_2 \\ + \: &\frac{1}{h_1 h_2} \bigg( \pdv{(h_2 V_2)}{c_1} - \pdv{(h_1 V_1)}{c_2} \bigg) \vu{e}_3 \end{aligned}$$ {% include proof/start.html id="proof-curl-vector-1" label="Proof 1" -%} From our earlier calculation of $$\nabla f$$, we know how to express the del $$\nabla$$ in $$(c_1, c_2, c_3)$$. 
Now we simply take the cross product of $$\nabla$$ and $$\vb{V}$$: $$\begin{aligned} \nabla \cross \vb{V} &= \bigg( \vu{e}_1 \frac{1}{h_1} \pdv{}{c_1} + \vu{e}_2 \frac{1}{h_2} \pdv{}{c_2} + \vu{e}_3 \frac{1}{h_3} \pdv{}{c_3} \bigg) \cross \bigg( V_1 \vu{e}_1 + V_2 \vu{e}_2 + V_3 \vu{e}_3 \bigg) \\ &= \bigg( \sum_{j} \vu{e}_j \frac{1}{h_j} \pdv{}{c_j} \bigg) \cross \bigg( \sum_{k} V_k \vu{e}_k \bigg) \\ &= \sum_{jk} \vu{e}_j \cross \frac{1}{h_j} \pdv{}{c_j} (V_k \vu{e}_k) \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} (\vu{e}_j \cross \vu{e}_k) + \sum_{jk} \frac{V_k}{h_j} \Big( \vu{e}_j \cross \pdv{\vu{e}_k}{c_j} \Big) \end{aligned}$$ Substituting our expression for the derivatives of the local basis vectors, we find: $$\begin{aligned} \nabla \cross \vb{V} &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} (\vu{e}_j \cross \vu{e}_k) + \sum_{jk} \frac{V_k}{h_j} \vu{e}_j \cross \bigg( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{l} \frac{1}{h_l} \pdv{h_k}{c_l} \vu{e}_l \bigg) \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} (\vu{e}_j \cross \vu{e}_k) + \sum_{jk} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} (\vu{e}_j \cross \vu{e}_j) - \sum_{jl} \frac{V_j}{h_j h_l} \pdv{h_j}{c_l} (\vu{e}_j \cross \vu{e}_l) \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} (\vu{e}_j \cross \vu{e}_k) - \sum_{jl} \frac{V_j}{h_j h_l} \pdv{h_j}{c_l} (\vu{e}_j \cross \vu{e}_l) \end{aligned}$$ Because the cross product of a vector with itself is always zero. Now, in an orthonormal basis we have $$\vu{e}_1 \cross \vu{e}_2 = \vu{e}_3$$, $$\vu{e}_2 \cross \vu{e}_3 = \vu{e}_1$$ and $$\vu{e}_3 \cross \vu{e}_1 = \vu{e}_2$$. This is written in index notation by summing over $$l$$ and multiplying by the Levi-Civita symbol $$\varepsilon_{jkl}$$: $$\begin{aligned} \nabla \cross \vb{V} &= \sum_{jk} \bigg( \frac{1}{h_j} \pdv{V_k}{c_j} - \frac{V_j}{h_j h_k} \pdv{h_j}{c_k} \bigg) (\vu{e}_j \cross \vu{e}_k) \\ &= \sum_{jkl} \varepsilon_{jkl} \bigg( \frac{1}{h_j} \pdv{V_k}{c_j} - \frac{V_j}{h_j h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_l \\ &= \sum_{jkl} \varepsilon_{jkl} \bigg( \frac{1}{h_j} \frac{h_k}{h_k} \pdv{V_k}{c_j} + \frac{V_k}{h_j h_k} \pdv{h_k}{c_j} \bigg) \vu{e}_l \\ &= \sum_{jkl} \varepsilon_{jkl} \bigg( \frac{1}{h_j h_k} \pdv{}{c_j} (h_k V_k) \bigg) \vu{e}_l \end{aligned}$$ Where we have used that $$\varepsilon_{jkl} = -\varepsilon_{kjl}$$, thereby arriving at the desired formula. {% include proof/end.html id="proof-curl-vector-1" label="Proof 1" %} Boas gives an alternative proof, which is shorter but more specialized: {% include proof/start.html id="proof-curl-vector-2" label="Proof 2" -%} We take the curl of the $$c_1$$-component of $$\vb{V}$$ and apply the product rule: $$\begin{aligned} \nabla \cross (V_1 \vu{e}_1) &= \nabla \cross \bigg( \Big( h_1 V_1 \Big) \Big( \frac{\vu{e}_1}{h_1} \Big) \bigg) \\ &= \nabla (h_1 V_1) \cross \Big( \frac{\vu{e}_1}{h_1} \Big) + (h_1 V_1) \Big( \nabla \cross \frac{\vu{e}_1}{h_1} \Big) \end{aligned}$$ The latter term disappears, because $$\nabla c_1 = \vu{e}_1 / h_1 $$ and the curl of a gradient is always zero. 
Applying our gradient formula to the remaining term, we find: $$\begin{aligned} \nabla \cross (V_1 \vu{e}_1) &= \nabla (h_1 V_1) \cross \Big( \frac{\vu{e}_1}{h_1} \Big) \\ &= \bigg( \frac{1}{h_1} \pdv{(h_1 V_1)}{c_1} \vu{e}_1 + \frac{1}{h_2} \pdv{(h_1 V_1)}{c_2} \vu{e}_2 + \frac{1}{h_3} \pdv{(h_1 V_1)}{c_3} \vu{e}_3 \bigg) \cross \Big( \frac{\vu{e}_1}{h_1} \Big) \\ &= 0 - \frac{1}{h_1 h_2} \pdv{(h_1 V_1)}{c_2} \vu{e}_3 + \frac{1}{h_1 h_3} \pdv{(h_1 V_1)}{c_3} \vu{e}_2 \end{aligned}$$ Where we have used the fact that $$\vu{e}_1$$, $$\vu{e}_2$$ and $$\vu{e}_3$$ are related to each other by cross products thanks to orthonormality, e.g. $$\vu{e}_2 \cross \vu{e}_3 = \vu{e}_1$$. We then repeat this procedure for $$\vb{V}$$'s other components, and simply add up the results to get the desired formula. {% include proof/end.html id="proof-curl-vector-2" label="Proof 2" %} ## Laplacian of a scalar The Laplacian $$\nabla^2 f$$ of a scalar field $$f$$ is calculated as follows in $$(c_1, c_2, c_3)$$: $$\begin{aligned} \boxed{ \nabla^2 f = \sum_{j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{f}{c_j} \bigg) } \end{aligned}$$ Where $$H \equiv h_1 h_2 h_3$$. When this index notation is written out in full, it becomes: $$\begin{aligned} \nabla^2 f = \frac{1}{h_1 h_2 h_3} \bigg( \pdv{}{c_1} \Big( \frac{h_2 h_3}{h_1} \pdv{f}{c_1} \Big) + \pdv{}{c_2} \Big( \frac{h_1 h_3}{h_2} \pdv{f}{c_2} \Big) + \pdv{}{c_3} \Big( \frac{h_1 h_2}{h_3} \pdv{f}{c_3} \Big) \bigg) \end{aligned}$$ This is trivial to prove: $$\nabla^2 f = \nabla \cdot (\nabla f)$$, so combining our previous formulae is enough. ## Gradient of a divergence The gradient of a divergence $$\nabla (\nabla \cdot \vb{V})$$ has the following components in $$(c_1, c_2, c_3)$$: $$\begin{aligned} \boxed{ \big( \nabla (\nabla \cdot \vb{V}) \big)_j = \frac{1}{h_j} \pdv{}{c_j} \bigg( \sum_{k} \frac{1}{H} \pdv{}{c_k} \Big( \frac{H V_k}{h_k} \Big) \bigg) } \end{aligned}$$ Where $$H \equiv h_1 h_2 h_3$$. This is trivial to prove: $$\nabla \cdot \vb{V}$$ is a scalar, which we insert into our gradient formula. We no longer write out the index notation, as the formulae become quite long. 
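As a quick sanity check of the formulae so far, we can insert the cylindrical scale factors $$h_r = h_z = 1$$ and $$h_\varphi = r$$ found earlier: this immediately reproduces the well-known expressions for the gradient, divergence and curl in cylindrical coordinates:

$$\begin{aligned}
\nabla f
&= \pdv{f}{r} \vu{e}_r + \frac{1}{r} \pdv{f}{\varphi} \vu{e}_\varphi + \pdv{f}{z} \vu{e}_z
\\
\nabla \cdot \vb{V}
&= \frac{1}{r} \pdv{}{r} \big( r V_r \big) + \frac{1}{r} \pdv{V_\varphi}{\varphi} + \pdv{V_z}{z}
\\
\nabla \cross \vb{V}
&= \bigg( \frac{1}{r} \pdv{V_z}{\varphi} - \pdv{V_\varphi}{z} \bigg) \vu{e}_r
+ \bigg( \pdv{V_r}{z} - \pdv{V_z}{r} \bigg) \vu{e}_\varphi
+ \frac{1}{r} \bigg( \pdv{}{r} \big( r V_\varphi \big) - \pdv{V_r}{\varphi} \bigg) \vu{e}_z
\end{aligned}$$

And combining these, e.g. via $$\nabla^2 f = \nabla \cdot (\nabla f)$$, likewise recovers the familiar cylindrical Laplacian.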
## Gradient of a vector

It is also possible to take the gradient of a vector $$\vb{V} = V_1 \vu{e}_1 + V_2 \vu{e}_2 + V_3 \vu{e}_3$$, yielding a 2nd-order tensor with the following components in $$(c_1, c_2, c_3)$$, where the second line assumes $$j \neq k$$:

$$\begin{aligned}
\boxed{
\begin{aligned}
(\nabla \vb{V})_{jj} &= \frac{1}{h_j} \pdv{V_j}{c_j} + \sum_{k \neq j} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k}
\\
(\nabla \vb{V})_{jk} &= \frac{1}{h_j} \pdv{V_k}{c_j} - \frac{V_j}{h_j h_k} \pdv{h_j}{c_k}
\end{aligned}
}
\end{aligned}$$

{% comment %}
When this index notation is written out in full, the gradient $$\nabla \vb{V}$$ becomes:

$$\begin{aligned}
\nabla \vb{V} = \quad \: &\bigg( \frac{1}{h_1} \pdv{V_1}{c_1} + \frac{V_2}{h_1 h_2} \pdv{h_1}{c_2} + \frac{V_3}{h_1 h_3} \pdv{h_1}{c_3} \bigg) \vu{e}_1 \vu{e}_1 \\
+ \: &\bigg( \frac{1}{h_1} \pdv{V_2}{c_1} - \frac{V_1}{h_1 h_2} \pdv{h_1}{c_2} \bigg) \vu{e}_1 \vu{e}_2 + \bigg( \frac{1}{h_1} \pdv{V_3}{c_1} - \frac{V_1}{h_1 h_3} \pdv{h_1}{c_3} \bigg) \vu{e}_1 \vu{e}_3 \\
+ \: &\bigg( \frac{1}{h_2} \pdv{V_2}{c_2} + \frac{V_1}{h_1 h_2} \pdv{h_2}{c_1} + \frac{V_3}{h_2 h_3} \pdv{h_2}{c_3} \bigg) \vu{e}_2 \vu{e}_2 \\
+ \: &\bigg( \frac{1}{h_2} \pdv{V_1}{c_2} - \frac{V_2}{h_1 h_2} \pdv{h_2}{c_1} \bigg) \vu{e}_2 \vu{e}_1 + \bigg( \frac{1}{h_2} \pdv{V_3}{c_2} - \frac{V_2}{h_2 h_3} \pdv{h_2}{c_3} \bigg) \vu{e}_2 \vu{e}_3 \\
+ \: &\bigg( \frac{1}{h_3} \pdv{V_3}{c_3} + \frac{V_1}{h_1 h_3} \pdv{h_3}{c_1} + \frac{V_2}{h_2 h_3} \pdv{h_3}{c_2} \bigg) \vu{e}_3 \vu{e}_3 \\
+ \: &\bigg( \frac{1}{h_3} \pdv{V_1}{c_3} - \frac{V_3}{h_1 h_3} \pdv{h_3}{c_1} \bigg) \vu{e}_3 \vu{e}_1 + \bigg( \frac{1}{h_3} \pdv{V_2}{c_3} - \frac{V_3}{h_2 h_3} \pdv{h_3}{c_2} \bigg) \vu{e}_3 \vu{e}_2
\end{aligned}$$
{% endcomment %}

{% include proof/start.html id="proof-grad-vector" -%}
From our earlier calculation of $$\nabla f$$, we know how to express the del $$\nabla$$ in $$(c_1, c_2, c_3)$$.
Now we simply take the dyadic product of $$\nabla$$ and $$\vb{V}$$: $$\begin{aligned} \nabla \vb{V} &= \bigg( \vu{e}_1 \frac{1}{h_1} \pdv{}{c_1} + \vu{e}_2 \frac{1}{h_2} \pdv{}{c_2} + \vu{e}_3 \frac{1}{h_3} \pdv{}{c_3} \bigg) \bigg( V_1 \vu{e}_1 + V_2 \vu{e}_2 + V_3 \vu{e}_3 \bigg) \\ &= \bigg( \sum_{j} \vu{e}_j \frac{1}{h_j} \pdv{}{c_j} \bigg) \bigg( \sum_{k} V_k \vu{e}_k \bigg) \\ &= \sum_{jk} \vu{e}_j \frac{1}{h_j} \pdv{}{c_j} (V_k \vu{e}_k) \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} \vu{e}_j \vu{e}_k + \sum_{jk} \frac{V_k}{h_j} \vu{e}_j \pdv{\vu{e}_k}{c_j} \end{aligned}$$ Substituting our expression for the derivatives of the local basis vectors, we find: $$\begin{aligned} \nabla \vb{V} &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} \vu{e}_j \vu{e}_k + \sum_{jk} \frac{V_k}{h_j} \vu{e}_j \bigg( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{l} \frac{1}{h_l} \pdv{h_k}{c_l} \vu{e}_l \bigg) \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} \vu{e}_j \vu{e}_k + \sum_{jk} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_j \vu{e}_j - \sum_{jl} \frac{V_j}{h_j h_l} \pdv{h_j}{c_l} \vu{e}_j \vu{e}_l \\ &= \sum_{jk} \bigg( \frac{1}{h_j} \pdv{V_k}{c_j} \vu{e}_j \vu{e}_k + \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_j \vu{e}_j - \frac{V_j}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_j \vu{e}_k \bigg) \end{aligned}$$ This is a 2nd-order tensor, whose diagonal components $$(\nabla \vb{V})_{ll}$$ are given by: $$\begin{aligned} (\nabla \vb{V})_{ll} &= \vu{e}_l \cdot (\nabla \vb{V}) \cdot \vu{e}_l \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} \delta_{jl} \delta_{kl} + \sum_{jk} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} \delta_{jl} \delta_{jl} - \sum_{jk} \frac{V_j}{h_j h_k} \pdv{h_j}{c_k} \delta_{jl} \delta_{kl} \\ &= \frac{1}{h_l} \pdv{V_l}{c_l} + \sum_{k} \frac{V_k}{h_l h_k} \pdv{h_l}{c_k} - \frac{V_l}{h_l h_l} \pdv{h_l}{c_l} \\ &= \frac{1}{h_l} \pdv{V_l}{c_l} + \sum_{k \neq l} \frac{V_k}{h_l h_k} \pdv{h_l}{c_k} \end{aligned}$$ Meanwhile, the off-diagonal components $$(\nabla \vb{V})_{lm}$$ are as follows, with $$l \neq m$$: $$\begin{aligned} (\nabla \vb{V})_{lm} &= \vu{e}_l \cdot (\nabla \vb{V}) \cdot \vu{e}_m \\ &= \sum_{jk} \frac{1}{h_j} \pdv{V_k}{c_j} \delta_{jl} \delta_{km} + \sum_{jk} \frac{V_k}{h_j h_k} \pdv{h_j}{c_k} \delta_{jl} \delta_{jm} - \sum_{jk} \frac{V_j}{h_j h_k} \pdv{h_j}{c_k} \delta_{jl} \delta_{km} \\ &= \frac{1}{h_l} \pdv{V_m}{c_l} + \sum_{k} \frac{V_k}{h_l h_k} \pdv{h_l}{c_k} \delta_{lm} - \frac{V_l}{h_l h_m} \pdv{h_l}{c_m} \\ &= \frac{1}{h_l} \pdv{V_m}{c_l} - \frac{V_l}{h_l h_m} \pdv{h_l}{c_m} \end{aligned}$$ {% include proof/end.html id="proof-grad-vector" %} ## Advection of a vector In physics, a common quantity is the *advection* $$(\vb{U} \cdot \nabla) \vb{V}$$ of a vector $$\vb{V}$$ according to a velocity field $$\vb{U}$$, as found in e.g. a [material derivative](/know/concept/material-derivative/). In $$(c_1, c_2, c_3)$$ its $$c_j$$-component is: $$\begin{aligned} \boxed{ \big( (\vb{U} \cdot \nabla) \vb{V} \big)_j = \sum_{k} \frac{U_k}{h_k} \pdv{V_j}{c_k} + \sum_{k \neq j} \frac{V_k}{h_j h_k} \bigg( U_j \pdv{h_j}{c_k} - U_k \pdv{h_k}{c_j} \bigg) } \end{aligned}$$ {% include proof/start.html id="proof-adv-vector" -%} From our earlier calculation of $$\nabla f$$, we know how to express the del $$\nabla$$ in $$(c_1, c_2, c_3)$$. 
Thanks to orthogonality, $$\vb{U} \cdot \nabla$$ is therefore simply: $$\begin{aligned} \vb{U} \cdot \nabla &= \bigg( U_1 \vu{e}_1 + U_2 \vu{e}_2 + U_3 \vu{e}_3 \bigg) \cdot \bigg( \vu{e}_1 \frac{1}{h_1} \pdv{}{c_1} + \vu{e}_2 \frac{1}{h_2} \pdv{}{c_2} + \vu{e}_3 \frac{1}{h_3} \pdv{}{c_3} \bigg) \\ &= \bigg( \sum_{j} U_j \vu{e}_j \bigg) \cdot \bigg( \sum_{k} \vu{e}_k \frac{1}{h_k} \pdv{}{c_k} \bigg) \\ &= \sum_{jk} (\vu{e}_j \cdot \vu{e}_k) \frac{U_j}{h_k} \pdv{}{c_k} \\ &= \sum_{j} \frac{U_j}{h_j} \pdv{}{c_j} \end{aligned}$$ We apply this to $$\vb{V}$$ and use the product rule of differentiation: $$\begin{aligned} (\vb{U} \cdot \nabla) \vb{V} &= \bigg( \sum_{j} \frac{U_j}{h_j} \pdv{}{c_j} \bigg) \bigg( \sum_{k} V_k \vu{e}_k \bigg) \\ &= \sum_{jk} \frac{U_j}{h_j} \pdv{}{c_j} (V_k \vu{e}_k) \\ &= \sum_{jk} \frac{U_j}{h_j} \bigg( \pdv{V_k}{c_j} \vu{e}_k + V_k \pdv{\vu{e}_k}{c_j} \bigg) \end{aligned}$$ Substituting our expression for the derivatives of the local basis vectors, we find: $$\begin{aligned} (\vb{U} \cdot \nabla) \vb{V} &= \sum_{jk} \frac{U_j}{h_j} \bigg( \pdv{V_k}{c_j} \vu{e}_k + \frac{V_k}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{l} \frac{V_k}{h_l} \pdv{h_k}{c_l} \vu{e}_l \bigg) \\ &= \sum_{jk} \frac{U_j}{h_j} \pdv{V_k}{c_j} \vu{e}_k + \sum_{jk} \frac{U_j V_k}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_j - \sum_{jl} \frac{U_j V_j}{h_j h_l} \pdv{h_j}{c_l} \vu{e}_l \end{aligned}$$ We rename the indices such that each term contains $$\vu{e}_j$$. Note that when $$k = j$$, the latter two terms cancel out, so we only need to sum for $$k \neq j$$: $$\begin{aligned} (\vb{U} \cdot \nabla) \vb{V} &= \sum_{jk} \frac{U_k}{h_k} \pdv{V_j}{c_k} \vu{e}_j + \sum_{jk} \frac{U_j V_k}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_j - \sum_{jk} \frac{U_k V_k}{h_j h_k} \pdv{h_k}{c_j} \vu{e}_j \\ &= \sum_{jk} \frac{U_k}{h_k} \pdv{V_j}{c_k} \vu{e}_j + \sum_{j} \sum_{k \neq j} \bigg( \frac{U_j V_k}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_j - \frac{U_k V_k}{h_j h_k} \pdv{h_k}{c_j} \vu{e}_j \bigg) \end{aligned}$$ Dot-multiplying by $$\vu{e}_j$$ isolates the $$c_j$$-component and gives the desired formula. {% include proof/end.html id="proof-adv-vector" %} ## Laplacian of a vector The Laplacian $$\nabla^2 \vb{V}$$ of a vector $$\vb{V}$$ has the following components in $$(c_1, c_2, c_3)$$: $$\begin{aligned} \boxed{ \begin{aligned} (\nabla^2 \vb{V})_j &= \sum_{k} \frac{1}{H} \pdv{}{c_k} \bigg( \frac{H}{h_k^2} \pdv{V_j}{c_k} \bigg) \\ &\quad\: + \sum_{k \neq j} \frac{1}{H} \bigg( \pdv{}{c_j} \Big( \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \Big) - \pdv{}{c_k} \Big( \frac{H V_k}{h_j h_k^2} \pdv{h_k}{c_j} \Big) \bigg) \\ &\quad\: + \sum_{k \neq j} \frac{1}{h_j h_k} \bigg( \frac{1}{h_j} \pdv{V_k}{c_j} \pdv{h_j}{c_k} - \frac{1}{h_k} \pdv{V_k}{c_k} \pdv{h_k}{c_j} \bigg) \\ &\quad\:- \sum_{k \neq j} \frac{1}{h_j h_k} \bigg( \frac{V_j}{h_j h_k} \Big( \pdv{h_j}{c_k} \Big)^2 + \sum_{l \neq k} \frac{V_l}{h_k h_l} \pdv{h_k}{c_l} \pdv{h_k}{c_j} \bigg) \end{aligned} } \end{aligned}$$ {% include proof/start.html id="proof-lap-vector" -%} We already know how to calculate the Laplacian $$\nabla^2 f$$ of a scalar. 
From that, we read out the $$\nabla^2$$-operator and apply it to a vector $$\vb{V}$$ instead: $$\begin{aligned} \nabla^2 \vb{V} &= \bigg( \sum_{j} \frac{1}{H} \pdv{}{c_j} \Big( \frac{H}{h_j^2} \pdv{}{c_j} \Big) \bigg) \bigg( \sum_{k} V_k \vu{e}_k \bigg) \\ &= \sum_{jk} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{}{c_j} (V_k \vu{e}_k) \bigg) \\ &= \sum_{jk} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \vu{e}_k + \frac{H V_k}{h_j^2} \pdv{\vu{e}_k}{c_j} \bigg) \\ &= \sum_{jk} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \vu{e}_k + \frac{H V_k}{h_j^2} \Big( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{l} \frac{1}{h_l} \pdv{h_k}{c_l} \vu{e}_l \Big) \bigg) \\ &= \sum_{j} \frac{1}{H} \pdv{}{c_j} \bigg( \sum_{k} \frac{H}{h_j^2} \pdv{V_k}{c_j} \vu{e}_k + \sum_{k} \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \vu{e}_j - \sum_{l} \frac{H V_j}{h_j^2 h_l} \pdv{h_j}{c_l} \vu{e}_l \bigg) \\ &= \sum_{j} \frac{1}{H} \pdv{}{c_j} \bigg( \sum_{k} \frac{H}{h_j^2} \pdv{V_k}{c_j} \vu{e}_k + \sum_{k \neq j} \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \vu{e}_j - \sum_{k \neq j} \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \vu{e}_k \bigg) \end{aligned}$$ Where we have noticed that the latter two terms cancel out if $$j = k$$. We expand according to the product rule of differentiation: $$\begin{aligned} \nabla^2 \vb{V} &= \sum_{jk} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \bigg) \vu{e}_k + \sum_{jk} \frac{1}{H} \frac{H}{h_j^2} \pdv{V_k}{c_j} \pdv{\vu{e}_k}{c_j} \\ &\quad\: + \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_j + \sum_{j} \sum_{k \neq j} \frac{1}{H} \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \pdv{\vu{e}_j}{c_j} \\ &\quad\: - \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_k - \sum_{j} \sum_{k \neq j} \frac{1}{H} \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \pdv{\vu{e}_k}{c_j} \end{aligned}$$ Substituting our expression for the derivatives of the local basis vectors, we find: $$\begin{aligned} \nabla^2 \vb{V} &= \sum_{jk} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \bigg) \vu{e}_k + \sum_{jk} \frac{1}{h_j^2} \pdv{V_k}{c_j} \bigg( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{l} \frac{1}{h_l} \pdv{h_k}{c_l} \vu{e}_l \bigg) \\ &\quad\: + \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_j - \sum_{j} \sum_{k \neq j} \frac{V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg( \sum_{l \neq j} \frac{1}{h_l} \pdv{h_j}{c_l} \vu{e}_l \bigg) \\ &\quad\: - \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_k - \sum_{j} \sum_{k \neq j} \frac{V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j \bigg) \\ &= \sum_{jk} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \bigg) \vu{e}_k + \sum_{jk} \frac{1}{h_j^2 h_k} \pdv{V_k}{c_j} \pdv{h_j}{c_k} \vu{e}_j - \sum_{jl} \frac{1}{h_j^2 h_l} \pdv{V_j}{c_j} \pdv{h_j}{c_l} \vu{e}_l \\ &\quad\: + \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_j - \sum_{j} \sum_{k \neq j} \sum_{l \neq j} \frac{V_k}{h_j^2 h_k h_l} \pdv{h_j}{c_k} \pdv{h_j}{c_l} \vu{e}_l \\ &\quad\: - \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_k - \sum_{j} \sum_{k \neq j} \frac{V_j}{h_j^2 h_k^2} \bigg( \pdv{h_j}{c_k} \bigg)^2 \vu{e}_j \\ &= \sum_{jk} 
\frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \bigg) \vu{e}_k + \sum_{j} \sum_{k \neq j} \frac{1}{h_j^2 h_k} \pdv{V_k}{c_j} \pdv{h_j}{c_k} \vu{e}_j - \sum_{j} \sum_{k \neq j} \frac{1}{h_j^2 h_k} \pdv{V_j}{c_j} \pdv{h_j}{c_k} \vu{e}_k \\
&\quad\: + \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_j - \sum_{j} \sum_{k \neq j} \sum_{l \neq j} \frac{V_k}{h_j^2 h_k h_l} \pdv{h_j}{c_k} \pdv{h_j}{c_l} \vu{e}_l \\
&\quad\: - \sum_{j} \sum_{k \neq j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_k - \sum_{j} \sum_{k \neq j} \frac{V_j}{h_j^2 h_k^2} \bigg( \pdv{h_j}{c_k} \bigg)^2 \vu{e}_j
\end{aligned}$$

Where we have once again noticed that the second and third terms cancel out if $$j = k$$. Next, we isolate the $$c_m$$-component by dot-multiplying with $$\vu{e}_m$$:

$$\begin{aligned}
(\nabla^2 \vb{V})_m
&= (\nabla^2 \vb{V}) \cdot \vu{e}_m \\
&= \sum_{jk} \frac{\delta_{km}}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_k}{c_j} \bigg) + \sum_{j} \sum_{k \neq j} \frac{\delta_{jm}}{h_j^2 h_k} \pdv{V_k}{c_j} \pdv{h_j}{c_k} - \sum_{j} \sum_{k \neq j} \frac{\delta_{km}}{h_j^2 h_k} \pdv{V_j}{c_j} \pdv{h_j}{c_k} \\
&\quad\: + \sum_{j} \sum_{k \neq j} \delta_{jm} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_k}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) - \sum_{j} \sum_{k \neq j} \sum_{l \neq j} \delta_{lm} \frac{V_k}{h_j^2 h_k h_l} \pdv{h_j}{c_k} \pdv{h_j}{c_l} \\
&\quad\: - \sum_{j} \sum_{k \neq j} \delta_{km} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j^2 h_k} \pdv{h_j}{c_k} \bigg) - \sum_{j} \sum_{k \neq j} \delta_{jm} \frac{V_j}{h_j^2 h_k^2} \bigg( \pdv{h_j}{c_k} \bigg)^2 \\
&= \sum_{j} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H}{h_j^2} \pdv{V_m}{c_j} \bigg) + \sum_{k \neq m} \frac{1}{h_k h_m^2} \pdv{V_k}{c_m} \pdv{h_m}{c_k} - \sum_{j \neq m} \frac{1}{h_j^2 h_m} \pdv{V_j}{c_j} \pdv{h_j}{c_m} \\
&\quad\: + \sum_{k \neq m} \frac{1}{H} \pdv{}{c_m} \bigg( \frac{H V_k}{h_m^2 h_k} \pdv{h_m}{c_k} \bigg) - \sum_{j \neq m} \sum_{k \neq j} \frac{V_k}{h_j^2 h_k h_m} \pdv{h_j}{c_k} \pdv{h_j}{c_m} \\
&\quad\: - \sum_{j \neq m} \frac{1}{H} \pdv{}{c_j} \bigg( \frac{H V_j}{h_j^2 h_m} \pdv{h_j}{c_m} \bigg) - \sum_{k \neq m} \frac{V_m}{h_m^2 h_k^2} \bigg( \pdv{h_m}{c_k} \bigg)^2
\end{aligned}$$

Which gives the desired formula after some simple index renaming and rearranging.
{% include proof/end.html id="proof-lap-vector" %}

## Divergence of a tensor

It is also possible to take the divergence of a 2nd-order tensor $$\overline{\overline{\mathbf{T}}}$$, yielding a vector with these components in $$(c_1, c_2, c_3)$$:

$$\begin{aligned}
\boxed{
(\nabla \cdot \overline{\overline{\mathbf{T}}})_j
= \sum_{k} \frac{1}{h_k} \pdv{T_{kj}}{c_k} + \sum_{k \neq j} \frac{T_{jk}}{h_j h_k} \pdv{h_j}{c_k} - \sum_{k \neq j} \frac{T_{kk}}{h_j h_k} \pdv{h_k}{c_j} + \sum_{k} \sum_{l \neq k} \frac{T_{lj}}{h_k h_l} \pdv{h_k}{c_l}
}
\end{aligned}$$

{% include proof/start.html id="proof-div-tensor" -%}
From our earlier calculation of $$\nabla f$$, we know how to express the del $$\nabla$$ in $$(c_1, c_2, c_3)$$.
Now we simply take the dot product and evaluate: $$\begin{aligned} \nabla \cdot \overline{\overline{\mathbf{T}}} &= \bigg( \vu{e}_1 \frac{1}{h_1} \pdv{}{c_1} + \vu{e}_2 \frac{1}{h_2} \pdv{}{c_2} + \vu{e}_3 \frac{1}{h_3} \pdv{}{c_3} \bigg) \\ &\quad\:\:\: \cdot \Big( T_{11} \vu{e}_1 \vu{e}_1 + T_{12} \vu{e}_1 \vu{e}_2 + T_{13} \vu{e}_1 \vu{e}_3 \\ &\qquad + T_{21} \vu{e}_2 \vu{e}_1 + T_{22} \vu{e}_2 \vu{e}_2 + T_{23} \vu{e}_2 \vu{e}_3 \\ &\qquad + T_{31} \vu{e}_3 \vu{e}_1 + T_{32} \vu{e}_3 \vu{e}_2 + T_{33} \vu{e}_3 \vu{e}_3 \Big) \\ &= \bigg( \sum_{j} \vu{e}_j \frac{1}{h_j} \pdv{}{c_j} \bigg) \cdot \bigg( \sum_{kl} T_{kl} \vu{e}_k \vu{e}_l \bigg) \\ &= \sum_{jkl} \vu{e}_j \cdot \frac{1}{h_j} \pdv{}{c_j} (T_{kl} \vu{e}_k \vu{e}_l) \end{aligned}$$ We apply the product rule of differentiation and use that $$\vb{c} \cdot (\vb{a} \vb{b}) = (\vb{c} \cdot \vb{a}) \vb{b}$$: $$\begin{aligned} \nabla \cdot \overline{\overline{\mathbf{T}}} &= \sum_{jkl} \bigg( (\vu{e}_j \cdot \vu{e}_k) \frac{1}{h_j} \pdv{T_{kl}}{c_j} \vu{e}_l + (\vu{e}_j \cdot \vu{e}_k) \frac{T_{kl}}{h_j} \pdv{\vu{e}_l}{c_j} + \Big( \vu{e}_j \cdot \pdv{\vu{e}_k}{c_j} \Big) \frac{T_{kl}}{h_j} \vu{e}_l \bigg) \\ &= \sum_{jkl} \bigg( \delta_{jk} \frac{1}{h_j} \pdv{T_{kl}}{c_j} \vu{e}_l + \delta_{jk} \frac{T_{kl}}{h_j} \pdv{\vu{e}_l}{c_j} + \Big( \vu{e}_j \cdot \pdv{\vu{e}_k}{c_j} \Big) \frac{T_{kl}}{h_j} \vu{e}_l \bigg) \\ &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j} \pdv{\vu{e}_l}{c_j} + \sum_{k} \Big( \vu{e}_j \cdot \pdv{\vu{e}_k}{c_j} \Big) \frac{T_{kl}}{h_j} \vu{e}_l \bigg) \end{aligned}$$ Inserting our expressions for the derivatives of the basis vectors in the last term, we find: $$\begin{aligned} \nabla \cdot \overline{\overline{\mathbf{T}}} &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j} \pdv{\vu{e}_l}{c_j} + \sum_{k} \vu{e}_j \cdot \Big( \frac{1}{h_k} \pdv{h_j}{c_k} \vu{e}_j - \delta_{jk} \sum_{m} \frac{1}{h_m} \pdv{h_k}{c_m} \vu{e}_m \Big) \frac{T_{kl}}{h_j} \vu{e}_l \bigg) \\ &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j} \pdv{\vu{e}_l}{c_j} + \sum_{k} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_l - \sum_{m} (\vu{e}_j \cdot \vu{e}_m) \frac{T_{jl}}{h_j h_m} \pdv{h_j}{c_m} \vu{e}_l \bigg) \\ &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j} \pdv{\vu{e}_l}{c_j} + \sum_{k} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_l - \frac{T_{jl}}{h_j h_j} \pdv{h_j}{c_j} \vu{e}_l \bigg) \\ &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j} \pdv{\vu{e}_l}{c_j} + \sum_{k \neq j} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_l \bigg) \end{aligned}$$ Where we noticed that the latter two terms cancel out if $$k = j$$. 
Next, rewriting $$\ipdv{\vu{e}_l}{c_j}$$: $$\begin{aligned} \nabla \cdot \overline{\overline{\mathbf{T}}} &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j} \Big( \frac{1}{h_l} \pdv{h_j}{c_l} \vu{e}_j - \delta_{jl} \sum_{m} \frac{1}{h_m} \pdv{h_l}{c_m} \vu{e}_m \Big) + \sum_{k \neq j} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_l \bigg) \\ &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \frac{T_{jl}}{h_j h_l} \pdv{h_j}{c_l} \vu{e}_j - \delta_{jl} \sum_{m} \frac{T_{jl}}{h_j h_m} \pdv{h_l}{c_m} \vu{e}_m + \sum_{k \neq j} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_l \bigg) \\ &= \sum_{jl} \frac{1}{h_j} \pdv{T_{jl}}{c_j} \vu{e}_l + \sum_{jl} \frac{T_{jl}}{h_j h_l} \pdv{h_j}{c_l} \vu{e}_j - \sum_{jm} \frac{T_{jj}}{h_j h_m} \pdv{h_j}{c_m} \vu{e}_m + \sum_{jl} \sum_{k \neq j} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \vu{e}_l \end{aligned}$$ Renaming the indices such that each term contains $$\vu{e}_l$$, we arrive at the full result: $$\begin{aligned} \nabla \cdot \overline{\overline{\mathbf{T}}} &= \sum_{jl} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} + \frac{T_{lj}}{h_j h_l} \pdv{h_l}{c_j} - \frac{T_{jj}}{h_j h_l} \pdv{h_j}{c_l} + \sum_{k \neq j} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \bigg) \vu{e}_l \end{aligned}$$ To isolate the $$c_m$$-component, we dot-multiply by $$\vu{e}_m$$ and resolve the Kronecker delta $$\delta_{lm}$$: $$\begin{aligned} (\nabla \cdot \overline{\overline{\mathbf{T}}})_m &= (\nabla \cdot \overline{\overline{\mathbf{T}}}) \cdot \vu{e}_m \\ &= \sum_{jl} \delta_{lm} \bigg( \frac{1}{h_j} \pdv{T_{jl}}{c_j} + \frac{T_{lj}}{h_j h_l} \pdv{h_l}{c_j} - \frac{T_{jj}}{h_j h_l} \pdv{h_j}{c_l} + \sum_{k \neq j} \frac{T_{kl}}{h_j h_k} \pdv{h_j}{c_k} \bigg) \\ &= \sum_{j} \frac{1}{h_j} \pdv{T_{jm}}{c_j} + \sum_{j} \frac{T_{mj}}{h_j h_m} \pdv{h_m}{c_j} - \sum_{j} \frac{T_{jj}}{h_j h_m} \pdv{h_j}{c_m} + \sum_{j} \sum_{k \neq j} \frac{T_{km}}{h_j h_k} \pdv{h_j}{c_k} \end{aligned}$$ The second and third terms cancel out for $$j = m$$, so we can sum over $$j \neq m$$ instead. {% include proof/end.html id="proof-div-tensor" %} ## References 1. M.L. Boas, *Mathematical methods in the physical sciences*, 2nd edition, Wiley. 2. B. Lautrup, *Physics of continuous matter: exotic and everyday phenomena in the macroscopic world*, 2nd edition, CRC Press. 3. B. Lautrup, [Orthogonal curvilinear coordinates](https://library.wolfram.com/infocenter/MathSource/5545/vectoranalysis.pdf), 2004, unpublished.