\subsection{Kronecker \texorpdfstring{\( \delta \)}{𝛿} and Levi-Civita \texorpdfstring{\( \varepsilon \)}{𝜀}}
The Kronecker \(\delta\) is defined by
\[
	\delta_{ij} = \begin{cases}
		1 & \text{if } i = j    \\
		0 & \text{if } i \neq j
	\end{cases}
\]
Then \(\vb e_i \cdot \vb e_j = \delta_{ij}\) for the orthonormal basis vectors \(\vb e_i\).
We can also use \(\delta\) to relabel indices: \(\sum_i \delta_{ij} \vb a_i = \vb a_j\).
So
\begin{align*}
	\vb a \cdot \vb b & = \left( \sum_i \vb a_i \vb e_i \right) \cdot \left( \sum_j \vb b_j \vb e_j \right) \\
	                  & = \sum_{ij} \vb a_i \vb b_j (\vb e_i \cdot \vb e_j)                                 \\
	                  & = \sum_{ij} \vb a_i \vb b_j \delta_{ij}                                             \\
	                  & = \sum_i \vb a_i \vb b_i
\end{align*}

The Levi-Civita \(\varepsilon\) is defined by
\[
	\varepsilon_{ijk} = \begin{cases}
		+1 & \text{if } ijk \text{ is an even permutation of } [1, 2, 3] \\
		-1 & \text{if } ijk \text{ is an odd permutation of } [1, 2, 3]  \\
		0  & \text{otherwise}
	\end{cases}
\]
Then
\begin{align*}
	\varepsilon_{123} = \varepsilon_{231} = \varepsilon_{312} & = +1 \\
	\varepsilon_{132} = \varepsilon_{321} = \varepsilon_{213} & = -1
\end{align*}
and all other index combinations (those with a repeated index) yield 0.
This shows that \(\varepsilon\) is totally antisymmetric; exchanging any pair of indices changes the sign.
We now have:
\begin{align*}
	\vb e_i \times \vb e_j & = \sum_k \varepsilon_{ijk} \vb e_k                                                   \\
	\intertext{And:}
	\vb a \times \vb b     & = \left( \sum_i \vb a_i \vb e_i \right) \times \left( \sum_j \vb b_j \vb e_j \right) \\
	                       & = \sum_{ij} \vb a_i \vb b_j \left( \vb e_i \times \vb e_j \right)                    \\
	                       & = \sum_{ijk} \vb a_i \vb b_j \varepsilon_{ijk} \vb e_k
\end{align*}
So the components of the cross product can be written
\[
	(\vb a \times \vb b)_k = \sum_{ij} \vb a_i \vb b_j \varepsilon_{ijk}
\]
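As a quick numerical sanity check of this formula (a sketch assuming NumPy, which is not part of the course material), one can build the \(\varepsilon\) tensor explicitly and compare against a library cross product:

```python
import numpy as np

# Build the rank-3 Levi-Civita tensor: +1 for even permutations of (0, 1, 2),
# -1 for odd permutations, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = +1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# (a x b)_k = sum_{ij} eps_{ijk} a_i b_j, written as an einsum contraction
cross = np.einsum('ijk,i,j->k', eps, a, b)
print(cross)            # matches np.cross(a, b)
print(np.cross(a, b))
```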

We use the `summation convention' to abbreviate the many summation symbols used throughout linear algebra.
\begin{enumerate}
	\item An index which occurs exactly once in some term, called a `free' index, must appear once in every term in that equation.
	\item An index which occurs exactly twice in a given term, called a `repeated', `contracted', or `dummy' index, is implicitly summed over.
	\item No index can occur more than twice in a given term.
\end{enumerate}
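For example, with this convention the earlier results read
\[
	\vb a \cdot \vb b = \vb a_i \vb b_i, \qquad (\vb a \times \vb b)_k = \varepsilon_{ijk} \vb a_i \vb b_j
\]
where \(i\) and \(j\) are dummy indices and \(k\) is a free index appearing once in every term.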

\subsection{Identities}
The most general \(\varepsilon\varepsilon\) identity is as follows:
\begin{align*}
	\varepsilon_{ijk} \varepsilon_{pqr}
	 & = \delta_{ip}\delta_{jq}\delta_{kr} - \delta_{jp}\delta_{iq}\delta_{kr} \\
	 & + \delta_{jp}\delta_{kq}\delta_{ir} - \delta_{kp}\delta_{jq}\delta_{ir} \\
	 & + \delta_{kp}\delta_{iq}\delta_{jr} - \delta_{ip}\delta_{kq}\delta_{jr}
\end{align*}
This is, however, very verbose and not often used in the course.
It can be proved by noting that both sides are totally antisymmetric in \(i,j,k\) and in \(p,q,r\), so they must agree up to a constant factor.
We can check that this factor is 1 by substituting in values such as \(i=p=1\), \(j=q=2\) and \(k=r=3\).
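The substitution check can also be exhausted by machine; the following sketch (again assuming NumPy) verifies the six-term expansion for every choice of the six indices:

```python
import numpy as np
from itertools import product

# Levi-Civita tensor and Kronecker delta in three dimensions
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = +1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1
d = np.eye(3)

# Check eps_{ijk} eps_{pqr} against the six-term delta expansion
# for every one of the 3**6 index choices.
for i, j, k, p, q, r in product(range(3), repeat=6):
    rhs = (d[i, p] * d[j, q] * d[k, r] - d[j, p] * d[i, q] * d[k, r]
         + d[j, p] * d[k, q] * d[i, r] - d[k, p] * d[j, q] * d[i, r]
         + d[k, p] * d[i, q] * d[j, r] - d[i, p] * d[k, q] * d[j, r])
    assert eps[i, j, k] * eps[p, q, r] == rhs
print("identity verified for all index choices")
```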

The next most general form is a very useful identity.
\[
	\varepsilon_{ijk}\varepsilon_{pqk} = \delta_{ip}\delta_{jq} - \delta_{iq}\delta_{jp}
\]
This is the above identity with \(r\) set equal to \(k\) and summed over; the six terms collapse to the two shown.
We can prove this is true by observing the antisymmetry, and that both sides vanish under \(i=j\) or \(p=q\).
So it suffices to check two cases: \(i=p, j=q\) and \(i=q, j=p\).
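As a check of these two cases, take \(i=p=1\), \(j=q=2\), and then \(i=q=1\), \(j=p=2\):
\begin{align*}
	\varepsilon_{12k}\varepsilon_{12k} & = \varepsilon_{123}^2 = +1 = \delta_{11}\delta_{22} - \delta_{12}\delta_{21}                \\
	\varepsilon_{12k}\varepsilon_{21k} & = \varepsilon_{123}\varepsilon_{213} = -1 = \delta_{12}\delta_{21} - \delta_{11}\delta_{22}
\end{align*}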

We can now continue making more indices equal to each other to get even more specific identities:
\[
	\varepsilon_{ijk}\varepsilon_{pjk} = 2\delta_{ip}
\]
This is easy to prove by noting that \(\delta_{jj} = \sum_j \delta_{jj} = 3\), and using the \(\delta\) rewrite rule.
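Explicitly, setting \(q = j\) in the previous identity and summing,
\begin{align*}
	\varepsilon_{ijk}\varepsilon_{pjk} & = \delta_{ip}\delta_{jj} - \delta_{ij}\delta_{jp} \\
	                                   & = 3\delta_{ip} - \delta_{ip}                      \\
	                                   & = 2\delta_{ip}
\end{align*}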

Finally, we have
\[
	\varepsilon_{ijk}\varepsilon_{ijk} = 6
\]
No indices are free here; contracting the previous identity once more gives \(\varepsilon_{ijk}\varepsilon_{ijk} = 2\delta_{ii} = 6\), which simply counts the \(3! = 6\) nonzero permutations.

Using the summation convention (as will now be implied for the remainder of the course), we can prove the vector triple product identity
\begin{align*}
	\left[ \vb a \times (\vb b \times \vb c) \right]_i
	 & = \varepsilon_{ijk} \vb a_j (\vb b \times \vb c)_k                                                  \\
	 & = \varepsilon_{ijk} \vb a_j \varepsilon_{pqk} \vb b_p \vb c_q                                       \\
	 & = \varepsilon_{ijk}\varepsilon_{pqk} \vb a_j \vb b_p \vb c_q                                        \\
	 & = (\delta_{ip}\delta_{jq})\vb a_j \vb b_p \vb c_q - (\delta_{iq}\delta_{jp})\vb a_j \vb b_p \vb c_q \\
	 & = \vb a_q \vb b_i \vb c_q - \vb a_p \vb b_p \vb c_i                                                 \\
	 & = (\vb a \cdot \vb c) \vb b_i - (\vb a \cdot \vb b) \vb c_i
\end{align*}
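A quick numerical spot check of this identity on random vectors (a sketch assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = rng.standard_normal((3, 3))

# a x (b x c) should equal (a . c) b - (a . b) c
lhs = np.cross(a, np.cross(b, c))
rhs = np.dot(a, c) * b - np.dot(a, b) * c
print(np.allclose(lhs, rhs))   # True
```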
