\section{Skew Matrix of Rotated Vector}

The problem comes from the fact that the cross product is invariant under rotation: for any rotation matrix $\textbf{R}$ and vectors $\textbf{v}, \textbf{w}$,

\begin{equation}\nonumber
\textbf{R}(\textbf{v}\times\textbf{w}) = (\textbf{Rv})\times(\textbf{Rw})
\end{equation}

The key identity we will prove is

\begin{equation}\nonumber
\textbf{R}\hat{\textbf{w}}\textbf{R}^T = \widehat{\textbf{Rw}}
\end{equation}

The operator $\hat{\cdot}$ is a linear map from an order 1 tensor (a vector) to an order 2 tensor (a matrix), defined as follows:

\begin{equation}\nonumber
[\hat{\textbf{w}}]_{ij} \triangleq - \boldsymbol{\epsilon}_{ijk}\textbf{w}_k
\end{equation}

where $\boldsymbol{\epsilon}_{ijk}$ is the order 3 \textit{Levi-Civita} permutation tensor. The hat operator can thus be viewed as a contraction between an order 3 tensor and an order 1 tensor, producing an order 2 tensor. Let us prove this identity first; afterwards we will show how it implies the original statement. We proceed by converting the left-hand side into the right-hand side using tensor algebra.
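As a sanity check (not part of the proof), the hat operator and its contraction form can be sketched numerically. This is a minimal sketch assuming NumPy is available:

```python
import numpy as np

# Order-3 Levi-Civita tensor, built from its defining permutation signs.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def hat(w):
    """Skew matrix of w: hat(w)_ij = -eps_ijk w_k."""
    return -np.einsum('ijk,k->ij', eps, w)

v = np.array([1.0, 2.0, 3.0])
w = np.array([-2.0, 0.5, 4.0])
# Applying hat(v) to w reproduces the cross product v x w.
assert np.allclose(hat(v) @ w, np.cross(v, w))
```

The `einsum` contraction mirrors the index expression $-\boldsymbol{\epsilon}_{ijk}\textbf{w}_k$ directly.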

\begin{equation}\nonumber
\begin{split}
[\textbf{R}\hat{\textbf{w}}\textbf{R}^T]_{ij} &= [\textbf{R}\hat{\textbf{w}}]_{ik}\textbf{R}^T_{kj} = [\textbf{R}\hat{\textbf{w}}]_{ik}\textbf{R}_{jk}\\
&= \textbf{R}_{il}\hat{\textbf{w}}_{lk}\textbf{R}_{jk} = \textbf{R}_{il}(- \boldsymbol{\epsilon}_{lkm}\textbf{w}_m)\textbf{R}_{jk}\\
&= - \boldsymbol{\epsilon}_{lkm}\textbf{R}_{il}\textbf{R}_{jk}\textbf{w}_m = - \boldsymbol{\epsilon}_{lkn}\boldsymbol{\delta}_{nm}\textbf{R}_{il}\textbf{R}_{jk}\textbf{w}_m\\
&= - \boldsymbol{\epsilon}_{lkn}\textbf{R}_{rn}\textbf{R}_{rm}\textbf{R}_{il}\textbf{R}_{jk}\textbf{w}_m\\
&= - \boldsymbol{\epsilon}_{lkn}\textbf{R}_{il}\textbf{R}_{jk}\textbf{R}_{rn}\textbf{R}_{rm}\textbf{w}_m
\end{split}
\end{equation}

In the last step we inserted $\boldsymbol{\delta}_{nm} = \textbf{R}_{rn}\textbf{R}_{rm}$, which follows from the orthogonality $\textbf{R}^T\textbf{R} = \textbf{I}$. Next, recall that the matrix determinant can also be phrased in terms of tensor algebra:

\begin{equation}\nonumber
\det(\textbf{A}) = \boldsymbol{\epsilon}_{ijk}\textbf{A}_{i1}\textbf{A}_{j2}\textbf{A}_{k3}
\end{equation}
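This determinant formula is straightforward to verify numerically; a minimal sketch, again assuming NumPy:

```python
import numpy as np

# Order-3 Levi-Civita tensor.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

A = np.array([[2.0, 1.0, 0.5],
              [0.0, 3.0, 1.0],
              [1.0, -1.0, 2.0]])
# Contract eps_ijk A_i1 A_j2 A_k3 over all three indices.
det_tensor = np.einsum('ijk,i,j,k->', eps, A[:, 0], A[:, 1], A[:, 2])
assert np.isclose(det_tensor, np.linalg.det(A))
```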

If we permute the indices \textit{1, 2, 3} above, we pick up an extra sign equal to the determinant of the corresponding permutation matrix, and if any index is repeated the result is zero. This behavior is exactly the defining property of the order 3 \textit{Levi-Civita} permutation tensor, which yields the more general identity $\boldsymbol{\epsilon}_{lmn}\textbf{A}_{il}\textbf{A}_{jm}\textbf{A}_{kn} = \det(\textbf{A})\,\boldsymbol{\epsilon}_{ijk}$. Then we have

\begin{equation}\nonumber
\begin{split}
[\textbf{R}\hat{\textbf{w}}\textbf{R}^T]_{ij} &= - \boldsymbol{\epsilon}_{lkn}\textbf{R}_{il}\textbf{R}_{jk}\textbf{R}_{rn}\textbf{R}_{rm}\textbf{w}_m\\
&= -\det(\textbf{R})\boldsymbol{\epsilon}_{ijr}\textbf{R}_{rm}\textbf{w}_m = - \boldsymbol{\epsilon}_{ijr}\textbf{R}_{rm}\textbf{w}_m\\
&= - \boldsymbol{\epsilon}_{ijr}[\textbf{Rw}]_r = [\widehat{\textbf{Rw}}]_{ij}
\end{split}
\end{equation}
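The identity just derived can be checked numerically for a random rotation. In this sketch (assuming NumPy), a QR factorization is used only as a convenient way to generate an orthogonal matrix, with the sign fixed so that $\det(\textbf{R}) = +1$:

```python
import numpy as np

def hat(w):
    """Skew matrix of w, written out explicitly."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q   # ensure det(R) = +1
w = rng.standard_normal(3)

# R hat(w) R^T equals hat(R w).
assert np.allclose(R @ hat(w) @ R.T, hat(R @ w))
```

Note that the sign fix matters: for an orthogonal matrix with $\det(\textbf{R}) = -1$ (a reflection), the derivation above shows the identity picks up a minus sign.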

where the second line uses $\det(\textbf{R}) = 1$ for a rotation matrix. Having proved this identity, the original statement follows easily:

\begin{equation}\nonumber
\begin{split}
(\textbf{Rv})\times(\textbf{Rw}) &= \widehat{\textbf{Rv}}\,\textbf{Rw}\\
&= \textbf{R}\hat{\textbf{v}}\textbf{R}^{T}\textbf{Rw} = \textbf{R}\hat{\textbf{v}}\textbf{w}\\
&= \textbf{R}(\textbf{v}\times\textbf{w})
\end{split}
\end{equation}
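The end-to-end invariance can also be confirmed directly; a minimal sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q   # proper rotation, det(R) = +1
v, w = rng.standard_normal(3), rng.standard_normal(3)

# Rotation invariance of the cross product: R(v x w) = (Rv) x (Rw).
assert np.allclose(R @ np.cross(v, w), np.cross(R @ v, R @ w))
```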