\chapter{Dual space and trace}
You may have learned in high school that given a matrix
\[
\begin{bmatrix}
a & c \\
b & d
\end{bmatrix}
\]
the trace is the sum along the diagonal, $a+d$,
and the determinant is $ad-bc$.
But we know that a matrix is somehow
just encoding a linear map using a choice of basis.
Why would these random formulas somehow not
depend on the choice of a basis?

In this chapter, we are going to
give an intrinsic definition of $\Tr T$,
where $T \colon V \to V$ and $\dim V < \infty$.
This will give a coordinate-free definition
which will in particular imply the trace $a+d$
doesn't change if we take a different basis.

In doing so, we will introduce two new constructions:
the \emph{tensor product} $V \otimes W$
(which is a sort of product of two spaces,
with dimension $\dim V \cdot \dim W$)
and the \emph{dual space} $V^\vee$,
which is the set of linear maps $V \to k$ (a $k$-vector space).
Later on, when we upgrade from a vector space $V$
to an inner product space,
we will see that the dual space $V^\vee$ gives a nice
interpretation of the ``transpose'' of a matrix.
You'll already see some of that come through here.

The trace is only defined for finite-dimensional
vector spaces, so if you want you can restrict
your attention to finite-dimensional vector spaces for this chapter.
(On the other hand we do not need the
ground field to be algebraically closed.)
The next chapter will then do the same for the determinant.

\section{Tensor product}
\prototype{$\RR[x] \otimes \RR[y] = \RR[x,y]$.}
We know that $\dim (V \oplus W) = \dim V + \dim W$,
even though as sets $V \oplus W$ looks like $V \times W$.
What if we wanted a real ``product'' of spaces,
with multiplication of dimensions?

For example, let's pull out
my favorite example of a real vector space, namely
\[ V = \left\{ ax^2 + bx + c \mid a,b,c \in \RR \right\}. \]
Here's another space, a little smaller:
\[ W = \left\{ dy + e \mid d,e \in \RR \right\}. \]
If we take the direct sum, then we would get some rather unnatural
vector space of dimension five
(whose elements can be thought of as pairs $(ax^2+bx+c,dy+e)$).
But suppose we want a vector space
whose elements are \emph{products} of polynomials in $V$ and $W$;
it would contain elements like $4x^2y + 5xy + y + 3$.
In particular, the basis would be
\[ \left\{ x^2y, x^2, xy, x, y, 1 \right\} \]
and thus have dimension six.

For this we resort to the \emph{tensor product}.
It does exactly this, except that the ``multiplication''
is done by a scary\footnote{%
Seriously, $\otimes$ looks \emph{terrifying} to non-mathematicians,
and even to many math undergraduates.}
symbol $\otimes$:
think of it as a ``wall'' that separates the elements
between the two vector spaces.
For example, the above example might be written as
\[ 4x^2 \otimes y + 5x \otimes y + 1 \otimes y + 3 \otimes 1. \]
(This should be read as $(4x^2 \otimes y) + (5x \otimes y) + \dots$;
addition comes after $\otimes$.)

Of course there should be no distinction
between writing $4x^2 \otimes y$ and $x^2 \otimes 4y$
or even $2x^2 \otimes 2y$.
While we want to keep the $x$ and $y$ separate,
the scalars should be free to float around.

Of course, there's no need to do everything
in terms of just the monomials.
We are free to write
\[ (x + 1) \otimes (y + 1). \]
If you like, you can expand this as
\[ x \otimes y + 1 \otimes y + x \otimes 1 + 1 \otimes 1. \]
Same thing.

The point is that we can take any two of our polynomials
and artificially ``tensor'' them together.
The definition of the tensor product does exactly this,
and nothing else.\footnote{I'll only define this
for vector spaces for simplicity.
The definition for modules over a commutative ring $R$ is exactly the same.}
\begin{definition}
Let $V$ and $W$ be vector spaces over the same field $k$.
The \vocab{tensor product} $V \otimes_k W$ is the abelian group
generated by elements of the form $v \otimes w$, subject to relations
\begin{align*}
(v_1 + v_2) \otimes w &= v_1 \otimes w + v_2 \otimes w \\
v \otimes (w_1 + w_2) &= v \otimes w_1 + v \otimes w_2 \\
(c \cdot v) \otimes w &= v \otimes (c \cdot w).
\end{align*}
As a $k$-vector space, the scalar multiplication is given by
$c \cdot (v \otimes w) = (c \cdot v) \otimes w = v \otimes (c \cdot w)$.
\end{definition}
Here's another way to phrase the same idea.
We define a \vocab{pure tensor} as an
element of the form $v \otimes w$ for $v \in V$ and $w \in W$.
But we let the $\otimes$ wall be ``permeable'' in the sense that
\[ (c \cdot v) \otimes w = v \otimes (c \cdot w) = c \cdot (v \otimes w) \]
and we let multiplication and addition distribute as we expect.
Then $V \otimes W$ consists of sums of pure tensors.
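For instance, these rules are exactly what make the earlier identifications work: applying them twice,
\[ 2x^2 \otimes 2y = x^2 \otimes 4y = 4 \cdot (x^2 \otimes y), \]
so all three ways of writing that element from before really are equal.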
\begin{example}
[Infinite-dimensional example of tensor product: two-variable polynomials]
Although it's not relevant to this chapter,
this definition works equally well with infinite-dimensional
vector spaces.
The best example might be
\[ \RR[x] \otimes_\RR \RR[y] = \RR[x,y]. \]
That is, the tensor product of polynomials in $x$
with real polynomials in $y$
turns out to just be two-variable polynomials $\RR[x,y]$.
\end{example}
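To make this identification concrete: a pure tensor $f(x) \otimes g(y)$ corresponds to the ordinary product $f(x) g(y)$, so for example
\[ (x+1) \otimes y^2 \longleftrightarrow xy^2 + y^2. \]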
\begin{remark}
[Warning on sums of pure tensors]
Remember the elements of $V \otimes_k W$
really are \emph{sums} of these pure tensors!
If you liked the previous example,
this fact has a nice interpretation ---
not every polynomial in $\RR[x,y] = \RR[x] \otimes_\RR \RR[y]$ factors
as a polynomial in $x$ times a polynomial in $y$
(i.e.\ as pure tensors $f(x) \otimes g(y)$).
But they all can be written as sums of pure tensors $x^a \otimes y^b$.
\end{remark}
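To give a concrete non-example: the element $x \otimes y + 1 \otimes 1$, i.e.\ the polynomial $xy + 1$, is a sum of two pure tensors but is not itself a pure tensor.
(If it were equal to some $f(x) \otimes g(y)$, then setting $y = 0$ would give $f(x) g(0) = 1$ for every $x$, forcing $f$ to be constant; but then the product $f(x) g(y)$ could not contain the monomial $xy$.)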
As the example we gave suggested,
the basis of $V \otimes_k W$ is literally the ``product''
of the bases of $V$ and $W$.
In particular, this fulfills our desire that
$\dim (V \otimes_k W) = \dim V \cdot \dim W$.
\begin{proposition}[Basis of $V \otimes W$]
Let $V$ and $W$ be finite-dimensional $k$-vector spaces.
If $e_1, \dots, e_m$ is a basis of $V$ and $f_1, \dots, f_n$ is a basis of $W$,
then the basis of $V \otimes_k W$
is precisely $e_i \otimes f_j$, where $i=1,\dots,m$ and $j=1,\dots,n$.
\end{proposition}
\begin{proof}
Omitted; it's easy at least to see that this basis is spanning.
\end{proof}
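For instance, the spanning claim is just a computation with the relations: if $v = \sum_i a_i e_i$ and $w = \sum_j b_j f_j$, then
\[ v \otimes w = \Big( \sum_i a_i e_i \Big) \otimes \Big( \sum_j b_j f_j \Big)
= \sum_{i,j} a_i b_j \, (e_i \otimes f_j), \]
and every element of $V \otimes_k W$ is a sum of such pure tensors.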
\begin{example}[Explicit computation]
Let $V$ have basis $e_1$, $e_2$ and $W$ have basis $f_1, f_2$.
Let $v = 3e_1 + 4e_2 \in V$ and $w = 5f_1 + 6f_2 \in W$.
Let's write $v \otimes w$ in this basis for $V \otimes_k W$:
\begin{align*}
v \otimes w &= (3e_1+4e_2) \otimes (5f_1+6f_2) \\
&= (3e_1) \otimes (5f_1) + (4e_2) \otimes (5f_1)
+ (3e_1) \otimes (6f_2) + (4e_2) \otimes (6f_2) \\
&= 15 (e_1 \otimes f_1) + 20(e_2 \otimes f_1)
+ 18 (e_1 \otimes f_2) + 24(e_2 \otimes f_2).
\end{align*}
So you can see why tensor products are a nice ``product'' to
consider if we're really interested in $V \times W$
in a way that's more intimate than just a direct sum.
\end{example}
\begin{abuse}
Moving forward, we'll almost always abbreviate $\otimes_k$ to just $\otimes$,
since $k$ is usually clear.
\end{abuse}
\begin{remark}
Observe that to define a linear map $V \otimes W \to X$,
I only have to say what happens to each pure tensor $v \otimes w$,
since the pure tensors \emph{generate} $V \otimes W$.
But again, keep in mind that
$V \otimes W$ consists of \emph{sums} of these pure tensors!
In other words, $V \otimes W$ is generated by pure tensors.
\end{remark}
\begin{remark}
Much like the Cartesian product $A \times B$ of sets,
you can tensor together any two vector spaces $V$ and $W$ over the same field $k$;
the relationship between $V$ and $W$ is completely irrelevant.
One can think of the $\otimes$ as a ``wall'' through which one can pass
scalars in $k$, but otherwise keeps the elements of $V$ and $W$ separated.
Thus, $\otimes$ is \textbf{content-agnostic}.
This also means that even if $V$ and $W$ have some relation to each other,
the tensor product doesn't remember this.
So for example $v \otimes 1 \neq 1 \otimes v$,
just like $(g,1_G) \neq (1_G,g)$ in the group $G \times G$.
\end{remark}

\section{Dual space}
\prototype{Rotate a column matrix by $90$ degrees.}
Consider the following vector space:
\begin{example}
[Functions from $\RR^3 \to \RR$]
The set of real functions $f(x,y,z)$ is an
infinite-dimensional real vector space.
Indeed, we can add two functions to get $f+g$,
and we can scale a function to get things like $2f$.
\end{example}
This is a terrifyingly large vector space,
but you can do some reasonable reductions.
For example, you can restrict your attention to just
the \emph{linear maps} from $\RR^3$ to $\RR$.
That's exactly what we're about to do.

This definition might seem strange at first, but bear with me.
\begin{definition}
Let $V$ be a $k$-vector space.
Then $V^\vee$, the \vocab{dual space} of $V$, is defined
as the vector space whose elements are \emph{linear maps from $V$ to $k$}.
\end{definition}
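For a first example of such a map, take the quadratic polynomial space $V = \left\{ ax^2+bx+c \mid a,b,c \in \RR \right\}$ from earlier; then ``evaluation at $1$'',
\[ ax^2 + bx + c \mapsto a + b + c, \]
is a linear map $V \to \RR$, hence an element of $V^\vee$.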
The addition and multiplication are pointwise:
it's the same notation we use when we write $cf+g$ to mean $c \cdot f(x) + g(x)$.

The dual space itself is less easy to think about.
Let's try to find a basis for $V^\vee$.
First, here is a very concrete interpretation of the vector space.
Suppose for example $V = \RR^3$.
We can think of elements of $V$ as column matrices, like
\[ v = \begin{bmatrix}
2 \\ 5 \\ 9
\end{bmatrix}
\in V. \]
Then a linear map $f \colon V \to k$ can be interpreted as a \emph{row matrix}:
\[
f = \begin{bmatrix}
3 & 4 & 5
\end{bmatrix}
\in V^\vee. \]
Then
\[
f(v) = \begin{bmatrix}
3 & 4 & 5
\end{bmatrix}
\begin{bmatrix}
2 \\ 5 \\ 9
\end{bmatrix}
= 71. \]
More precisely: \textbf{to specify a linear map $V \to k$,
I only have to tell you where each basis element of $V$ goes}.
In the above example, $f$ sends $e_1$ to $3$, $e_2$ to $4$, and $e_3$ to $5$.
So $f$ sends \[ 2e_1 + 5e_2 + 9e_3 \mapsto 2 \cdot 3 + 5 \cdot 4 + 9 \cdot 5 = 71. \]

Let's make all this precise.
\begin{proposition}[The dual basis for $V^\vee$]
Let $V$ be a finite-dimensional vector space with basis $e_1, \dots, e_n$.
For each $i$ consider the function $e_i^\vee \colon V \to k$
defined by
\[
e_i^\vee(e_j)
= \begin{cases}
1 & i=j \\
0 & i \neq j.
\end{cases}
\]
In more humane terms, $e_i^\vee(v)$
gives the coefficient of $e_i$ in $v$.
Then $e_1^\vee$, $e_2^\vee$, \dots, $e_n^\vee$ is a basis of $V^\vee$.
\end{proposition}
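Here is a sketch of why this works. Any $f \in V^\vee$ can be recovered from its values on the basis:
\[ f = f(e_1) \, e_1^\vee + f(e_2) \, e_2^\vee + \dots + f(e_n) \, e_n^\vee \]
since both sides agree on every $e_j$; this shows the $e_i^\vee$ span $V^\vee$.
And if some combination $\sum_i c_i e_i^\vee$ is the zero map, then evaluating it at $e_j$ gives $c_j = 0$, so the $e_i^\vee$ are linearly independent as well.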
\begin{example}[Explicit example of element in $V^\vee$]
In this notation, $f = 3e_1^\vee + 4e_2^\vee + 5e_3^\vee$.
Do you see why the ``sum'' notation works as expected here?
Indeed
\begin{align*}
f(e_1) &= (3e_1^\vee + 4e_2^\vee + 5e_3^\vee)(e_1) \\
&= 3e_1^\vee(e_1) + 4e_2^\vee(e_1) + 5e_3^\vee(e_1) \\
&= 3 \cdot 1 + 4 \cdot 0 + 5 \cdot 0 = 3.
\end{align*}
That's exactly what we wanted.
\end{example}
You might be inclined to point out that $V \cong V^\vee$ at this point,
with an isomorphism given by $e_i \mapsto e_i^\vee$.
You might call it ``rotating the column matrix by $90\dg$''.
This statement is technically true,
but for a generic vector space $V$ without any extra information,
you can just think of this as an artifact of the fact that $\dim V = \dim V^\vee$
(as \emph{any} two vector spaces of equal dimension are isomorphic).
Most importantly, the isomorphism given above depends
on what basis you picked.
\begin{remark*}
[Explicit example showing that the isomorphism $V \to V^\vee$
given above is unnatural]
Alice and Bob are looking at the same two-dimensional real vector space
\[ V = \left\{ (x,y,z) \mid x+y+z = 0 \right\}. \]
Also, let $v_{\text{example}} = (3,5,-8)$
be an example of an arbitrary element of $V$ for concreteness.
Suppose Alice chooses the following basis vectors for $V$.
\begin{align*}
e_1 &= (1,0,-1) \\
e_2 &= (0,1,-1).
\end{align*}
Alice uses this to construct an isomorphism $A \colon V \to V^\vee$
as described above, and considers $e_1^\vee = A(e_1)$.
The element $e_1^\vee \in V^\vee$ is a
function $e_1^\vee \colon V \to \RR$,
meaning Alice can plug any vector in $V$ into it.
As an example, for $v_{\text{example}}$
\[ e_1^\vee(v_{\text{example}})
= e_1^\vee\left( (3,5,-8) \right)
= e_1^\vee\left( 3e_1 + 5e_2 \right) = 3. \]
Meanwhile, Bob chooses the different basis vectors
\begin{align*}
f_1 &= (1,0,-1) \\
f_2 &= (1,-1,0).
\end{align*}
This gives Bob an isomorphism $B \colon V \to V^\vee$,
and a corresponding $f_1^\vee = B(f_1)$.
Bob can also evaluate it anywhere, e.g.
\[ f_1^\vee\left( v_{\text{example}} \right)
= f_1^\vee\left( (3, 5, -8) \right)
= f_1^\vee\left( 8f_1 - 5f_2 \right) = 8. \]
It follows that $e_1^\vee = A\left( (1,0,-1) \right)$
and $f_1^\vee = B\left( (1,0,-1) \right)$
are different elements of $V^\vee$.
In other words Alice and Bob got different isomorphisms
because they picked different bases.
\end{remark*}

\section{$V^\vee \otimes W$ gives matrices from $V$ to $W$}
Goal of this section:
\begin{moral}
If $V$ and $W$ are finite-dimensional $k$-vector spaces
then $V^\vee \otimes W$ represents linear maps $V \to W$.
\end{moral}
Here's the intuition.
If $V$ is three-dimensional and $W$ is five-dimensional, then we can think
of the maps $V \to W$ as a $5 \times 3$ array of numbers.
(since one can add or scale matrices). | |
So it had better be a vector space with dimension $15$,
but just saying ``$k^{\oplus 15}$'' is not really that satisfying
(what is the basis?).

To do better, we consider the tensor product
\[ V^\vee \otimes W \]
which somehow is a product of maps out of $V$ and the target space $W$.
We claim that this is in fact the space we want:
i.e.\ \textbf{there is a natural bijection between elements of $V^\vee \otimes W$
and linear maps from $V$ to $W$}.

First, how do we interpret an element of $V^\vee \otimes W$ as a map $V \to W$?
For concreteness, suppose $V$ has a basis $e_1$, $e_2$, $e_3$,
and $W$ has a basis $f_1$, $f_2$, $f_3$, $f_4$, $f_5$.
Consider an element of $V^\vee \otimes W$, say
\[ e_1^\vee \otimes (f_2 + 2f_4) + 4e_2^\vee \otimes f_5. \]
We want to interpret this element as a function $V \to W$:
so given a $v \in V$,
we want to output an element of $W$.
There's really only one way to do this:
feed $v \in V$ into the $V^\vee$ guys on the left.
That is, take the map
\[ v \mapsto e_1^\vee(v) \cdot (f_2 + 2f_4) + 4e_2^\vee(v) \cdot f_5 \in W. \]
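As a quick sanity check on this recipe, plugging in, say, $v = 2e_1 + 3e_2$ gives
\[ e_1^\vee(v) \cdot (f_2 + 2f_4) + 4 e_2^\vee(v) \cdot f_5
= 2(f_2 + 2f_4) + 12 f_5 = 2f_2 + 4f_4 + 12 f_5. \]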
So, there's a natural way to interpret any element
$\xi_1 \otimes w_1 + \dots + \xi_m \otimes w_m \in V^\vee \otimes W$
as a linear map $V \to W$.
The claim is that in fact, every linear map $V \to W$ has
such an interpretation.

First, for notational convenience,
\begin{definition}
Let $\Hom(V,W)$ denote the set of linear maps from $V$ to $W$
(which one can interpret as matrices which send $V$ to $W$),
viewed as a vector space over $k$.
(The ``$\Hom$'' stands for homomorphism.)
\end{definition}
\begin{ques}
Identify $\Hom(V,k)$ by name.
\end{ques}
We can now write down something that's true more generally.
\begin{theorem}[$V^\vee \otimes W$ $\iff$ linear maps $V \to W$]
\label{thm:vect_hom_dualization}
Let $V$ and $W$ be finite-dimensional vector spaces.
We described a map
\[ \Psi \colon V^\vee \otimes W \to \Hom(V,W) \]
by sending $\xi_1 \otimes w_1 + \dots + \xi_m \otimes w_m$ to the linear map
\[ v \mapsto \xi_1(v) w_1 + \dots + \xi_m(v) w_m. \]
Then $\Psi$ is an isomorphism of vector spaces, i.e.\ every linear map $V \to W$
can be uniquely represented as an element of $V^\vee \otimes W$ in this way.
\end{theorem}
The above is perhaps a bit dense, so here is a concrete example.
\begin{example}[Explicit example]
Let $V = \RR^2$ and take a basis $e_1$, $e_2$ of $V$.
Then define $T : V \to V$ by
\[ T = \begin{bmatrix}
1 & 2 \\ 3 & 4
\end{bmatrix}. \]
Then we have
\[ \Psi(e_1^\vee \otimes e_1 + 2e_2^\vee \otimes e_1
+ 3e_1^\vee \otimes e_2 + 4e_2^\vee \otimes e_2) = T. \]
The beauty is that the $\Psi$ definition is basis-free;
thus even if we change the basis,
although the above expression will look completely different,
the \emph{actual element} in $V^\vee \otimes V$ doesn't change.
\end{example}
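One can check this agrees with the definition of $\Psi$ by evaluating on the basis: writing $\xi$ for the element displayed above,
\begin{align*}
\Psi(\xi)(e_1) &= e_1^\vee(e_1) \, e_1 + 2 e_2^\vee(e_1) \, e_1 + 3 e_1^\vee(e_1) \, e_2 + 4 e_2^\vee(e_1) \, e_2 = e_1 + 3e_2 \\
\Psi(\xi)(e_2) &= e_1^\vee(e_2) \, e_1 + 2 e_2^\vee(e_2) \, e_1 + 3 e_1^\vee(e_2) \, e_2 + 4 e_2^\vee(e_2) \, e_2 = 2e_1 + 4e_2
\end{align*}
which are exactly the two columns of the matrix $T$.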
Despite this, we'll indulge ourselves in using coordinates for the proof.
\begin{proof}
[Proof of \Cref{thm:vect_hom_dualization}]
This looks intimidating, but it's actually not difficult.
We proceed in two steps:
\begin{enumerate}
\ii First, we check that $\Psi$ is \emph{surjective};
every linear map has at least one representation in $V^\vee \otimes W$.
To see this, take any $T : V \to W$.
Suppose $V$ has basis $e_1$, $e_2$, $e_3$ and that
$T(e_1) = w_1$, $T(e_2) = w_2$ and $T(e_3) = w_3$.
Then the element
\[ e_1^\vee \otimes w_1 + e_2^\vee \otimes w_2 + e_3^\vee \otimes w_3 \]
works, as it is contrived to agree with $T$ on the basis elements $e_i$.
\ii So it suffices to check now that $\dim V^\vee \otimes W = \dim \Hom(V,W)$.
Certainly, $V^\vee \otimes W$ has dimension $\dim V \cdot \dim W$.
But by viewing $\Hom(V,W)$ as $\dim W \times \dim V$ matrices, we see that
it too has dimension $\dim V \cdot \dim W$. \qedhere
\end{enumerate}
\end{proof}
So there is a \textbf{natural isomorphism} $V^\vee \otimes W \cong \Hom(V,W)$.
While we did use a basis liberally in the
\emph{proof that it works}, this doesn't change the
fact that the isomorphism is ``God-given'',
depending only on the spirit of $V$ and $W$ itself
and not which basis we choose to express the vector spaces in.

\section{The trace}
We are now ready to give the definition of a trace.
Recall that a square matrix $T$ can be thought of as a map $T \colon V \to V$.
According to the above theorem,
\[ \Hom(V, V) \cong V^\vee \otimes V \]
so every map $V \to V$ can be thought of as an element of $V^\vee \otimes V$.
But we can also define an
\emph{evaluation map} $\opname{ev} : V^\vee \otimes V \to k$
by ``collapsing'' each pure tensor: $f \otimes v \mapsto f(v)$.
So this gives us a composed map
\begin{center}
\begin{tikzcd}
\Hom(V, V) \ar[r, "\cong"] & V^\vee \otimes V \ar[r, "\opname{ev}"] & k.
\end{tikzcd}
\end{center}
The result of applying this composed map to $T$ is called the \vocab{trace} of $T$, denoted $\Tr T$.
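Unwinding the definitions: if $T \in \Hom(V,V)$ corresponds to the element
$\xi_1 \otimes v_1 + \dots + \xi_m \otimes v_m \in V^\vee \otimes V$, then
\[ \Tr T = \xi_1(v_1) + \xi_2(v_2) + \dots + \xi_m(v_m). \]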
\begin{example}[Example of a trace]
Continuing the previous example,
\[ \Tr T = e_1^\vee(e_1) + 2e_2^\vee(e_1)
+ 3e_1^\vee(e_2) + 4e_2^\vee(e_2)
= 1 + 0 + 0 + 4 = 5. \]
And that is why the trace is the sum of the diagonal entries.
\end{example}
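The same computation works in general: if $T$ has matrix $(a_{ij})$ in a basis $e_1, \dots, e_n$ of $V$, meaning $T(e_j) = \sum_i a_{ij} e_i$, then $T$ corresponds to the element $\sum_{i,j} a_{ij} \, e_j^\vee \otimes e_i$, and collapsing each pure tensor gives
\[ \Tr T = \sum_{i,j} a_{ij} \, e_j^\vee(e_i) = a_{11} + a_{22} + \dots + a_{nn}, \]
independently of which basis was chosen.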
\section{\problemhead}
\begin{problem}
[Trace is sum of eigenvalues]
Let $V$ be an $n$-dimensional vector space
over an algebraically closed field $k$.
Let $T \colon V \to V$ be a linear map with
eigenvalues $\lambda_1$, $\lambda_2$, \dots, $\lambda_n$
(counted with algebraic multiplicity).
Show that $\Tr T = \lambda_1 + \dots + \lambda_n$.
\begin{hint}
Follows by writing $T$ in a basis for which its matrix is upper triangular
(possible since $k$ is algebraically closed); then the diagonal entries are the eigenvalues.
\end{hint}
\end{hint} | |
\end{problem} | |
\begin{dproblem} | |
[Product of traces] | |
Let $T \colon V \to V$ and $S \colon W \to W$ be linear maps | |
of finite-dimensional vector spaces $V$ and $W$. | |
Define $T \otimes S \colon V \otimes W \to V \otimes W$ | |
by $v \otimes w \mapsto T(v) \otimes S(w)$. | |
Prove that \[ \Tr(T \otimes S) = \Tr(T) \Tr(S). \] | |
\begin{hint} | |
Again one can just take a basis. | |
\end{hint} | |
\end{dproblem} | |
\begin{dproblem} | |
[Traces kind of commute] | |
\gim | |
Let $T \colon V \to W$ and $S \colon W \to V$ be linear maps | |
between finite-dimensional vector spaces $V$ and $W$. | |
Show that \[ \Tr(T \circ S) = \Tr(S \circ T). \] | |
\begin{hint} | |
One solution is to just take a basis. | |
Otherwise, interpret $T \otimes S \mapsto \Tr(T \circ S)$ as a | |
linear map $(V^\vee \otimes W) \otimes (W^\vee \otimes V) \to k$, | |
and verify that it is commutative. | |
\end{hint} | |
\begin{sol} | |
Although we could give a coordinate calculation, | |
we instead opt to give a cleaner proof. | |
This amounts to drawing the diagram | |
\begin{center} | |
\begin{tikzcd}[column sep=tiny] | |
& (W^\vee \otimes V) \otimes (V^\vee \otimes W) \ar[d] | |
\ar[r, equals] \ar[ld, "\text{compose}"'] | |
& (V^\vee \otimes W) \otimes (W^\vee \otimes V) \ar[d] | |
\ar[rd, "\text{compose}"] | |
\\ | |
\Hom(W, W) \ar[r, leftrightarrow] \ar[rd, "\Tr"'] | |
& W^\vee \otimes W \ar[d, "\opname{ev}"] | |
& V^\vee \otimes V \ar[r, leftrightarrow] \ar[d, "\opname{ev}"] | |
& \Hom(V, V) \ar[ld, "\Tr"'] | |
\\ | |
& k \ar[r, equals] & k | |
\end{tikzcd} | |
\end{center} | |
It is easy to check that the center rectangle commutes, | |
by checking it on pure tensors $\xi_W \otimes v \otimes \xi_V \otimes w$. | |
So the outer hexagon commutes and we're done. | |
This is really the same as the proof with bases; | |
what it amounts to is checking the assertion is true for | |
matrices that have a $1$ somewhere and $0$ elsewhere, | |
then extending by linearity. | |
\end{sol} | |
\end{dproblem} | |
\begin{problem} | |
[Putnam 1988] | |
\gim | |
Let $V$ be an $n$-dimensional vector space. | |
Let $T : V \to V$ be a linear map and suppose there exist $n+1$ eigenvectors,
any $n$ of which are linearly independent.
Does it follow that $T$ is a scalar multiple of the identity?
\begin{hint}
Look at the trace of $T$.
\end{hint}
\begin{sol}
See \url{https://mks.mff.cuni.cz/kalva/putnam/psoln/psol886.html}.
\end{sol}
\end{problem}