{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Attention Mechanism"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In :numref:`chapter_seq2seq`, we encode the source sequence information in the recurrent unit state and then pass it to the decoder to generate the target sequence. A token in the target sequence may closely relate to only some tokens in the source sequence rather than to the whole source sequence. For example, when translating \"Hello world.\" into \"Bonjour le monde.\", \"Bonjour\" maps to \"Hello\" and \"monde\" maps to \"world\". In the seq2seq model, the decoder can only implicitly select the corresponding information from the state passed by the encoder. The attention mechanism, however, makes this selection explicit.\n",
    "\n",
    "Attention is a generalized pooling method with bias alignment over inputs. The core component of the attention mechanism is the attention layer, or attention for short. An input of the attention layer is called a query. For a query, the attention layer returns an output based on its memory, which is a set of key-value pairs. To be more specific, assume the query is $\\mathbf{q}\\in\\mathbb R^{d_q}$, and the memory contains $n$ key-value pairs, $(\\mathbf{k}_1, \\mathbf{v}_1), \\ldots, (\\mathbf{k}_n, \\mathbf{v}_n)$, with $\\mathbf{k}_i\\in\\mathbb R^{d_k}$ and $\\mathbf{v}_i\\in\\mathbb R^{d_v}$. The attention layer then returns an output $\\mathbf o\\in\\mathbb R^{d_v}$ with the same shape as a value."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 244,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/svg+xml": [
       "<svg height=\"140pt\" version=\"1.1\" viewBox=\"0 0 188 140\" width=\"188pt\" xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">\n",
       "<defs>\n",
       "<g>\n",
       "<symbol id=\"glyph0-0\" overflow=\"visible\">\n",
       "<path d=\"M 1.125 0 L 1.125 -5.625 L 5.625 -5.625 L 5.625 0 Z M 1.265625 -0.140625 L 5.484375 -0.140625 L 5.484375 -5.484375 L 1.265625 -5.484375 Z M 1.265625 -0.140625 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-1\" overflow=\"visible\">\n",
       "<path d=\"M -0.015625 0 L 2.46875 -6.4375 L 3.375 -6.4375 L 6.015625 0 L 5.046875 0 L 4.296875 -1.953125 L 1.59375 -1.953125 L 0.890625 0 Z M 1.84375 -2.640625 L 4.03125 -2.640625 L 3.359375 -4.4375 C 3.148438 -4.976562 3 -5.421875 2.90625 -5.765625 C 2.820312 -5.347656 2.703125 -4.9375 2.546875 -4.53125 Z M 1.84375 -2.640625 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-2\" overflow=\"visible\">\n",
       "<path d=\"M 2.328125 -0.703125 L 2.4375 -0.015625 C 2.207031 0.0351562 2.007812 0.0625 1.84375 0.0625 C 1.550781 0.0625 1.328125 0.015625 1.171875 -0.078125 C 1.015625 -0.171875 0.898438 -0.289062 0.828125 -0.4375 C 0.765625 -0.582031 0.734375 -0.890625 0.734375 -1.359375 L 0.734375 -4.046875 L 0.15625 -4.046875 L 0.15625 -4.671875 L 0.734375 -4.671875 L 0.734375 -5.828125 L 1.53125 -6.296875 L 1.53125 -4.671875 L 2.328125 -4.671875 L 2.328125 -4.046875 L 1.53125 -4.046875 L 1.53125 -1.328125 C 1.53125 -1.097656 1.539062 -0.953125 1.5625 -0.890625 C 1.59375 -0.828125 1.640625 -0.773438 1.703125 -0.734375 C 1.765625 -0.691406 1.851562 -0.671875 1.96875 -0.671875 C 2.0625 -0.671875 2.179688 -0.679688 2.328125 -0.703125 Z M 2.328125 -0.703125 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-3\" overflow=\"visible\">\n",
       "<path d=\"M 3.78125 -1.5 L 4.609375 -1.40625 C 4.472656 -0.925781 4.226562 -0.550781 3.875 -0.28125 C 3.53125 -0.0195312 3.085938 0.109375 2.546875 0.109375 C 1.867188 0.109375 1.328125 -0.0976562 0.921875 -0.515625 C 0.523438 -0.941406 0.328125 -1.535156 0.328125 -2.296875 C 0.328125 -3.078125 0.53125 -3.679688 0.9375 -4.109375 C 1.34375 -4.546875 1.867188 -4.765625 2.515625 -4.765625 C 3.140625 -4.765625 3.644531 -4.550781 4.03125 -4.125 C 4.425781 -3.707031 4.625 -3.113281 4.625 -2.34375 C 4.625 -2.289062 4.625 -2.21875 4.625 -2.125 L 1.140625 -2.125 C 1.171875 -1.613281 1.316406 -1.222656 1.578125 -0.953125 C 1.835938 -0.679688 2.164062 -0.546875 2.5625 -0.546875 C 2.851562 -0.546875 3.097656 -0.617188 3.296875 -0.765625 C 3.503906 -0.921875 3.664062 -1.164062 3.78125 -1.5 Z M 1.1875 -2.78125 L 3.796875 -2.78125 C 3.765625 -3.175781 3.664062 -3.472656 3.5 -3.671875 C 3.25 -3.972656 2.921875 -4.125 2.515625 -4.125 C 2.148438 -4.125 1.84375 -4 1.59375 -3.75 C 1.351562 -3.507812 1.21875 -3.1875 1.1875 -2.78125 Z M 1.1875 -2.78125 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-4\" overflow=\"visible\">\n",
       "<path d=\"M 0.59375 0 L 0.59375 -4.671875 L 1.3125 -4.671875 L 1.3125 -4 C 1.644531 -4.507812 2.140625 -4.765625 2.796875 -4.765625 C 3.078125 -4.765625 3.332031 -4.710938 3.5625 -4.609375 C 3.800781 -4.515625 3.976562 -4.382812 4.09375 -4.21875 C 4.207031 -4.0625 4.289062 -3.867188 4.34375 -3.640625 C 4.375 -3.492188 4.390625 -3.238281 4.390625 -2.875 L 4.390625 0 L 3.59375 0 L 3.59375 -2.84375 C 3.59375 -3.164062 3.5625 -3.40625 3.5 -3.5625 C 3.4375 -3.71875 3.328125 -3.84375 3.171875 -3.9375 C 3.015625 -4.039062 2.832031 -4.09375 2.625 -4.09375 C 2.289062 -4.09375 2 -3.984375 1.75 -3.765625 C 1.507812 -3.554688 1.390625 -3.148438 1.390625 -2.546875 L 1.390625 0 Z M 0.59375 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-5\" overflow=\"visible\">\n",
       "<path d=\"M 0.59375 -5.53125 L 0.59375 -6.4375 L 1.390625 -6.4375 L 1.390625 -5.53125 Z M 0.59375 0 L 0.59375 -4.671875 L 1.390625 -4.671875 L 1.390625 0 Z M 0.59375 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-6\" overflow=\"visible\">\n",
       "<path d=\"M 0.296875 -2.328125 C 0.296875 -3.191406 0.535156 -3.832031 1.015625 -4.25 C 1.421875 -4.59375 1.910156 -4.765625 2.484375 -4.765625 C 3.128906 -4.765625 3.65625 -4.554688 4.0625 -4.140625 C 4.46875 -3.722656 4.671875 -3.144531 4.671875 -2.40625 C 4.671875 -1.800781 4.578125 -1.328125 4.390625 -0.984375 C 4.210938 -0.640625 3.953125 -0.367188 3.609375 -0.171875 C 3.265625 0.015625 2.890625 0.109375 2.484375 0.109375 C 1.828125 0.109375 1.296875 -0.0976562 0.890625 -0.515625 C 0.492188 -0.941406 0.296875 -1.546875 0.296875 -2.328125 Z M 1.109375 -2.328125 C 1.109375 -1.734375 1.238281 -1.285156 1.5 -0.984375 C 1.757812 -0.691406 2.085938 -0.546875 2.484375 -0.546875 C 2.878906 -0.546875 3.207031 -0.691406 3.46875 -0.984375 C 3.726562 -1.285156 3.859375 -1.742188 3.859375 -2.359375 C 3.859375 -2.929688 3.726562 -3.367188 3.46875 -3.671875 C 3.207031 -3.972656 2.878906 -4.125 2.484375 -4.125 C 2.085938 -4.125 1.757812 -3.972656 1.5 -3.671875 C 1.238281 -3.378906 1.109375 -2.929688 1.109375 -2.328125 Z M 1.109375 -2.328125 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-7\" overflow=\"visible\">\n",
       "<path d=\"M 0.65625 0 L 0.65625 -6.4375 L 1.515625 -6.4375 L 1.515625 -3.25 L 4.71875 -6.4375 L 5.859375 -6.4375 L 3.171875 -3.828125 L 5.984375 0 L 4.859375 0 L 2.5625 -3.265625 L 1.515625 -2.234375 L 1.515625 0 Z M 0.65625 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-8\" overflow=\"visible\">\n",
       "<path d=\"M 0.5625 1.796875 L 0.46875 1.0625 C 0.644531 1.101562 0.796875 1.125 0.921875 1.125 C 1.097656 1.125 1.238281 1.09375 1.34375 1.03125 C 1.445312 0.976562 1.535156 0.898438 1.609375 0.796875 C 1.648438 0.710938 1.726562 0.515625 1.84375 0.203125 C 1.863281 0.160156 1.890625 0.0976562 1.921875 0.015625 L 0.140625 -4.671875 L 1 -4.671875 L 1.96875 -1.96875 C 2.09375 -1.625 2.207031 -1.265625 2.3125 -0.890625 C 2.394531 -1.242188 2.5 -1.597656 2.625 -1.953125 L 3.625 -4.671875 L 4.421875 -4.671875 L 2.640625 0.078125 C 2.453125 0.585938 2.304688 0.941406 2.203125 1.140625 C 2.054688 1.398438 1.894531 1.585938 1.71875 1.703125 C 1.539062 1.828125 1.320312 1.890625 1.0625 1.890625 C 0.914062 1.890625 0.75 1.859375 0.5625 1.796875 Z M 0.5625 1.796875 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-9\" overflow=\"visible\">\n",
       "<path d=\"M 0.28125 -1.390625 L 1.0625 -1.515625 C 1.101562 -1.203125 1.222656 -0.960938 1.421875 -0.796875 C 1.628906 -0.628906 1.910156 -0.546875 2.265625 -0.546875 C 2.628906 -0.546875 2.898438 -0.617188 3.078125 -0.765625 C 3.253906 -0.910156 3.34375 -1.082031 3.34375 -1.28125 C 3.34375 -1.46875 3.265625 -1.609375 3.109375 -1.703125 C 2.992188 -1.773438 2.722656 -1.867188 2.296875 -1.984375 C 1.710938 -2.128906 1.304688 -2.253906 1.078125 -2.359375 C 0.859375 -2.460938 0.691406 -2.609375 0.578125 -2.796875 C 0.460938 -2.984375 0.40625 -3.191406 0.40625 -3.421875 C 0.40625 -3.628906 0.453125 -3.820312 0.546875 -4 C 0.640625 -4.175781 0.769531 -4.328125 0.9375 -4.453125 C 1.0625 -4.535156 1.226562 -4.609375 1.4375 -4.671875 C 1.65625 -4.734375 1.882812 -4.765625 2.125 -4.765625 C 2.488281 -4.765625 2.804688 -4.710938 3.078125 -4.609375 C 3.359375 -4.503906 3.566406 -4.363281 3.703125 -4.1875 C 3.835938 -4.007812 3.929688 -3.769531 3.984375 -3.46875 L 3.203125 -3.359375 C 3.171875 -3.597656 3.066406 -3.785156 2.890625 -3.921875 C 2.722656 -4.054688 2.488281 -4.125 2.1875 -4.125 C 1.820312 -4.125 1.5625 -4.0625 1.40625 -3.9375 C 1.25 -3.820312 1.171875 -3.679688 1.171875 -3.515625 C 1.171875 -3.410156 1.203125 -3.320312 1.265625 -3.25 C 1.328125 -3.15625 1.429688 -3.082031 1.578125 -3.03125 C 1.648438 -3 1.878906 -2.929688 2.265625 -2.828125 C 2.828125 -2.679688 3.21875 -2.5625 3.4375 -2.46875 C 3.664062 -2.375 3.84375 -2.234375 3.96875 -2.046875 C 4.09375 -1.867188 4.15625 -1.644531 4.15625 -1.375 C 4.15625 -1.101562 4.078125 -0.851562 3.921875 -0.625 C 3.765625 -0.394531 3.539062 -0.210938 3.25 -0.078125 C 2.96875 0.046875 2.640625 0.109375 2.265625 0.109375 C 1.660156 0.109375 1.195312 -0.015625 0.875 -0.265625 C 0.5625 -0.523438 0.363281 -0.898438 0.28125 -1.390625 Z M 0.28125 -1.390625 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-10\" overflow=\"visible\">\n",
       "<path d=\"M 5.578125 -0.6875 C 5.972656 -0.414062 6.335938 -0.21875 6.671875 -0.09375 L 6.421875 0.5 C 5.953125 0.332031 5.488281 0.0664062 5.03125 -0.296875 C 4.550781 -0.0234375 4.023438 0.109375 3.453125 0.109375 C 2.867188 0.109375 2.335938 -0.03125 1.859375 -0.3125 C 1.390625 -0.59375 1.023438 -0.988281 0.765625 -1.5 C 0.515625 -2.007812 0.390625 -2.582031 0.390625 -3.21875 C 0.390625 -3.851562 0.515625 -4.429688 0.765625 -4.953125 C 1.023438 -5.472656 1.394531 -5.867188 1.875 -6.140625 C 2.351562 -6.421875 2.882812 -6.5625 3.46875 -6.5625 C 4.0625 -6.5625 4.597656 -6.414062 5.078125 -6.125 C 5.554688 -5.84375 5.921875 -5.445312 6.171875 -4.9375 C 6.421875 -4.4375 6.546875 -3.863281 6.546875 -3.21875 C 6.546875 -2.695312 6.460938 -2.222656 6.296875 -1.796875 C 6.140625 -1.367188 5.898438 -1 5.578125 -0.6875 Z M 3.703125 -1.78125 C 4.191406 -1.644531 4.597656 -1.441406 4.921875 -1.171875 C 5.421875 -1.617188 5.671875 -2.300781 5.671875 -3.21875 C 5.671875 -3.75 5.582031 -4.207031 5.40625 -4.59375 C 5.226562 -4.976562 4.96875 -5.28125 4.625 -5.5 C 4.28125 -5.71875 3.894531 -5.828125 3.46875 -5.828125 C 2.832031 -5.828125 2.304688 -5.609375 1.890625 -5.171875 C 1.472656 -4.734375 1.265625 -4.082031 1.265625 -3.21875 C 1.265625 -2.382812 1.472656 -1.742188 1.890625 -1.296875 C 2.304688 -0.847656 2.832031 -0.625 3.46875 -0.625 C 3.78125 -0.625 4.070312 -0.679688 4.34375 -0.796875 C 4.070312 -0.960938 3.789062 -1.082031 3.5 -1.15625 Z M 3.703125 -1.78125 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-11\" overflow=\"visible\">\n",
       "<path d=\"M 3.65625 0 L 3.65625 -0.6875 C 3.289062 -0.15625 2.796875 0.109375 2.171875 0.109375 C 1.898438 0.109375 1.644531 0.0546875 1.40625 -0.046875 C 1.164062 -0.160156 0.984375 -0.296875 0.859375 -0.453125 C 0.742188 -0.609375 0.664062 -0.800781 0.625 -1.03125 C 0.59375 -1.1875 0.578125 -1.4375 0.578125 -1.78125 L 0.578125 -4.671875 L 1.359375 -4.671875 L 1.359375 -2.078125 C 1.359375 -1.660156 1.378906 -1.382812 1.421875 -1.25 C 1.460938 -1.039062 1.5625 -0.875 1.71875 -0.75 C 1.882812 -0.632812 2.085938 -0.578125 2.328125 -0.578125 C 2.566406 -0.578125 2.789062 -0.632812 3 -0.75 C 3.207031 -0.875 3.351562 -1.039062 3.4375 -1.25 C 3.519531 -1.457031 3.5625 -1.765625 3.5625 -2.171875 L 3.5625 -4.671875 L 4.359375 -4.671875 L 4.359375 0 Z M 3.65625 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-12\" overflow=\"visible\">\n",
       "<path d=\"M 0.578125 0 L 0.578125 -4.671875 L 1.296875 -4.671875 L 1.296875 -3.953125 C 1.472656 -4.285156 1.640625 -4.503906 1.796875 -4.609375 C 1.953125 -4.710938 2.125 -4.765625 2.3125 -4.765625 C 2.570312 -4.765625 2.84375 -4.679688 3.125 -4.515625 L 2.84375 -3.78125 C 2.65625 -3.894531 2.460938 -3.953125 2.265625 -3.953125 C 2.097656 -3.953125 1.941406 -3.898438 1.796875 -3.796875 C 1.660156 -3.691406 1.5625 -3.546875 1.5 -3.359375 C 1.414062 -3.078125 1.375 -2.769531 1.375 -2.4375 L 1.375 0 Z M 0.578125 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-13\" overflow=\"visible\">\n",
       "<path d=\"M 2.53125 0 L 0.046875 -6.4375 L 0.96875 -6.4375 L 2.640625 -1.765625 C 2.773438 -1.390625 2.882812 -1.035156 2.96875 -0.703125 C 3.070312 -1.054688 3.191406 -1.410156 3.328125 -1.765625 L 5.0625 -6.4375 L 5.9375 -6.4375 L 3.40625 0 Z M 2.53125 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-14\" overflow=\"visible\">\n",
        "<path d=\"M 3.640625 -0.578125 C 3.347656 -0.328125 3.066406 -0.148438 2.796875 -0.046875 C 2.523438 0.0546875 2.234375 0.109375 1.921875 0.109375 C 1.410156 0.109375 1.015625 -0.015625 0.734375 -0.265625 C 0.460938 -0.515625 0.328125 -0.835938 0.328125 -1.234375 C 0.328125 -1.460938 0.378906 -1.671875 0.484375 -1.859375 C 0.585938 -2.046875 0.722656 -2.195312 0.890625 -2.3125 C 1.054688 -2.425781 1.242188 -2.515625 1.453125 -2.578125 C 1.609375 -2.609375 1.84375 -2.644531 2.15625 -2.6875 C 2.800781 -2.757812 3.273438 -2.851562 3.578125 -2.96875 C 3.578125 -3.070312 3.578125 -3.140625 3.578125 -3.171875 C 3.578125 -3.492188 3.503906 -3.71875 3.359375 -3.84375 C 3.148438 -4.03125 2.847656 -4.125 2.453125 -4.125 C 2.078125 -4.125 1.800781 -4.054688 1.625 -3.921875 C 1.445312 -3.796875 1.316406 -3.566406 1.234375 -3.234375 L 0.46875 -3.328125 C 0.53125 -3.660156 0.640625 -3.925781 0.796875 -4.125 C 0.960938 -4.332031 1.195312 -4.488281 1.5 -4.59375 C 1.8125 -4.707031 2.164062 -4.765625 2.5625 -4.765625 C 2.96875 -4.765625 3.289062 -4.71875 3.53125 -4.625 C 3.78125 -4.53125 3.960938 -4.410156 4.078125 -4.265625 C 4.203125 -4.128906 4.285156 -3.953125 4.328125 -3.734375 C 4.359375 -3.597656 4.375 -3.359375 4.375 -3.015625 L 4.375 -1.953125 C 4.375 -1.222656 4.390625 -0.757812 4.421875 -0.5625 C 4.453125 -0.363281 4.519531 -0.175781 4.625 0 L 3.796875 0 C 3.710938 -0.164062 3.660156 -0.359375 3.640625 -0.578125 Z M 3.578125 -2.34375 C 3.285156 -2.226562 2.851562 -2.128906 2.28125 -2.046875 C 1.957031 -1.992188 1.726562 -1.9375 1.59375 -1.875 C 1.457031 -1.820312 1.351562 -1.738281 1.28125 -1.625 C 1.207031 -1.507812 1.171875 -1.382812 1.171875 -1.25 C 1.171875 -1.039062 1.25 -0.863281 1.40625 -0.71875 C 1.5625 -0.582031 1.796875 -0.515625 2.109375 -0.515625 C 2.410156 -0.515625 2.679688 -0.582031 2.921875 -0.71875 C 3.160156 -0.851562 3.335938 -1.035156 3.453125 -1.265625 C 3.535156 -1.441406 3.578125 -1.703125 3.578125 -2.046875 Z M 3.578125 -2.34375 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-15\" overflow=\"visible\">\n",
       "<path d=\"M 0.578125 0 L 0.578125 -6.4375 L 1.359375 -6.4375 L 1.359375 0 Z M 0.578125 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-16\" overflow=\"visible\">\n",
       "<path d=\"M 0.4375 -3.140625 C 0.4375 -4.203125 0.722656 -5.035156 1.296875 -5.640625 C 1.867188 -6.253906 2.609375 -6.5625 3.515625 -6.5625 C 4.109375 -6.5625 4.644531 -6.414062 5.125 -6.125 C 5.601562 -5.84375 5.96875 -5.445312 6.21875 -4.9375 C 6.46875 -4.425781 6.59375 -3.851562 6.59375 -3.21875 C 6.59375 -2.5625 6.460938 -1.972656 6.203125 -1.453125 C 5.941406 -0.941406 5.566406 -0.550781 5.078125 -0.28125 C 4.597656 -0.0195312 4.078125 0.109375 3.515625 0.109375 C 2.910156 0.109375 2.367188 -0.0351562 1.890625 -0.328125 C 1.410156 -0.617188 1.046875 -1.019531 0.796875 -1.53125 C 0.554688 -2.039062 0.4375 -2.578125 0.4375 -3.140625 Z M 1.3125 -3.125 C 1.3125 -2.34375 1.519531 -1.726562 1.9375 -1.28125 C 2.351562 -0.84375 2.878906 -0.625 3.515625 -0.625 C 4.148438 -0.625 4.675781 -0.847656 5.09375 -1.296875 C 5.507812 -1.742188 5.71875 -2.382812 5.71875 -3.21875 C 5.71875 -3.738281 5.628906 -4.191406 5.453125 -4.578125 C 5.273438 -4.972656 5.015625 -5.28125 4.671875 -5.5 C 4.328125 -5.71875 3.945312 -5.828125 3.53125 -5.828125 C 2.925781 -5.828125 2.40625 -5.617188 1.96875 -5.203125 C 1.53125 -4.785156 1.3125 -4.09375 1.3125 -3.125 Z M 1.3125 -3.125 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-17\" overflow=\"visible\">\n",
       "<path d=\"M 0.59375 1.78125 L 0.59375 -4.671875 L 1.3125 -4.671875 L 1.3125 -4.0625 C 1.476562 -4.300781 1.664062 -4.476562 1.875 -4.59375 C 2.09375 -4.707031 2.359375 -4.765625 2.671875 -4.765625 C 3.066406 -4.765625 3.414062 -4.660156 3.71875 -4.453125 C 4.019531 -4.253906 4.25 -3.96875 4.40625 -3.59375 C 4.5625 -3.21875 4.640625 -2.8125 4.640625 -2.375 C 4.640625 -1.894531 4.550781 -1.460938 4.375 -1.078125 C 4.207031 -0.691406 3.960938 -0.394531 3.640625 -0.1875 C 3.316406 0.0078125 2.972656 0.109375 2.609375 0.109375 C 2.347656 0.109375 2.113281 0.0507812 1.90625 -0.0625 C 1.695312 -0.175781 1.523438 -0.316406 1.390625 -0.484375 L 1.390625 1.78125 Z M 1.3125 -2.3125 C 1.3125 -1.707031 1.429688 -1.257812 1.671875 -0.96875 C 1.921875 -0.6875 2.21875 -0.546875 2.5625 -0.546875 C 2.90625 -0.546875 3.203125 -0.691406 3.453125 -0.984375 C 3.710938 -1.285156 3.84375 -1.75 3.84375 -2.375 C 3.84375 -2.96875 3.71875 -3.410156 3.46875 -3.703125 C 3.226562 -4.003906 2.9375 -4.15625 2.59375 -4.15625 C 2.257812 -4.15625 1.960938 -3.992188 1.703125 -3.671875 C 1.441406 -3.359375 1.3125 -2.90625 1.3125 -2.3125 Z M 1.3125 -2.3125 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-18\" overflow=\"visible\">\n",
       "<path d=\"M 0.671875 0 L 0.671875 -6.4375 L 1.953125 -6.4375 L 3.46875 -1.875 C 3.613281 -1.457031 3.71875 -1.140625 3.78125 -0.921875 C 3.851562 -1.160156 3.96875 -1.503906 4.125 -1.953125 L 5.671875 -6.4375 L 6.8125 -6.4375 L 6.8125 0 L 6 0 L 6 -5.390625 L 4.125 0 L 3.359375 0 L 1.484375 -5.484375 L 1.484375 0 Z M 0.671875 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "<symbol id=\"glyph0-19\" overflow=\"visible\">\n",
       "<path d=\"M 0.59375 0 L 0.59375 -4.671875 L 1.296875 -4.671875 L 1.296875 -4.015625 C 1.441406 -4.242188 1.632812 -4.425781 1.875 -4.5625 C 2.125 -4.695312 2.40625 -4.765625 2.71875 -4.765625 C 3.0625 -4.765625 3.34375 -4.691406 3.5625 -4.546875 C 3.78125 -4.410156 3.9375 -4.210938 4.03125 -3.953125 C 4.40625 -4.492188 4.882812 -4.765625 5.46875 -4.765625 C 5.9375 -4.765625 6.296875 -4.632812 6.546875 -4.375 C 6.796875 -4.125 6.921875 -3.734375 6.921875 -3.203125 L 6.921875 0 L 6.125 0 L 6.125 -2.9375 C 6.125 -3.257812 6.097656 -3.488281 6.046875 -3.625 C 6.003906 -3.757812 5.914062 -3.867188 5.78125 -3.953125 C 5.644531 -4.046875 5.484375 -4.09375 5.296875 -4.09375 C 4.972656 -4.09375 4.703125 -3.984375 4.484375 -3.765625 C 4.265625 -3.546875 4.15625 -3.195312 4.15625 -2.71875 L 4.15625 0 L 3.359375 0 L 3.359375 -3.03125 C 3.359375 -3.382812 3.296875 -3.648438 3.171875 -3.828125 C 3.046875 -4.003906 2.835938 -4.09375 2.546875 -4.09375 C 2.316406 -4.09375 2.109375 -4.03125 1.921875 -3.90625 C 1.734375 -3.789062 1.597656 -3.617188 1.515625 -3.390625 C 1.429688 -3.171875 1.390625 -2.847656 1.390625 -2.421875 L 1.390625 0 Z M 0.59375 0 \" style=\"stroke:none;\"/>\n",
       "</symbol>\n",
       "</g>\n",
       "</defs>\n",
       "<g id=\"surface1\">\n",
       "<path d=\"M 102.5 136.800781 L 209.167969 136.800781 C 213.585938 136.800781 217.167969 140.382812 217.167969 144.800781 L 217.167969 245.800781 C 217.167969 250.21875 213.585938 253.800781 209.167969 253.800781 L 102.5 253.800781 C 98.082031 253.800781 94.5 250.21875 94.5 245.800781 L 94.5 144.800781 C 94.5 140.382812 98.082031 136.800781 102.5 136.800781 Z M 102.5 136.800781 \" style=\"fill-rule:nonzero;fill:rgb(94.902039%,94.509888%,95.68634%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(50.196838%,50.196838%,50.196838%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 135 219.398438 L 151 219.398438 L 151 245.398438 L 135 245.398438 Z M 135 219.398438 \" style=\"fill-rule:nonzero;fill:rgb(39.99939%,74.902344%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 229 145.105469 L 245 145.105469 L 245 175.105469 L 229 175.105469 Z M 229 145.105469 \" style=\"fill-rule:nonzero;fill:rgb(69.804382%,85.098267%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 143 219.398438 L 143 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 143 207.898438 L 143 211.898438 M 141.5 211.898438 L 143 207.898438 L 144.5 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 135 190 L 245 190 L 245 206 L 135 206 Z M 135 190 \" style=\"fill-rule:nonzero;fill:rgb(100%,100%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"78.2373\" xlink:href=\"#glyph0-1\" y=\"85.5\"/>\n",
       "  <use x=\"84.2403\" xlink:href=\"#glyph0-2\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"86.7405\" xlink:href=\"#glyph0-2\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"89.2407\" xlink:href=\"#glyph0-3\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"94.2465\" xlink:href=\"#glyph0-4\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"99.2523\" xlink:href=\"#glyph0-2\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"101.7525\" xlink:href=\"#glyph0-5\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"103.7523\" xlink:href=\"#glyph0-6\" y=\"85.5\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"108.7581\" xlink:href=\"#glyph0-4\" y=\"85.5\"/>\n",
       "</g>\n",
       "<path d=\"M 161.800781 219.398438 L 177.800781 219.398438 L 177.800781 245.398438 L 161.800781 245.398438 Z M 161.800781 219.398438 \" style=\"fill-rule:nonzero;fill:rgb(39.99939%,74.902344%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 169.800781 219.398438 L 169.800781 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 169.800781 207.898438 L 169.800781 211.898438 M 168.300781 211.898438 L 169.800781 207.898438 L 171.300781 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 188.800781 219.398438 L 204.800781 219.398438 L 204.800781 245.398438 L 188.800781 245.398438 Z M 188.800781 219.398438 \" style=\"fill-rule:nonzero;fill:rgb(39.99939%,74.902344%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 196.800781 219.398438 L 196.800781 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 196.800781 207.898438 L 196.800781 211.898438 M 195.300781 211.898438 L 196.800781 207.898438 L 198.300781 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 135 145.105469 L 151 145.105469 L 151 175.105469 L 135 175.105469 Z M 135 145.105469 \" style=\"fill-rule:nonzero;fill:rgb(39.99939%,74.902344%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 143 175.105469 L 143 184.101562 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 143 188.101562 L 143 184.101562 M 144.5 184.101562 L 143 188.101562 L 141.5 184.101562 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 161.898438 145.105469 L 177.898438 145.105469 L 177.898438 175.105469 L 161.898438 175.105469 Z M 161.898438 145.105469 \" style=\"fill-rule:nonzero;fill:rgb(39.99939%,74.902344%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 188.800781 145.105469 L 204.800781 145.105469 L 204.800781 175.105469 L 188.800781 175.105469 Z M 188.800781 145.105469 \" style=\"fill-rule:nonzero;fill:rgb(39.99939%,74.902344%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 196.710938 175.105469 L 196.660156 184.101562 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 196.636719 188.101562 L 196.660156 184.101562 M 198.160156 184.109375 L 196.636719 188.101562 L 195.160156 184.089844 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 169.851562 175.105469 L 169.820312 184.101562 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 169.804688 188.101562 L 169.820312 184.101562 M 171.320312 184.105469 L 169.804688 188.101562 L 168.320312 184.09375 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 229 219.398438 L 245 219.398438 L 245 245.398438 L 229 245.398438 Z M 229 219.398438 \" style=\"fill-rule:nonzero;fill:rgb(69.804382%,85.098267%,100%);fill-opacity:1;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"16.7432\" xlink:href=\"#glyph0-7\" y=\"122.4\"/>\n",
       "  <use x=\"22.7462\" xlink:href=\"#glyph0-3\" y=\"122.4\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"27.752\" xlink:href=\"#glyph0-8\" y=\"122.4\"/>\n",
       "  <use x=\"32.252\" xlink:href=\"#glyph0-9\" y=\"122.4\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"154.4893\" xlink:href=\"#glyph0-10\" y=\"118.8959\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"161.4895\" xlink:href=\"#glyph0-11\" y=\"118.8959\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"166.4953\" xlink:href=\"#glyph0-3\" y=\"118.8959\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"171.5011\" xlink:href=\"#glyph0-12\" y=\"118.8959\"/>\n",
       "  <use x=\"174.4981\" xlink:href=\"#glyph0-8\" y=\"118.8959\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"9.5669\" xlink:href=\"#glyph0-13\" y=\"47.6041\"/>\n",
       "  <use x=\"14.9021\" xlink:href=\"#glyph0-14\" y=\"47.6041\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"19.9043\" xlink:href=\"#glyph0-15\" y=\"47.6041\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"21.9005\" xlink:href=\"#glyph0-11\" y=\"47.6041\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"26.9027\" xlink:href=\"#glyph0-3\" y=\"47.6041\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"31.9049\" xlink:href=\"#glyph0-9\" y=\"47.6041\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"155.0024\" xlink:href=\"#glyph0-16\" y=\"47.6041\"/>\n",
       "  <use x=\"162.0026\" xlink:href=\"#glyph0-11\" y=\"47.6041\"/>\n",
       "  <use x=\"167.0048\" xlink:href=\"#glyph0-2\" y=\"47.6041\"/>\n",
       "  <use x=\"169.505\" xlink:href=\"#glyph0-17\" y=\"47.6041\"/>\n",
       "  <use x=\"174.5072\" xlink:href=\"#glyph0-11\" y=\"47.6041\"/>\n",
       "  <use x=\"179.5094\" xlink:href=\"#glyph0-2\" y=\"47.6041\"/>\n",
       "</g>\n",
       "<path d=\"M 237 219.398438 L 237 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 237 207.898438 L 237 211.898438 M 235.5 211.898438 L 237 207.898438 L 238.5 211.898438 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 237.125 190 L 237.085938 181.003906 \" style=\"fill:none;stroke-width:1;stroke-linecap:round;stroke-linejoin:round;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<path d=\"M 237.070312 177.003906 L 237.085938 181.003906 M 235.585938 181.011719 L 237.070312 177.003906 L 238.585938 180.996094 \" style=\"fill:none;stroke-width:1;stroke-linecap:butt;stroke-linejoin:miter;stroke:rgb(0%,0%,0%);stroke-opacity:1;stroke-miterlimit:10;\" transform=\"matrix(1,0,0,1,-94,-115)\"/>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"46.398\" xlink:href=\"#glyph0-18\" y=\"13.8\"/>\n",
       "  <use x=\"53.895\" xlink:href=\"#glyph0-3\" y=\"13.8\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"58.9008\" xlink:href=\"#glyph0-19\" y=\"13.8\"/>\n",
       "  <use x=\"66.3978\" xlink:href=\"#glyph0-6\" y=\"13.8\"/>\n",
       "</g>\n",
       "<g style=\"fill:rgb(0%,0%,0%);fill-opacity:1;\">\n",
       "  <use x=\"71.4036\" xlink:href=\"#glyph0-12\" y=\"13.8\"/>\n",
       "  <use x=\"74.4006\" xlink:href=\"#glyph0-8\" y=\"13.8\"/>\n",
       "</g>\n",
       "</g>\n",
       "</svg>"
      ],
      "text/plain": [
       "<IPython.core.display.SVG object>"
      ]
     },
     "execution_count": 244,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "from IPython.display import SVG\n",
    "SVG('./img/attention.svg')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To compute the output, we first assume there is a score function $\\alpha$ which measures the similarity between the query and a key. Then we compute all $n$ scores $a_1, \\ldots, a_n$ by\n",
    "\n",
    "$$a_i = \\alpha(\\mathbf q, \\mathbf k_i).$$\n",
    "\n",
    "Next we use softmax to obtain the attention weights\n",
    "\n",
    "$$b_1, \\ldots, b_n = \\textrm{softmax}(a_1, \\ldots, a_n).$$\n",
    "\n",
    "The output is then a weighted sum of the values\n",
    "\n",
    "$$\\mathbf o = \\sum_{i=1}^n b_i \\mathbf v_i.$$\n",
    "\n",
     "Different choices of the score function lead to different attention layers. We will discuss two commonly used attention layers in the rest of this section. Before diving into their implementations, we first introduce a masked version of the softmax operator and explain the specialized batch dot product operator torch.bmm."
   ]
  },
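   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "As a sanity check, the three steps above can be sketched for a single query with plain tensor operations (a minimal sketch: the query, keys, and values are made-up numbers, and the score function is a plain dot product):\n",
     "\n",
     "```python\n",
     "import torch\n",
     "\n",
     "q = torch.tensor([1.0, 0.0])                        # query, d_q = 2\n",
     "keys = torch.tensor([[1.0, 0.0], [0.0, 1.0]])       # n = 2 keys\n",
     "values = torch.tensor([[0.0, 10.0], [10.0, 0.0]])   # n = 2 values\n",
     "\n",
     "scores = keys @ q                        # a_i = alpha(q, k_i)\n",
     "weights = torch.softmax(scores, dim=0)   # b_1, ..., b_n\n",
     "output = weights @ values                # o = sum_i b_i v_i\n",
     "print(output)\n",
     "```"
    ]
   },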
  {
   "cell_type": "code",
   "execution_count": 259,
   "metadata": {},
   "outputs": [],
   "source": [
    "import math\n",
    "import torch \n",
    "import torch.nn as nn"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The masked softmax takes a 3-dim input and allows us to filter out some elements by specifying valid lengths for the last dimension. (Refer to :numref:chapter_machine_translation for the definition of a valid length.)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 260,
   "metadata": {},
   "outputs": [],
   "source": [
     "def SequenceMask(X, X_len, value=0):\n",
     "    # X: 2-D tensor, X_len: 1-D tensor of valid lengths, one per row\n",
     "    maxlen = X.size(1)\n",
     "    mask = torch.arange(maxlen, dtype=torch.float,\n",
     "                        device=X.device)[None, :] < X_len[:, None]\n",
     "    X[~mask] = value\n",
     "    return X"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 261,
   "metadata": {},
   "outputs": [],
   "source": [
     "# Save to the d2l package.\n",
     "def masked_softmax(X, valid_length):\n",
     "    # X: 3-D tensor, valid_length: 1-D or 2-D tensor\n",
     "    if valid_length is None:\n",
     "        return torch.softmax(X, dim=-1)\n",
     "    shape = X.shape\n",
     "    if valid_length.dim() == 1:\n",
     "        # one valid length per example: broadcast it to every row\n",
     "        valid_length = valid_length.repeat_interleave(shape[1], dim=0)\n",
     "    else:\n",
     "        valid_length = valid_length.reshape((-1,))\n",
     "    # fill masked elements with a large negative value, whose exp is 0\n",
     "    X = SequenceMask(X.reshape((-1, shape[-1])), valid_length, value=-1e6)\n",
     "    return torch.softmax(X, dim=-1).reshape(shape)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Construct two examples, where each example is a 2-by-4 matrix, as the input. If we specify the valid lengths to be 2 for the first example and 3 for the second, then only the first two and the first three columns, respectively, are used to compute the softmax."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 262,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
        "tensor([[[0.7168, 0.2832, 0.0000, 0.0000],\n",
        "         [0.6120, 0.3880, 0.0000, 0.0000]],\n",
        "\n",
        "        [[0.3284, 0.3719, 0.2997, 0.0000],\n",
        "         [0.2595, 0.3789, 0.3616, 0.0000]]])"
      ]
     },
     "execution_count": 262,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "masked_softmax(torch.rand((2,2,4),dtype=torch.float), torch.FloatTensor([2,3]))"
   ]
  },
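   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "The masking trick itself can be checked independently with plain tensor operations: positions beyond the valid length are set to a large negative value before the softmax, so they receive (essentially) zero weight. This sketch does not use the functions above:\n",
     "\n",
     "```python\n",
     "import torch\n",
     "\n",
     "X = torch.tensor([[1.0, 2.0, 3.0, 4.0]])\n",
     "valid_len = 2\n",
     "mask = torch.arange(X.size(1)) < valid_len   # [True, True, False, False]\n",
     "weights = torch.softmax(X.masked_fill(~mask, -1e6), dim=-1)\n",
     "print(weights)   # all weight on the first two positions\n",
     "```"
    ]
   },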
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "The operator torch.bmm takes two inputs $X$ and $Y$ with shapes $(b, n, m)$ and $(b, m, k)$, respectively, and performs $b$ matrix multiplications, producing $Z$ with shape $(b, n, k)$ where $Z[i,:,:] = X[i,:,:]\\,Y[i,:,:]$ for $i = 1, \\ldots, b$."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 263,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[[3., 3.]],\n",
       "\n",
       "        [[3., 3.]]])"
      ]
     },
     "execution_count": 263,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "torch.bmm(torch.ones((2,1,3), dtype = torch.float), torch.ones((2,3,2), dtype = torch.float))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Dot Product Attention"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "The dot product attention assumes the query has the same dimension as the keys, namely $\\mathbf q, \\mathbf k_i \\in \\mathbb R^d$ for all $i$. It computes the score by an inner product between the query and a key, often divided by $\\sqrt{d}$ to make the scores less sensitive to the dimension $d$. In other words,\n",
     "\n",
     "$$\\alpha(\\mathbf q, \\mathbf k) = \\langle \\mathbf q, \\mathbf k \\rangle / \\sqrt{d}.$$\n",
     "\n",
     "Assume $\\mathbf Q \\in \\mathbb R^{m\\times d}$ contains $m$ queries and $\\mathbf K \\in \\mathbb R^{n\\times d}$ contains all $n$ keys. We can compute all $mn$ scores by\n",
     "\n",
     "$$\\alpha(\\mathbf Q, \\mathbf K) = \\mathbf Q \\mathbf K^T / \\sqrt{d}.$$\n",
     "\n",
     "Now let's implement this layer, which supports a batch of queries and key-value pairs. In addition, it supports randomly dropping some attention weights as regularization."
   ]
  },
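   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "The matrix form of the score function is easy to verify directly (a minimal sketch with made-up $\\mathbf Q$ and $\\mathbf K$):\n",
     "\n",
     "```python\n",
     "import math\n",
     "import torch\n",
     "\n",
     "m, n, d = 2, 3, 4\n",
     "Q = torch.ones((m, d))\n",
     "K = torch.ones((n, d))\n",
     "scores = Q @ K.t() / math.sqrt(d)   # alpha(Q, K) = Q K^T / sqrt(d)\n",
     "print(scores.shape)    # one score per query-key pair\n",
     "print(scores[0, 0])    # each entry is d / sqrt(d) = sqrt(d) = 2.0\n",
     "```"
    ]
   },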
  {
   "cell_type": "code",
   "execution_count": 305,
   "metadata": {},
   "outputs": [],
   "source": [
     "# Save to the d2l package.\n",
     "class DotProductAttention(nn.Module):\n",
     "    def __init__(self, dropout, **kwargs):\n",
     "        super(DotProductAttention, self).__init__(**kwargs)\n",
     "        self.dropout = nn.Dropout(dropout)\n",
     "\n",
     "    # query: (batch_size, #queries, d)\n",
     "    # key: (batch_size, #kv_pairs, d)\n",
     "    # value: (batch_size, #kv_pairs, dim_v)\n",
     "    # valid_length: either (batch_size, ) or (batch_size, #queries)\n",
     "    def forward(self, query, key, value, valid_length=None):\n",
     "        d = query.shape[-1]\n",
     "        # swap the last two dimensions of key to compute query @ key^T\n",
     "        scores = torch.bmm(query, key.transpose(1, 2)) / math.sqrt(d)\n",
     "        attention_weights = self.dropout(masked_softmax(scores, valid_length))\n",
     "        return torch.bmm(attention_weights, value)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Now we create two batches, where each batch has one query and 10 key-value pairs. We specify through valid_length that, for the first batch, we only pay attention to the first 2 key-value pairs, while for the second batch we attend to the first 6. Therefore, though both batches share the same query and key-value pairs, we obtain different outputs."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 306,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[[11.8325, 12.4488, 13.0650, 13.6813]],\n",
       "\n",
       "        [[ 6.5421,  6.8987,  7.2553,  7.6118]]])"
      ]
     },
     "execution_count": 306,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "atten = DotProductAttention(dropout=0.5)\n",
    "\n",
    "keys = torch.ones((2,10,2),dtype=torch.float)\n",
    "values = torch.arange((40), dtype=torch.float).view(1,10,4).repeat(2,1,1)\n",
    "atten(torch.ones((2,1,2),dtype=torch.float), keys, values, torch.FloatTensor([2, 6]))\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Multilayer Perceptron Attention"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "In multilayer perceptron attention, we first project both the query and the keys into $\\mathbb R^h$.\n",
     "\n",
     "To be more specific, assume learnable parameters $\\mathbf W_k \\in \\mathbb R^{h\\times d_k}$, $\\mathbf W_q \\in \\mathbb R^{h\\times d_q}$, and $\\mathbf v \\in \\mathbb R^{h}$. Then the score function is defined by\n",
     "\n",
     "$$\\alpha(\\mathbf k, \\mathbf q) = \\mathbf v^T \\tanh(\\mathbf W_k \\mathbf k + \\mathbf W_q \\mathbf q).$$\n",
     "\n",
     "Intuitively, this concatenates the key and the query in the feature dimension and feeds them into a perceptron with a single hidden layer of size $h$ and an output layer of size $1$. The hidden layer activation function is $\\tanh$ and no bias is applied."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 307,
   "metadata": {},
   "outputs": [],
   "source": [
     "# Save to the d2l package.\n",
     "class MLPAttention(nn.Module):\n",
     "    def __init__(self, units, dropout, **kwargs):\n",
     "        super(MLPAttention, self).__init__(**kwargs)\n",
     "        # The input feature size (2) is hard-coded for the examples below.\n",
     "        self.W_k = nn.Linear(2, units, bias=False)\n",
     "        self.W_q = nn.Linear(2, units, bias=False)\n",
     "        self.v = nn.Linear(units, 1, bias=False)\n",
     "        self.dropout = nn.Dropout(dropout)\n",
     "\n",
     "    def forward(self, query, key, value, valid_length):\n",
     "        query, key = self.W_q(query), self.W_k(key)\n",
     "        # expand query to (batch_size, #queries, 1, units) and key to\n",
     "        # (batch_size, 1, #kv_pairs, units), then sum them with broadcasting\n",
     "        features = query.unsqueeze(2) + key.unsqueeze(1)\n",
     "        scores = self.v(features).squeeze(-1)\n",
     "        attention_weights = self.dropout(masked_softmax(scores, valid_length))\n",
     "        return torch.bmm(attention_weights, value)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Despite the fact that MLPAttention contains an additional MLP model, given the same inputs with identical keys, it attends to the same key-value pairs as DotProductAttention does: because all keys are identical, the attention weights over the valid positions are uniform in both cases."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 308,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[[12.5374, 13.3251, 14.1128, 14.9006]],\n",
       "\n",
       "        [[21.3116, 22.4227, 23.5339, 24.6450]]], grad_fn=<BmmBackward>)"
      ]
     },
     "execution_count": 308,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "atten = MLPAttention(units = 8, dropout=0.1)\n",
    "atten(torch.ones((2,1,2), dtype = torch.float), keys, values, torch.FloatTensor([2, 6]))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Summary"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "- An attention layer explicitly selects related information.\n",
    "- An attention layer's memory consists of key-value pairs, so its output is close to the values whose keys are similar to the query."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
