{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "275067ee-11e1-4061-8fa0-6f035e5e1299",
   "metadata": {},
   "source": [
    "# **深度学习公开课 - 深度学习中的时间序列算法群**\n",
    "> 节选自《深度学习实战》第7期正课<br>\n",
    "> 作者：@菜菜TsaiTsai<br>\n",
    "> 版本号：2023/11/8<br>"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f2dc7203-2a75-4579-938c-d7a61464c975",
   "metadata": {
    "tags": []
   },
   "source": [
    "## 0 课程规划\n",
    "\n",
    "欢迎来到《深度学习中的时间序列算法群》公开课。在这门课程中，我将带你从0认识3大类深度学习中的时间序列模型，并为你深度讲解深度时序算法众多的的精彩理念与实现方式。当你完成这门课程时，你将完成深度时序算法入门，打好进一步学习更多高级架构的基础。\n",
    "\n",
    "**DAY 1：LSTM与深度学习中的时间序列**\n",
    "1. 深度学习中的时间序列数据\n",
    "2. 时序数据 vs 非时序数据\n",
    "3. 循环神经网络如何处理时序问题\n",
    "4. LSTM的灵感起源与直觉理解\n",
    "5. LSTM的基本结构与架构设计\n",
    "\n",
    "**DAY 2：LSTM的参数全解与预测实战**\n",
    "1. PyTorch中的LSTM层与参数\n",
    "2. LSTM类的输入与输出\n",
    "3. 股价与时间序列数据预测实战\n",
    "\n",
    "**DAY 3：时序进阶：时序卷积网络TCN**\n",
    "1. 1d卷积操作与时序数据\n",
    "2. 因果卷积 Causal Convolution\n",
    "3. 1d膨胀卷积与感受野扩张\n",
    "4. 时序卷积网络的架构与实现\n",
    "5. 深度学习中的5大类时序解决方案\n",
    "\n",
    "**DAY 4：Transformer与Attention**\n",
    "1. 序列模型的基本思路与根本诉求\n",
    "2. Attention注意力机制的计算流程\n",
    "3. Transformer架构的Encoder结构层<br>\n",
    "（限于公开课时间，下面的内容没来得及讲解 ↓ 将包含在正课当中）\n",
    "4. Transformer架构的Decoder结构层\n",
    "5. Transformer用于时间序列的相关问题\n",
    "\n",
    "**DAY 5：Transformer的PyTorch与Huggingface实践**\n",
    "1. Transformer的PyTorch参数\n",
    "2. 在PyTorch中调用Transformer架构\n",
    "3. Transformer实践各类NLP任务一览\n",
    "4. Huggingface部署安装与框架结构\n",
    "5. Huggingface中的bert调用与实现\n",
    "\n",
    "**DAY 6：Informer架构解析与时序应用**\n",
    "\n",
    "**DAY 7：深度时序SOTA架构TabNet**\n",
    "\n",
    "更多后续课程请关注B站动态和小可爱私聊信息！"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "59fa2107-5ec4-4568-abdc-ce8dac06cb97",
   "metadata": {},
   "source": [
    "## 1 PyTorch中的Transformer"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "60331e8a-0e4d-4bbd-bb1b-a648bda5d76e",
   "metadata": {},
   "source": [
    "在之前的课程当中，我们已经认识了PyTorch框架的基本结构，整个PyTorch框架可以大致被分Torch和成熟AI领域两大板块，其中Torch包含各类神经网络组成元素、用于构建各类神经网络，各类AI领域中则包括Torchvision、Torchtext、Torchaudio等辅助完成图像、文字、语音方面各类任务的领域模块。\n",
    "\n",
    "在PyTorch中，Transformer算法是属于“构建循环神经网络的元素”，而非“成熟神经网络”，因此Transformer是位于PyTorch.nn这个基本模块下。为什么PyTorch中的Transformer结构是位于nn，而不是属于成熟神经网络呢？**事实上，在PyTorch中并没有完整的Transformer架构，只有用于构建Transformer的各个层**。我们一起来看一下。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "dc1338dc-f8f0-414b-b7e0-a807e13c99b1",
   "metadata": {},
   "source": [
    "<center><img src=\"https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/image-1.png\" alt=\"描述文字\" width=\"400\">"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b2340d3a-4d73-4477-b1c2-3cb0be5c49c1",
   "metadata": {},
   "source": [
    "在torch.nn模块下，存在**服务于Transformer架构的各类神经网络层和模型**，我们来看一下——\n",
    "\n",
    "- **nn.Transformer**\n",
    "\n",
    "`nn.Transformer`封装了Transformer中的包含编码器（Encoder）和解码器（Decoder）。如下图所示，它对Encoder和Decoder两部分的包装，它并没有实现输入中的Embedding和Positional Encoding和最后输出的Linear+Softmax部分。\n",
    "\n",
    "<center><img src=\"https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/image-37.png\" alt=\"描述文字\" width=\"400\">"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "904e79a1-f538-4ec6-89cd-391a47117e55",
   "metadata": {},
   "source": [
    "- **分割的编码器与解码器**\n",
    "\n",
    "`nn.TransformerEncoderLayer`与`nn.TransformerDecoderLayer`: 这两个类表示Transformer编码器的单层和解码器的单层。它包含了自注意力机制（self-attention）和前馈网络（feedforward network），以及必要的归一化和残差连接。\n",
    "\n",
    "`nn.TransformerEncoder`与`nn.TransformerDecoder`: 这两个类是Transformer编码器的实现和解码器的实现，其中`nn.TransformerEncoder`包含了多个nn.TransformerEncoderLayer层的堆叠，`nn.TransformerDecoder`包含了多个nn.TransformerDecoderLayer层的堆叠。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e12f5df6-dc81-4d09-864d-bae16613022f",
   "metadata": {},
   "source": [
    "<center><img src=\"https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/image-38.png\" alt=\"描述文字\" width=\"800\">"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2a926dc4-5956-4102-bbd0-27861229f64f",
   "metadata": {},
   "source": [
    "除此之外，我们还有：\n",
    "\n",
    "`nn.MultiheadAttention`: 这个模块实现了多头注意力机制，这是Transformer模型的核心组件之一。多头注意力允许模型在不同的位置同时处理来自序列不同部分的信息，这有助于捕捉序列内的复杂依赖关系。\n",
    "\n",
    "`nn.LayerNorm`: 层归一化（Layer Normalization）通常用在Transformer的各个子层的输出上，有助于稳定训练过程，并且提高了训练的速度和效果。\n",
    "\n",
    "`nn.PositionalEncoding`: 虽然PyTorch的nn模块中没有这个类，但在使用Transformer时通常会添加一个位置编码层来给模型提供关于词汇在序列中位置的信息，因为Transformer本身不具备处理序列顺序的能力。\n",
    "\n",
    "`nn.Embedding`：一个预训练好的语义空间，它将每个标记（如单词、字符等）映射到一个高维空间的向量。这使得模型能够处理文本数据，并为每个唯一的标记捕获丰富的语义属性。嵌入层通常是自然语言处理模型的第一层，用于将离散的文本数据转化为连续的向量表示。其输入是索引列表，输出是对应的嵌入向量。\n",
    "\n",
    "`nn.Transformer.generate_square_subsequent_mask`：掩码函数。用于生成一个方形矩阵，用作Transformer模型中自注意力机制的上三角遮罩。这个遮罩确保在序列生成任务中，例如语言模型中，任何给定的元素只会考虑到序列中先于它的元素（即它只能看到过去的信息，不能看到未来的信息）。这种掩码通常在解码器部分使用，防止在预测下一个输出时“作弊”。具体来说，该函数创建了一个方阵，其中对角线及其以下的元素为0（表示可以“看到”这些位置的元素），其余元素为负无穷大（在softmax之前应用，表示位置被屏蔽，不应该有注意力权重）。"
   ]
  },
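  {
   "cell_type": "markdown",
   "id": "3f8d1a20-0001-4f2a-8abc-000000000001",
   "metadata": {},
   "source": [
    "A quick sketch of the first two of these modules (the sizes here are invented for illustration): `nn.Embedding` turns a batch of token indices into vectors, and `nn.LayerNorm` normalizes each vector over its last dimension:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "\n",
    "emb = nn.Embedding(num_embeddings=100, embedding_dim=8)  # vocabulary of 100, 8-dim vectors\n",
    "ids = torch.tensor([[1, 5, 9], [2, 0, 7]])               # (batch=2, seq_len=3) token indices\n",
    "vectors = emb(ids)                                        # (2, 3, 8)\n",
    "\n",
    "norm = nn.LayerNorm(8)                                    # normalize over the embedding dimension\n",
    "normed = norm(vectors)                                    # same shape, normalized per vector\n",
    "print(vectors.shape, normed.shape)\n",
    "```"
   ]
  },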
  {
   "cell_type": "code",
   "execution_count": 72,
   "id": "496ea6cd-4279-4847-9368-f626497a6e3b",
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch.nn as nn"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 75,
   "id": "24530d53-3498-40e3-ad6c-e4efc5f8cc87",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[0., -inf, -inf, -inf, -inf],\n",
       "        [0., 0., -inf, -inf, -inf],\n",
       "        [0., 0., 0., -inf, -inf],\n",
       "        [0., 0., 0., 0., -inf],\n",
       "        [0., 0., 0., 0., 0.]])"
      ]
     },
     "execution_count": 75,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "nn.Transformer.generate_square_subsequent_mask(5) # 5指的是target的维度"
   ]
  },
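  {
   "cell_type": "markdown",
   "id": "3f8d1a20-0002-4f2a-8abc-000000000002",
   "metadata": {},
   "source": [
    "`nn.MultiheadAttention` can be tried out just as directly (the dimensions below are made up for illustration). With `batch_first=True`, passing the same tensor as query, key and value performs self-attention, and by default the returned attention weights are averaged over the heads:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "\n",
    "mha = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)\n",
    "x = torch.rand(2, 5, 16)            # (batch, seq_len, embed_dim)\n",
    "out, attn_weights = mha(x, x, x)    # query = key = value, i.e. self-attention\n",
    "print(out.shape)                    # (2, 5, 16)\n",
    "print(attn_weights.shape)           # (2, 5, 5), averaged over the 4 heads\n",
    "```"
   ]
  },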
  {
   "cell_type": "markdown",
   "id": "5285d78b-f967-4013-9473-1cc40a00d4ec",
   "metadata": {},
   "source": [
    "不难发现，torch.nn下面配置了一系列构成transformer的元素，但是却没有构成完整的Transformer算法。这是为什么呢？事实上，在NLP的世界中，不同的任务会对Transformer架构提出不同的要求，编码器和解码器是设计成可以独立或一起使用的组件。它们可以根据不同的NLP任务需求进行组合，以适应各种场景。\n",
    "\n",
    "- **只使用编码器的任务**\n",
    "\n",
    "编码器部分的任务是从输入数据中提取特征。编码器通常用于不需要生成新文本序列的任务，比如：\n",
    "\n",
    "> 文本分类：如情感分析，垃圾邮件检测等，输入一个文本序列，编码器提取特征后进行分类。\n",
    "\n",
    "> 命名实体识别（Named Entity Recognition, NER）：在给定文本中识别出实体（如人名、地点等），这也是分类问题的一种，可以用编码器提取文本的特征。\n",
    "\n",
    "> 句子相似度：判断两个句子是否相关或相似度如何，可以通过编码器提取句子特征后计算相似度。\n",
    "\n",
    "- **只使用解码器的任务**\n",
    "\n",
    "解码器部分专注于生成新文本。通常使用自回归方式，基于之前的输出生成下一个词。适用于：\n",
    "\n",
    "> 文本生成：如GPT系列，只使用解码器生成文本，例如故事续写、文章生成等。\n",
    "\n",
    "> 文本续写或预测：训练解码器来预测下一个单词或字符。\n",
    "\n",
    "- **编码器和解码器都使用的任务**：\n",
    "\n",
    "当任务涉及到理解输入序列并生成新的输出序列时，编码器和解码器会联合使用，比如：\n",
    "\n",
    "> 机器翻译：编码器负责理解源语言，解码器负责生成目标语言。\n",
    "\n",
    "> 文本摘要：编码器理解原始文本，解码器生成摘要。\n",
    "\n",
    "> 问答系统：编码器理解给定的问题和背景材料，解码器生成答案。\n",
    "\n",
    "这种分离使用编码器或解码器的能力使得Transformer架构极为灵活，可以通过调整来适应各种不同的NLP任务需求。而具体的架构选择（编码器、解码器或两者兼用）取决于我们试图解决的问题类型。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "6b23c29f-f237-48c5-b8bb-5b6499d38a77",
   "metadata": {},
   "source": [
    "- **nn.Transformer的参数构造详解**"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d2d3fd4a-b825-462f-b254-6e13937390a9",
   "metadata": {},
   "source": [
    "在nn.Transformer中，我们有如下的参数可以使用："
   ]
  },
  {
   "cell_type": "markdown",
   "id": "9f121ccb-ca4b-4f71-9f1e-1ef0a3614d30",
   "metadata": {},
   "source": [
    "| 参数名| 解释 |\n",
    "| ---- | ---- |\n",
    "| d_model| Encoder和Decoder输入参数的特征维度。也就是词向量的维度。默认为512 |\n",
    "| nhead| 多头注意力机制中，head的数量。默认值为8。 |\n",
    "| num_encoder_layers | TransformerEncoderLayer的数量。该值越大，网络越深，网络参数量越多，计算量越大。默认值为6 |\n",
    "| num_decoder_layers | TransformerDecoderLayer的数量。该值越大，网络越深，网络参数量越多，计算量越大。默认值为6 |\n",
    "| dim_feedforward | Feed Forward层（Attention后面的全连接网络）的隐藏层的神经元数量。该值越大，网络参数量越多，计算量越大。默认值为2048 |\n",
    "| dropout | 内dropout值。默认值为0.1 |\n",
    "| activation | 内Feed Forward层的激活函数。取值可以是string(“relu” or “gelu”)或者一个一元可调用的函数。默认值是relu |\n",
    "| custom_encoder | 自定义Encoder。若你不想用官方实现的TransformerEncoder，你可以自己实现一个。默认值为None |\n",
    "| custom_decoder | 自定义Decoder。若你不想用官方实现的TransformerDecoder，你可以自己实现一个。 |\n",
    "| layer_norm_eps| Add&Norm层中，BatchNorm的eps参数值。默认为1e-5|\n",
    "| batch_first | batch维度是否是第一个。如果为True，则输入的shape应为(batch_size, 词数，词向量维度)，否则应为(词数, batch_size, 词向量维度)。默认为False。这个要特别注意，因为大部分人的习惯都是将batch_size放在最前面，而这个参数的默认值又是False，所以会报错。|\n",
    "| norm_first | 是否要先执行norm。例如，在图中的执行顺序为 Attention -> Add -> Norm。若该值为True，则执行顺序变为：Norm -> Attention -> Add。|"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "7ed3716a-2679-421d-95fc-5f3344670d8d",
   "metadata": {},
   "source": [
    "<center><img src=\"https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/image-1.png\" alt=\"描述文字\" width=\"400\">"
   ]
  },
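  {
   "cell_type": "markdown",
   "id": "3f8d1a20-0003-4f2a-8abc-000000000003",
   "metadata": {},
   "source": [
    "The effect of `batch_first` is easiest to see in code. Below is a small sketch (with deliberately tiny, made-up hyperparameters so it runs fast): with `batch_first=True`, both inputs are laid out as (batch_size, seq_len, d_model), and the output takes on the shape of `tgt`:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "\n",
    "model = nn.Transformer(d_model=16, nhead=2, num_encoder_layers=1,\n",
    "                       num_decoder_layers=1, dim_feedforward=32,\n",
    "                       batch_first=True)\n",
    "src = torch.rand(4, 10, 16)  # (batch_size, src_len, d_model)\n",
    "tgt = torch.rand(4, 7, 16)   # (batch_size, tgt_len, d_model)\n",
    "out = model(src, tgt)\n",
    "print(out.shape)             # (4, 7, 16): same layout as tgt\n",
    "```"
   ]
  },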
  {
   "cell_type": "code",
   "execution_count": 76,
   "id": "36573dcf-85de-4360-a3af-c38d41ef8cc3",
   "metadata": {
    "collapsed": true,
    "jupyter": {
     "outputs_hidden": true
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([[[ 6.7511e-02,  1.1720e+00,  1.2293e-01,  ...,  7.8842e-01,\n",
      "           1.7112e-01,  9.6802e-01],\n",
      "         [-7.9600e-01,  9.0369e-02, -2.8433e-01,  ...,  8.8442e-01,\n",
      "           4.5074e-01,  1.2281e+00],\n",
      "         [-5.1751e-01, -1.5499e-01, -6.7926e-01,  ...,  1.7317e-01,\n",
      "           7.3845e-01,  1.1859e+00],\n",
      "         ...,\n",
      "         [-5.6413e-01,  8.6297e-02,  2.1021e-01,  ...,  2.6536e-01,\n",
      "           5.3328e-01,  1.0273e+00],\n",
      "         [ 4.5727e-01, -1.8545e-01,  8.5030e-01,  ...,  4.5394e-01,\n",
      "           1.2406e+00,  7.0520e-04],\n",
      "         [-5.8062e-01, -3.7102e-01,  3.1228e-01,  ..., -1.8074e-02,\n",
      "          -9.9500e-01,  3.6074e-01]],\n",
      "\n",
      "        [[-3.6611e-01, -1.4964e-01, -5.7366e-01,  ...,  1.1742e+00,\n",
      "           4.5474e-01,  1.5793e+00],\n",
      "         [-7.1758e-01, -1.1414e-01, -1.4992e-01,  ...,  9.3404e-01,\n",
      "           6.8837e-01,  1.4203e-01],\n",
      "         [-1.1819e+00, -1.3623e-01, -8.2696e-01,  ...,  1.1469e+00,\n",
      "           1.5560e+00,  1.9745e+00],\n",
      "         ...,\n",
      "         [ 3.4972e-01, -3.0005e-01,  5.2432e-01,  ...,  7.8982e-03,\n",
      "           1.1331e+00,  6.4621e-01],\n",
      "         [-1.4351e-01, -4.2258e-01,  9.0307e-02,  ...,  6.0866e-01,\n",
      "           5.2210e-01,  3.7324e-01],\n",
      "         [-1.5250e-01, -3.1615e-01,  1.9450e-01,  ...,  4.0237e-01,\n",
      "          -1.3889e-01,  4.9972e-02]],\n",
      "\n",
      "        [[-4.8917e-01,  2.8146e-01, -7.0046e-03,  ...,  4.1752e-01,\n",
      "           3.8535e-01,  3.7363e-01],\n",
      "         [-2.0414e-01,  5.2099e-01, -9.3748e-01,  ...,  1.2865e+00,\n",
      "           7.7141e-01,  1.1435e+00],\n",
      "         [-7.2134e-01,  2.2398e-01, -5.5931e-01,  ...,  1.0224e+00,\n",
      "           4.2697e-01,  7.4842e-01],\n",
      "         ...,\n",
      "         [ 2.4368e-01,  4.3303e-01,  4.9406e-01,  ..., -2.3506e-01,\n",
      "           1.2114e+00,  1.1117e+00],\n",
      "         [-9.8455e-02, -1.8124e-01,  4.6945e-02,  ...,  6.6501e-01,\n",
      "           8.1316e-01,  1.1048e+00],\n",
      "         [ 2.3373e-01,  3.8396e-01,  6.7586e-04,  ...,  1.5055e+00,\n",
      "           7.1205e-01,  3.3665e-01]],\n",
      "\n",
      "        ...,\n",
      "\n",
      "        [[-2.4401e-01,  9.3574e-01, -5.1521e-01,  ...,  4.5884e-01,\n",
      "           3.9784e-01, -3.0925e-02],\n",
      "         [-9.8026e-01, -2.3066e-02,  4.4282e-01,  ...,  8.5599e-01,\n",
      "           1.2696e+00,  8.2665e-01],\n",
      "         [-7.1308e-01, -7.3256e-01, -5.3670e-01,  ..., -1.9509e-01,\n",
      "           1.8948e+00,  1.4215e+00],\n",
      "         ...,\n",
      "         [-4.8203e-01,  3.4399e-01,  3.0968e-01,  ...,  1.1069e-01,\n",
      "          -1.3231e-01,  3.9003e-01],\n",
      "         [-8.1770e-02,  3.7546e-01,  6.0284e-01,  ...,  6.9671e-01,\n",
      "           1.3995e+00,  6.6472e-01],\n",
      "         [ 7.1869e-02, -1.1225e+00, -1.7045e-01,  ...,  1.3524e+00,\n",
      "           7.6455e-01, -5.9608e-01]],\n",
      "\n",
      "        [[ 1.8410e-01,  7.0883e-03,  4.9071e-01,  ...,  8.9709e-01,\n",
      "           7.7747e-01, -2.7418e-01],\n",
      "         [-2.4630e-01, -1.0426e-01,  1.8499e-01,  ...,  1.5420e+00,\n",
      "           9.5686e-01,  6.1427e-01],\n",
      "         [-3.1793e-01,  2.7415e-01, -6.1382e-01,  ...,  4.7299e-01,\n",
      "           7.8670e-01,  1.0000e+00],\n",
      "         ...,\n",
      "         [-4.0324e-01, -1.0399e-01,  1.4032e-01,  ...,  5.5401e-02,\n",
      "           1.8832e+00,  9.1992e-01],\n",
      "         [-1.5993e-02, -2.7901e-01,  7.9123e-01,  ...,  4.7693e-01,\n",
      "           1.1367e+00,  7.6699e-01],\n",
      "         [-2.4011e-01, -3.3415e-03,  3.2870e-01,  ...,  1.4946e+00,\n",
      "           1.8623e-01,  6.8294e-01]],\n",
      "\n",
      "        [[-8.4034e-02,  6.2343e-01,  1.9874e-01,  ...,  1.1997e+00,\n",
      "           2.5350e-01,  6.3979e-01],\n",
      "         [ 3.1284e-01, -6.1615e-01,  3.8653e-02,  ...,  8.9448e-01,\n",
      "           1.0150e+00,  4.3111e-01],\n",
      "         [-8.1943e-01,  8.1678e-01, -9.7485e-01,  ...,  1.3912e+00,\n",
      "           1.0981e+00,  1.5789e+00],\n",
      "         ...,\n",
      "         [-2.6207e-01,  8.0260e-02,  7.4601e-01,  ..., -5.1185e-02,\n",
      "           8.5184e-01,  5.6040e-01],\n",
      "         [-3.3620e-01, -1.9009e-01,  5.4211e-01,  ..., -2.5744e-01,\n",
      "           1.0852e+00,  4.1671e-01],\n",
      "         [-5.2090e-01, -9.6469e-01,  5.1191e-02,  ...,  1.3300e+00,\n",
      "           8.2232e-01,  6.0515e-01]]], device='cuda:0',\n",
      "       grad_fn=<NativeLayerNormBackward0>)\n"
     ]
    }
   ],
   "source": [
    "import torch\n",
    "from torch.nn import Transformer\n",
    "import torch.nn as nn\n",
    "\n",
    "# 设置设备\n",
    "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
    "\n",
    "# Transformer 参数\n",
    "d_model = 512  # Embedding size\n",
    "nhead = 8  # 头的数量\n",
    "num_encoder_layers = 6  # 编码器层的数量\n",
    "num_decoder_layers = 6  # 解码器层的数量\n",
    "dim_feedforward = 2048  # 前馈网络模型的尺寸\n",
    "dropout = 0.1  # Dropout 比例\n",
    "\n",
    "# 实例化 nn.Transformer 模型\n",
    "transformer_model = Transformer(d_model, nhead, num_encoder_layers, num_decoder_layers, dim_feedforward, dropout).to(device)\n",
    "\n",
    "# 示例输入数据 (seq_len, batch_size, d_model)\n",
    "src = torch.rand(10, 32, d_model).to(device)  # 编码器输入\n",
    "tgt = torch.rand(10, 32, d_model).to(device)  # 解码器输入\n",
    "src_mask = transformer_model.generate_square_subsequent_mask(src.size(0)).to(device)\n",
    "tgt_mask = transformer_model.generate_square_subsequent_mask(tgt.size(0)).to(device)\n",
    "\n",
    "# 正向传播\n",
    "output = transformer_model(src, tgt, src_key_padding_mask=None, tgt_key_padding_mask=None, memory_key_padding_mask=None)\n",
    "print(output)  # 输出结果 (seq_len, batch_size, d_model)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1dcf1334-c8aa-448e-a688-395eb705602c",
   "metadata": {},
   "source": [
    "## 2 Transformer在传统NLP领域的应用\n",
    "\n",
    "Transformer的模块化构件，使得它可以拆分使用或者“魔改”自由拼装。通常不同的NLP任务上会使用Transformer架构的不同部分，很多更优的模型也是在Transformer的基础上调整改造。\n",
    "我们只需要理解：**Encoder通常用于提取特征，Decoder通常用于生成内容**。\n",
    "\n",
    "* 同时使用encoder和decoder：机器翻译、文本摘要\n",
    "* 只使用encoder：文本分类（bert）、问答系统、实体命名识别、填充任务\n",
    "* 只使用decoder：文本生成（GPT）\n",
    "\n",
    "以下代码提供了各个任务的基础架构，但为了完全实现这些模型，还需要包括数据预处理、mask的创建、训练循环、损失函数计算、优化器配置等。这些代码片段主要是为了展示不同任务中 nn.Transformer 结构的不同使用方式。实际使用时，每个任务还需要细化和调整以适应具体的需求和数据集。在正式的《深度学习实战》第七期课程当中，我们将详细讲解这些内容。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "766f9a8c-f732-4f2a-baa5-9da6cbe0ff42",
   "metadata": {
    "tags": []
   },
   "source": [
    "### 2.1 机器翻译任务\n",
    "同时使用Transformer的encoder和decoder。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e1ec0fca-b679-4416-852c-89453214ec1d",
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch\n",
    "from torch import nn\n",
    "from torch.nn import Transformer\n",
    "\n",
    "# 假设一些超参数和数据预处理已经完成\n",
    "src_vocab_size = 1000  # 源语言词汇量\n",
    "tgt_vocab_size = 1000  # 目标语言词汇量\n",
    "embedding_size = 512  # 嵌入维度\n",
    "nhead = 8  # 多头注意力中的头数\n",
    "num_encoder_layers = 6  # 编码器层数\n",
    "num_decoder_layers = 6  # 解码器层数\n",
    "dropout = 0.1  # dropout比率\n",
    "\n",
    "class TranslationModel(nn.Module):\n",
    "    def __init__(self, src_vocab_size, tgt_vocab_size, embedding_size, nhead, num_encoder_layers, num_decoder_layers, dropout):\n",
    "        super(TranslationModel, self).__init__()\n",
    "        self.transformer = Transformer(d_model=embedding_size, nhead=nhead, num_encoder_layers=num_encoder_layers, num_decoder_layers=num_decoder_layers, dropout=dropout)\n",
    "        self.src_tok_emb = nn.Embedding(src_vocab_size, embedding_size)\n",
    "        self.tgt_tok_emb = nn.Embedding(tgt_vocab_size, embedding_size)\n",
    "        self.positional_encoding = nn.Parameter(torch.zeros(1, 5000, embedding_size))  # 假定最大句子长度为 5000\n",
    "        self.generator = nn.Linear(embedding_size, tgt_vocab_size)\n",
    "        \n",
    "    #----必填参数----\n",
    "    #src：Encoder的输入。也就是将token进行Embedding和Positional Encoding之后的tensor。必填参数。Shape为(batch size，sequence length，embedding size)\n",
    "    #tgt:与src同理，Decoder的输入。 必填参数。Shape为(sequence length，embedding size)\n",
    "    \n",
    "    #----可选参数----\n",
    "    #src_mask：对src进行mask。不常用。Shape为(sequence length, sequence length)\n",
    "    #tgt_mask：对tgt进行mask。常用。Shape为(sequence length, sequence length) \n",
    "    #src_key_padding_mask：对src的token进行mask. 常用。Shape为(batch size, sequence length) \n",
    "    #tgt_key_padding_mask：对tgt的token进行mask。常用。Shape为(batch size, sequence length) \n",
    "    \n",
    "    def forward(self, src, tgt, src_mask = None, tgt_mask = None, memory_mask = None, src_key_padding_mask = None, tgt_key_padding_mask = None,memory_key_padding_mask = None):\n",
    "        src_emb = self.src_tok_emb(src) + self.positional_encoding[:, :src.size(1)]\n",
    "        tgt_emb = self.tgt_tok_emb(tgt) + self.positional_encoding[:, :tgt.size(1)]\n",
    "        outs = self.transformer(src_emb, tgt_emb, src_mask, tgt_mask, None, src_key_padding_mask, tgt_key_padding_mask, memory_key_padding_mask)\n",
    "        return self.generator(outs)\n",
    "\n",
    "# 假定 src 和 tgt 已经是预处理并被转换成 tensor 的索引\n",
    "# src = [batch_size, src_len]\n",
    "# tgt = [batch_size, tgt_len]\n",
    "\n",
    "model = TranslationModel(src_vocab_size, tgt_vocab_size, embedding_size, nhead, num_encoder_layers, num_decoder_layers, dropout)\n",
    "\n",
    "# 接下来你需要定义mask和padding mask，并且运行模型\n",
    "# 比如 src_mask, tgt_mask, src_padding_mask, tgt_padding_mask, memory_key_padding_mask\n",
    "# 输出会是一个[batch_size, tgt_len, tgt_vocab_size]的tensor"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "8d396d7a-d434-41cb-b74b-1d781182ba2e",
   "metadata": {},
   "source": [
    "### 2.2 文本分类\n",
    "文本分类通常只使用encoder部分，我们以此为例看一下encoder部分如何单独使用。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "41704bdb-1a1a-4b97-99fd-aafbdb0ea650",
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch\n",
    "from torch import nn\n",
    "from torch.nn import TransformerEncoder, TransformerEncoderLayer\n",
    "\n",
    "# 假设超参数和数据预处理已经完成\n",
    "vocab_size = 1000  # 词汇量\n",
    "embedding_size = 512  # 嵌入维度\n",
    "nhead = 8  # 多头注意力的头数\n",
    "num_encoder_layers = 6  # 编码器层数\n",
    "dropout = 0.1  # dropout比率\n",
    "num_classes = 10  # 类别数\n",
    "\n",
    "class TextClassificationModel(nn.Module):\n",
    "    def __init__(self, vocab_size, embedding_size, nhead, num_encoder_layers, dropout, num_classes):\n",
    "        super(TextClassificationModel, self).__init__()\n",
    "        #只使用了Transformer的Encoder部分\n",
    "        encoder_layers = TransformerEncoderLayer(d_model=embedding_size, nhead=nhead, dropout=dropout)\n",
    "        self.transformer_encoder = TransformerEncoder(encoder_layer=encoder_layers, num_layers=num_encoder_layers)\n",
    "\n",
    "        self.embedding = nn.Embedding(vocab_size, embedding_size)\n",
    "        self.positional_encoding = nn.Parameter(torch.zeros(1, 5000, embedding_size))  # 假定最大句子长度为 5000\n",
    "        self.linear = nn.Linear(embedding_size, num_classes)\n",
    "        \n",
    "    #----必填参数----\n",
    "    #src：Encoder的输入。也就是将token进行Embedding和Positional Encoding之后的tensor。必填参数。Shape为(batch size，sequence length，embedding size)\n",
    "    \n",
    "    #----可选参数----\n",
    "    #src_mask：对src进行mask。不常用。Shape为(sequence length, sequence length)\n",
    "    #src_key_padding_mask：对src的token进行mask. 常用。Shape为(batch size, sequence length) \n",
    "\n",
    "    def forward(self, src, src_mask, src_padding_mask):\n",
    "        src_emb = self.embedding(src) + self.positional_encoding[:, :src.size(1)]\n",
    "        encoded_src = self.transformer_encoder(src_emb, src_mask, src_padding_mask)\n",
    "        # 假设我们只关心输入序列的第一个元素\n",
    "        return self.linear(encoded_src[:, 0])\n",
    "\n",
    "# 假定 src 已经是预处理并被转换成 tensor 的索引\n",
    "# src = [batch_size, src_len]\n",
    "\n",
    "model = TextClassificationModel(vocab_size, embedding_size, nhead, num_encoder_layers, dropout, num_classes)\n",
    "\n",
    "# 接下来你需要定义mask和padding mask，并且运行模型\n",
    "# 比如 src_mask, src_padding_mask\n",
    "# 输出会是一个[batch_size, num_classes]的tensor"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1fcf840d-c843-4a1e-9890-2bffa06ce7f3",
   "metadata": {},
   "source": [
    "对于文本分类的任务只使用encoder部分，通常会在encoder后直接连接全连接层，将编码器的输出转换为最终的分类标签。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b2f6ff1a-5fd6-4421-a44f-90b7015a235b",
   "metadata": {},
   "source": [
    "BERT（Bidirectional Encoder Representations from Transformers，Transformers双向编码表示）就是只使用Transformer的encoder部分的重要例子。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a9d6a758-07e1-4e14-80ee-1ebb595e0bf0",
   "metadata": {},
   "source": [
    "### 2.3 类似GPT的文本生成式模型\n",
    "目前AI前沿的GPT大家都知道它是基于Transformer的，可是它是具体如何实现的呢？我们来看一下简单的文本生成式模型是什么样的结构。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b937cb56-e3f6-4e8e-a224-da6c5a6fbcfd",
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch\n",
    "from torch import nn\n",
    "\n",
    "class GPTLikeModel(nn.Module):\n",
    "    def __init__(self, vocab_size, embedding_size, nhead, num_decoder_layers, dropout):\n",
    "        super(GPTLikeModel, self).__init__()\n",
    "        self.embedding_size = embedding_size\n",
    "        self.embeddings = nn.Embedding(vocab_size, embedding_size)\n",
    "        self.positional_encodings = nn.Parameter(torch.zeros(1, 1024, embedding_size))\n",
    "        \n",
    "        #只使用了Transformer的Decoder部分\n",
    "        self.transformer_decoder_layer = nn.TransformerDecoderLayer(d_model=embedding_size,  nhead=nhead, dropout=dropout)\n",
    "        self.transformer_decoder = nn.TransformerDecoder(self.transformer_decoder_layer, num_layers=num_decoder_layers)\n",
    "\n",
    "        self.output_layer = nn.Linear(embedding_size, vocab_size)\n",
    "    \n",
    "    #生成上三角矩阵掩码\n",
    "    def generate_square_subsequent_mask(self, sz):\n",
    "        mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)\n",
    "        mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))\n",
    "        return mask\n",
    "    \n",
    "    #----必填参数---- \n",
    "    #tgt:与src同理，Decoder的输入。 必填参数。也就是将token进行Embedding和Positional Encoding之后的tensor。Shape为(batch size，sequence length，embedding size)\n",
    "    #----可选参数----\n",
    "    #tgt_mask：对tgt进行mask。常用。Shape为(sequence length, sequence length) \n",
    "    #tgt_key_padding_mask：对tgt的token进行mask。常用。Shape为(batch size, sequence length) \n",
    "    #memory_key_padding_mask：对tgt的token进行mask。不常用。Shape为(batch size, sequence length) \n",
    "    def forward(self, tgt, memory_key_padding_mask):\n",
    "        input_mask = self.generate_square_subsequent_mask(input.size(0)).to(input.device)\n",
    "        embeddings = self.embeddings(input) * math.sqrt(self.embedding_size)\n",
    "        embeddings += self.positional_encodings[:, :tgt.size(0), :]\n",
    "        transformer_output = self.transformer_decoder(embeddings, memory, tgt_mask=tgt_mask)\n",
    "        output = self.output_layer(transformer_output)\n",
    "        return output\n",
    "\n",
    "    def generate(self, input, max_length):\n",
    "        memory = None\n",
    "        generated = input\n",
    "        for _ in range(max_length):\n",
    "            \n",
    "            #进行一个字的生成\n",
    "            output = self.forward(generated, memory)\n",
    "            \n",
    "            #只取最后一个时间步的输出，包含所有可能生成的字的概率\n",
    "            next_token_logits = output[-1, :, :]  \n",
    "            \n",
    "            #使用argmax取出概率最大的字\n",
    "            next_token = torch.argmax(next_token_logits, dim=-1).unsqueeze(0) \n",
    "            \n",
    "            #合并生成的字与之前的文本\n",
    "            generated = torch.cat((generated, next_token), dim=0)\n",
    "            if next_token.item() == eos_token_id:  # 假设eos_token_id是结束标记的ID\n",
    "                break\n",
    "        return generated"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "c11ae03a-f004-44b7-bbba-0ee106f9e02a",
   "metadata": {},
   "source": [
    "生成式任务中，会多一部分generate用于生成文本序列。\n",
    "\n",
    "每次循环生成一个时间步的输出。在每个时间步，调用 forward 方法，传递当前生成的序列 generated 作为下一次的输入，并得到下一个时间步的预测结果 output。\n",
    "\n",
    "然后，从预测结果中选择概率最高的词作为下一个时间步的输入，并将其添加到生成的序列 generated 中。"
   ]
  },
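  {
   "cell_type": "markdown",
   "id": "3f8d1a20-0004-4f2a-8abc-000000000004",
   "metadata": {},
   "source": [
    "Stripped of the model itself, the greedy loop looks like this. The sketch below swaps the trained decoder for a random embedding-plus-linear stand-in (`dummy_forward` is invented purely for illustration), but the generation mechanics are the same: take the last step's logits, argmax, append:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "\n",
    "torch.manual_seed(0)\n",
    "vocab_size, embed_dim = 20, 8\n",
    "emb = torch.nn.Embedding(vocab_size, embed_dim)\n",
    "head = torch.nn.Linear(embed_dim, vocab_size)\n",
    "\n",
    "def dummy_forward(seq):        # stand-in for a decoder; seq: (seq_len, 1) token ids\n",
    "    return head(emb(seq))      # (seq_len, 1, vocab_size) logits\n",
    "\n",
    "generated = torch.tensor([[3]])   # a single start token, shape (1, 1)\n",
    "for _ in range(5):\n",
    "    logits = dummy_forward(generated)\n",
    "    next_token = torch.argmax(logits[-1, :, :], dim=-1).unsqueeze(0)\n",
    "    generated = torch.cat((generated, next_token), dim=0)\n",
    "print(generated.shape)  # 5 new tokens appended: torch.Size([6, 1])\n",
    "```"
   ]
  },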
  {
   "cell_type": "markdown",
   "id": "dda45fa4-6f1e-403a-8639-71b299f4628f",
   "metadata": {},
   "source": [
    "## 3 Huggingface：Transformer的高级封装与实现"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "89b7235c-87a6-4574-8a69-a6de2096db6a",
   "metadata": {},
   "source": [
    "### 3.1 认知Huggingface"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0a831791-6200-451c-8f6e-76f0a94cf6c2",
   "metadata": {},
   "source": [
    "Hugging Face是一家领先的人工智能技术落地实践公司，它搭建并开发了围绕自然语言处理（NLP）、大语言模型（LLMs）、多模态模型等各个人工智能领域的一系列落地应用开源框架，其中最为著名的是开源库Transformers。\n",
    "    \n",
    "<center><img src=\"https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/20.png\" alt=\"描述文字\"></center>\n",
    "\n",
    "随着人工智能领域的发展，模型的体量逐渐增长、数据的体量逐渐增长、在实践中应用复杂的人工智能算法的需求日益增加，越来越多的开发者会倾向于直接使用经过预训练和封装的成熟NLP算法，而非自行构建复杂的transformers或tokernizer架构。Huggingface正是把握住了这一需求的变化，开发了**封装层次极高、调用简单、节约算力、且训练流程清晰明确的Transformers库**，这个库提供了一系列与人工智能相关的预训练模型，如BERT、GPT、T5等，这些模型都是建立在原始Transformer架构基础之上，并对其进行了扩展和优化，以适用于各种各样的NLP任务。与最初由Google提出的Transformer模型相比，Hugging Face的Transformers库提供了更为丰富、易于使用且经过精心优化的模型选择，同时支持跨多种编程语言和平台。\n",
    "\n",
    "在当代NLP领域的应用中，Huggingface的transformer库有以下三大优势：\n",
    "    \n",
    "- **高层次的封装，让Transformer及bert、GPT等更复杂的NLP模型都能被轻松调用**。Huggingface的Transformer宛如NLP领域的sklearn，为低成本、低门槛实践算法铺平了道路。\n",
    "\n",
    "- **打造了针对特定NLP任务的一系列pipeline供用户使用，让执行复杂任务的流程变得简便**。在NLP的世界中，存在着文本分类、命名实体识别、问答系统、摘要生成、文本生成、机器翻译等不同性质的任务，这些任务涉及到有监督、无监督、半监督等各类标签应用流程，同时每个任务都需要适应于任务本身的文字编码方法、模型架构、预训练权重、以及正确的预处理后处理流程。可以说在NLP的世界中，每个特定NLP任务的执行都是及其复杂而繁琐的。但在Huggingface的transformer库中，pipeline功能可以自动处理任意任务的全流程，用户只需要告诉pipeline当前执行的任务是什么，就可以轻松执行复杂任务。\n",
    "    \n",
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/21.png)\n",
    "   \n",
    "- **提供了巨量的预训练模型 & 高级NLP模型的实现**，覆盖了语言理解、生成任务、多模态任务、对话系统以及特定领域的应用。这包括了广泛使用的预训练语言模型如BERT、GPT和RoBERTa，专门用于翻译和文本摘要的序列到序列模型如BART和mT5，以及能处理文本和图像的多模态模型如CLIP和DALL-E。对话模型如DialoGPT和Blenderbot能够支持构建交互式对话系统，而DistilBERT和ALBERT等模型则提供了轻量级的解决方案，适用于资源受限的情况。此外，还有为特定领域或任务定制的模型，如SciBERT和BioBERT等，它们经过预训练能够快速适应相关的NLP任务。你可以在这个页面找到所有能够通过Huggingface的Transformer实现的模型：https://huggingface.co/models\n",
    "\n",
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/22.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "00f73ac5-e70d-4930-8a24-0dd0a2abded1",
   "metadata": {},
   "source": [
    "今天就让我们来了解下Huggingface的基本框架与运行结构，并使用Huggingface中的Transformer实践两个简单的任务。**注意，huggingface官网访问、模型加载依赖于魔法，建议使用漂亮国全局接口**。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0759dc6f-16bc-432e-9404-51a6b0ffd735",
   "metadata": {},
   "source": [
    "- 安装部署与导入"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3e5f4e1e-5aed-48ec-8b98-5398ef401ed0",
   "metadata": {},
   "outputs": [],
   "source": [
    "#!pip install transformers -i https://pypi.tuna.tsinghua.edu.cn/simple"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a1944aec-07b3-4a01-b045-206ef46746b0",
   "metadata": {},
   "source": [
    "该代码可以通用于windows、linux和MacOs系统，注意运行pip代码的时候要避开魔法。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "59010e70-4c17-46c5-b61c-fc994894e785",
   "metadata": {},
   "outputs": [],
   "source": [
    "import transformers"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "662323fc-b174-4cb5-90a5-a16c5951ec2f",
   "metadata": {
    "tags": []
   },
   "source": [
    "### 3.2 Huggingface中的功能与模型"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "721c8b24-09e5-4932-b0ff-f6fbf51227c1",
   "metadata": {},
   "source": [
    "在Transformers库当中，库所提供的功能和实际的模型&权重是分割开来的，我们在huggingface的[官方页面](https://huggingface.co)中能够查询到的API是transformers库提供各类功能，而models才是我们可以接入的模型本身。\n",
    "\n",
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/28.png)\n",
    "\n",
    "在transformers当中，我们通过一系列功能类来调用模型，并决定将模型用于具体的任务。这样的设计主要是基于软件工程的原则，即模块化和解耦。这种设计允许更灵活的研究、开发和部署。功能类完全是由Huggingface官方开发和设计的，而模型和权重则是投放给了开源社区，任意的企业和个人都可以向Huggingface投放自己设计的全新模型和自己训练好的权重。只要遵循Huggingface官方所设置的结构，我们就可以通过transformers库中的功能来调用自己上传的模型。也因此，在huggingface的页面上，我们可以找到30w个不同的模型，这也是Huggingface如此受欢迎的关键原因之一。\n",
    "\n",
    "在Transformers库当中，要调用一个模型之前，我们至少需要知道**用于调用模型的功能类**以及**实际要调用的模型的名字**。接下来就让我们看看这两部分内容具体如何获得——"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "9d69d538-0af9-47f1-9a9e-05b5ca2250f9",
   "metadata": {},
   "source": [
    "- **Huggingface中的功能类板块**"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "86897295-e04f-443e-99fd-ab1344f74688",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/23.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "5a51a948-92fe-440a-8fd8-c4aad4af565b",
   "metadata": {},
   "source": [
    "> **Models**\n",
    "\n",
    "Models模块提供了一系列与transformer架构（如BERT、GPT-2、T5等）相关的、用于加载各类模型的功能类，包括但不限于150+基于Transformer的语言模型，100+视觉模型、30+语音模型等等，你可以在[Huggingface官方文档页面](https://huggingface.co/docs/transformers/index)的左侧找到专用于各类模型的类的列表。\n",
    "\n",
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/25.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "72b0f42b-2891-4f66-9a1c-8313e6cca837",
   "metadata": {},
   "source": [
    "对于语言模型而言，在100多类模型中最值得注意的是——\n",
    "\n",
    "- **BERT** (Bidirectional Encoder Representations from Transformers): 是一个预训练模型，旨在帮助计算机理解自然语言（即人类使用的语言）。BERT的设计使其能够理解语句中单词的上下文，这对于处理诸如文本分类、命名实体识别、情感分析等任务至关重要。\n",
    "\n",
    "- **GPT-2** (Generative Pre-trained Transformer 2): 是一个由OpenAI开发的非常强大的文本生成模型。它能生成高质量的文本，并可以用于多种应用，包括写作辅助、对话生成和其他生成任务。\n",
    "\n",
    "- **RoBERTa** (A Robustly Optimized BERT Pretraining Approach): 是BERT的一个优化版本，通过更仔细的训练策略获得了更好的性能。RoBERTa在许多NLP基准测试中都显示了出色的结果，尤其是在文本分类和情感分析任务上。\n",
    "\n",
    "- **T5** (Text-to-Text Transfer Transformer): 是一种采用文本到文本框架的模型，它将NLP任务统一为一个文本到文本的格式。T5可以处理多种任务，如翻译、摘要、问答和分类，而且在许多基准测试中表现出色。\n",
    "\n",
    "- **DistilBERT** (A distilled version of BERT): 是BERT的一个简化版本，旨在减少模型大小，提高速度，同时尽量保持BERT的性能。它适用于资源受限的情况下使用，包括移动设备和小型服务器。\n",
    "\n",
    "这些模型由于其强大的性能和广泛的适用性，成为了自然语言处理领域的基石。通过预训练和微调，它们可以适用于各种NLP任务，并被用于学术研究和工业应用。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "bb60eac3-34cc-481e-95c7-a68538e83f65",
   "metadata": {
    "tags": []
   },
   "source": [
    "在[每个模型的页面中](https://huggingface.co/docs/transformers/v4.35.0/en/model_doc/bert#bert)，我们可以找到这个模型的系列框架的列表。**注意，这个列表中的类是依据不同需求、从不同角度调用模型的工具，而非模型本身**。\n",
    "\n",
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/26.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "7b64eb25-5377-49e7-b860-69d676ccd926",
   "metadata": {},
   "source": [
    "对Bert模型而言，这些类分别是：\n",
    "\n",
    "`BertConfig`: 用于存储BERT模型的配置信息，如模型大小、输入维度等。它为模型实例化提供了必要的参数。\n",
    "\n",
    "`BertTokenizer`: 用于将文本分割成BERT能够理解的token，将词汇转换为模型可以理解的ID。\n",
    "\n",
    "`BertTokenizerFast`: 是BertTokenizer的更快版本，它使用Rust语言重写，以便更高效地进行分词和编码操作。\n",
    "\n",
    "`TFBertTokenizer`: 是TensorFlow版本的BERT分词器，为使用TensorFlow框架的用户提供了与BERT模型配合使用的分词功能。\n",
    "\n",
    "以下是BERT模型结构的变体和专用模型：\n",
    "\n",
    "`BertModel`: BERT模型的核心，它是预训练模型的主体，可以在其基础上针对特定任务进行微调。\n",
    "\n",
    "`BertForPreTraining`: 包含了用于BERT预训练的头部，这个头部同时执行masked language modeling (MLM) 和 next sentence prediction (NSP)。\n",
    "\n",
    "`BertLMHeadModel`: 在BertModel之上添加了语言模型头部，通常用于下游任务中的文本生成。\n",
    "\n",
    "`BertForMaskedLM`: 专门用于执行masked language modeling的BERT模型。它在训练时遮盖输入的一部分以学习预测这些遮盖的部分。\n",
    "\n",
    "`BertForNextSentencePrediction`: 这个模型专为预测句子是否为连续句子设计，用于BERT预训练中的下一个句子预测任务。\n",
    "\n",
    "`BertForSequenceClassification`: 用于序列分类任务，如情感分析，其中模型需要将整个输入序列分类到一个类别。\n",
    "\n",
    "`BertForMultipleChoice`: 用于多选题任务，可以给出多个可能的答案选项，并预测最可能的一个。\n",
    "\n",
    "`BertForTokenClassification`: 用于token级别的分类任务，如命名实体识别（NER），其中模型对每个输入token进行分类。\n",
    "\n",
    "`BertForQuestionAnswering`: 专为问答任务设计，能够预测文本中答案的开始和结束位置。"
   ]
  },
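  {
   "cell_type": "markdown",
   "id": "3f8a1c2e-9b4d-4e6f-8a21-5c7d9e0b1a34",
   "metadata": {},
   "source": [
    "上面的功能类中，BertConfig与BertModel的关系可以用一个小例子直观感受。下面是一个假设性的演示：用自定义的BertConfig从零实例化一个小号的BERT（参数随机初始化，完全不需要联网下载权重），其中的超参数取值纯属演示假设：\n",
    "\n",
    "```python\n",
    "from transformers import BertConfig, BertModel\n",
    "\n",
    "#用自定义配置实例化一个小号BERT（随机初始化，不下载任何预训练权重）\n",
    "config = BertConfig(hidden_size=128, num_hidden_layers=2,\n",
    "                    num_attention_heads=4, intermediate_size=256)\n",
    "model = BertModel(config)\n",
    "\n",
    "#统计参数量，确认模型已按配置搭建完成\n",
    "n_params = sum(p.numel() for p in model.parameters())\n",
    "print(config.num_hidden_layers, n_params)\n",
    "```\n",
    "\n",
    "实际使用预训练权重时则应当调用BertModel.from_pretrained，此时配置会随权重一起被自动加载。"
   ]
  },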
  {
   "cell_type": "markdown",
   "id": "0d99c672-083b-494c-a44f-49a318fe3398",
   "metadata": {},
   "source": [
    "- **查找模型列表**"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "966c232f-d2e7-4564-ba63-dd692d678a53",
   "metadata": {},
   "source": [
    "现在我们已经知道了bert相关的基本功能类有哪些，接下来还需要知道具体的模型名字，这就要用到下面的模型库页面了：https://huggingface.co/models"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "5f1d4703-4356-4cd2-bee5-8c17d707fc0d",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/28.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2288ee71-a260-4156-85f4-5b228037b7ff",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/22.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "c0ed3b79-2997-43e4-b665-2ca421ac31f4",
   "metadata": {},
   "source": [
    "我们可以通过“最多下载”、“热门趋势”等排序标签以及搜索框来筛选我们需要的模型。只要知道模型的名字，就可以通过transformers中的功能类来调用模型。其中，最常用的模型与最常用的功能类是对应的，依然是BERT、GPT-2、RoBERTa、T5以及DistilBERT。我们可以在搜索框中搜索bert，就能够得到一系列关于bert的模型，**点开任意bert相关的模型，即可在说明文档中找到bert相关的全部模型列表**（https://huggingface.co/bert-base-uncased）。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d5412eba-94dc-4d30-b7c6-18344d38f7ee",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/29.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b15e796a-e3e0-4796-99ff-639ede992616",
   "metadata": {},
   "source": [
    "在Model Card下面可以查看到模型的信息和可用模型的列表名称，在Files and versions下面可以下载可用的模型文件。"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ca7f1358-6e5a-441c-8712-31561b4172a4",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/30.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "bcd44750-659a-4745-8b2c-44e08c090960",
   "metadata": {},
   "source": [
    "在这些模型中，bert-base-chinese与bert-base-multilingual-cased等中文/多语言模型能够支持中文："
   ]
  },
  {
   "cell_type": "markdown",
   "id": "6e0aa829-be5d-4f33-9188-10101b7c5a9e",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/31.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "add2a5cb-c8db-4926-b1b2-63a7d0cab73e",
   "metadata": {},
   "source": [
    "- **以bert为例，尝试调用bert**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 78,
   "id": "b27351aa-46e4-43a3-b159-34a21c0b20bb",
   "metadata": {
    "tags": []
   },
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.\n"
     ]
    }
   ],
   "source": [
    "from transformers import BertTokenizer, BertModel #加载调用bert模型所需的工具，至少需要BertTokenizer和BertModel\n",
    "\n",
    "#加载预训练的、专用于bert的分词模型\n",
    "tokenizer = BertTokenizer.from_pretrained('bert-base-chinese',trust_remote_code=True) #注意：trust_remote_code只对Auto类生效，此处会被忽略并触发警告\n",
    "\n",
    "#加载预训练的bert\n",
    "model = BertModel.from_pretrained(\"bert-base-chinese\",trust_remote_code=True)\n",
    "\n",
    "text = \"虽然今天下雨了，但我拿到了心仪的offer，因此非常开心！\"\n",
    "\n",
    "#将text输入分词和编码模型\n",
    "encoded_input = tokenizer(text, return_tensors='pt')\n",
    "\n",
    "#将编码好的文字输入给预训练好的bert\n",
    "output = model(**encoded_input)"
   ]
  },
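  {
   "cell_type": "markdown",
   "id": "7c2d5e9a-1f3b-4a8c-b6d4-2e9f0a1c5d78",
   "metadata": {},
   "source": [
    "为了更直观地理解tokenizer在上面的流程中做了什么，这里给出一个假设性的小例子：手工构造一个迷你词表来实例化BertTokenizer（不联网、不加载任何预训练文件），观察编码结果的结构。词表内容纯属演示假设：\n",
    "\n",
    "```python\n",
    "import tempfile\n",
    "from transformers import BertTokenizer\n",
    "\n",
    "#手工构造一个迷你词表文件（每行一个token），内容纯属演示假设\n",
    "vocab = ['[PAD]', '[UNK]', '[CLS]', '[SEP]', '[MASK]', '今', '天', '开', '心']\n",
    "with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False, encoding='utf-8') as f:\n",
    "    f.write('\\n'.join(vocab))\n",
    "    vocab_path = f.name\n",
    "\n",
    "#直接用本地词表实例化分词器，无需下载任何预训练文件\n",
    "tok = BertTokenizer(vocab_path)\n",
    "enc = tok('今天开心')\n",
    "\n",
    "#BERT会自动在句首句尾加上[CLS]与[SEP]，因此4个汉字被编码为6个id\n",
    "print(enc['input_ids'])\n",
    "```\n",
    "\n",
    "真实的bert-base-chinese词表有2万多个token，但编码结果的结构与此完全一致：input_ids、token_type_ids与attention_mask。"
   ]
  },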
  {
   "cell_type": "code",
   "execution_count": 79,
   "id": "63839d99-7e98-4a32-a417-62b3f4e1e80a",
   "metadata": {
    "collapsed": true,
    "jupyter": {
     "outputs_hidden": true
    },
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "BaseModelOutputWithPoolingAndCrossAttentions(last_hidden_state=tensor([[[ 1.1622, -0.0790,  1.8173,  ..., -0.6126,  0.6526, -0.5678],\n",
       "         [-0.0494,  0.9221,  1.3372,  ..., -0.4971, -0.0485,  0.1589],\n",
       "         [-0.0456, -0.2386,  0.2801,  ...,  0.6094,  0.5508,  0.3712],\n",
       "         ...,\n",
       "         [ 1.8941, -0.2364,  0.8497,  ..., -0.2132,  0.6206, -0.0577],\n",
       "         [ 0.7245, -0.4633,  1.7492,  ..., -0.9154,  0.7376, -0.4374],\n",
       "         [ 0.7486, -0.3318,  1.1627,  ..., -0.6981,  0.7330, -0.5773]]],\n",
       "       grad_fn=<NativeLayerNormBackward0>), pooler_output=tensor([[ 0.9999,  0.9999,  0.9996,  0.9229,  0.9680, -0.5962, -0.9971,  0.8632,\n",
       "          0.9978, -0.9995,  1.0000,  0.9999,  0.9697, -0.9845,  0.9996, -0.9999,\n",
       "         -0.9962,  0.9950,  0.7646,  0.3401,  0.9999, -1.0000, -0.8772, -0.6877,\n",
       "         -0.7054,  0.9997,  0.9414, -0.8321, -1.0000,  0.9991,  0.9465,  0.9996,\n",
       "          0.9964, -1.0000, -1.0000,  0.9912, -0.7883,  0.9957,  0.0435, -0.8159,\n",
       "         -0.9981, -0.9976,  0.8141, -0.9997, -0.9547,  0.6489, -1.0000, -1.0000,\n",
       "          0.9258,  0.9999, -0.3776, -0.9991,  0.8158, -0.9105, -0.9120,  0.9132,\n",
       "         -0.9999,  0.9962,  1.0000,  0.8233,  0.9973, -0.9482, -0.7165, -0.9999,\n",
       "          0.9998, -0.9980, -0.9891,  0.8536,  0.9999,  1.0000, -0.9872,  0.9077,\n",
       "          1.0000,  0.9635, -0.4719,  0.9999, -1.0000,  0.4113, -1.0000, -0.7166,\n",
       "          1.0000,  0.9977, -0.8823, -0.5393, -0.9940, -1.0000, -0.9993,  1.0000,\n",
       "          0.4248,  0.9492,  0.9973, -0.9999, -1.0000,  0.9975, -0.9995, -0.9973,\n",
       "         -0.9748,  0.9994, -0.4766, -0.9334,  0.2085,  0.8857, -0.9998, -0.9993,\n",
       "          0.9951,  0.9968,  0.8072, -0.9993,  1.0000,  0.7904, -1.0000, -0.7090,\n",
       "         -1.0000, -0.9663, -0.9847,  0.9998,  0.8470,  0.1070,  0.9983, -0.9997,\n",
       "          0.9317, -0.9990, -0.9928, -0.9985,  0.9986,  1.0000,  0.9996, -0.9996,\n",
       "          0.9999,  1.0000,  0.9907,  0.9951, -0.9994,  0.9906,  0.8054, -0.9374,\n",
       "         -0.0613, -0.7922,  1.0000,  0.9904,  0.9998, -0.9938,  0.9996, -0.9910,\n",
       "          1.0000, -0.9999,  0.9982, -1.0000, -0.9055,  0.9995,  0.8803,  1.0000,\n",
       "         -0.8416,  1.0000, -0.9979, -0.9998,  0.9949,  0.2561,  0.9985, -1.0000,\n",
       "          0.8699,  0.2237, -0.9526,  0.9073, -1.0000,  1.0000, -0.7147,  1.0000,\n",
       "          0.9988,  0.3732, -0.9949, -0.9991,  0.8603, -1.0000, -0.9929,  0.9921,\n",
       "         -0.5737,  0.9989, -0.9994, -0.9823,  0.3361,  0.0040, -1.0000,  0.9843,\n",
       "          0.1959,  0.9858,  0.9856,  0.7158,  0.9690,  0.9363, -0.8677,  0.9999,\n",
       "         -0.1145,  0.9989,  1.0000, -0.3110, -0.3405, -0.9791, -1.0000, -0.7058,\n",
       "          1.0000, -0.3323, -0.9999,  0.9617, -1.0000,  0.9757, -0.9909,  0.1429,\n",
       "         -0.9224, -0.9999,  0.9998, -0.9088, -0.9997, -0.2368,  0.4545,  0.9170,\n",
       "         -0.9997, -0.4378,  0.9790, -0.8380,  0.9544, -0.9972, -0.9900,  0.9806,\n",
       "         -0.8900,  0.9046,  0.9347,  1.0000,  0.9999, -0.0684, -0.7911,  1.0000,\n",
       "          0.6418, -1.0000,  0.7115, -0.9858, -0.1284,  0.9999, -0.9985,  0.9055,\n",
       "          1.0000,  0.9842,  1.0000, -0.9449, -0.9265, -0.9994,  1.0000,  0.9941,\n",
       "          0.9999, -0.9999, -0.9997, -0.3267,  0.2922, -1.0000, -0.9998, -0.6018,\n",
       "          0.9975,  0.9997, -0.3665, -0.9651, -0.9964, -0.9989,  1.0000, -0.9849,\n",
       "          1.0000,  0.8819, -0.6037, -0.9992,  0.9205, -0.7637, -0.9997, -0.4513,\n",
       "         -0.9999, -0.9977, -0.9999,  0.9656, -1.0000, -1.0000,  0.1868,  1.0000,\n",
       "          0.9788, -1.0000,  0.9998,  0.9992,  0.6255, -0.9931,  0.9811, -1.0000,\n",
       "          1.0000, -0.9989,  0.7774, -0.9922, -0.9853, -0.6548,  0.9994,  0.9998,\n",
       "         -0.9995, -0.9862, -0.9945, -0.9900, -0.4590,  0.9919, -0.9124,  0.7898,\n",
       "         -0.9493, -0.9880,  0.9795, -0.9965, -0.9986,  0.4766,  1.0000, -0.9714,\n",
       "          1.0000,  0.9866,  1.0000,  0.9741, -0.9994,  0.9995, -0.3418, -0.6257,\n",
       "         -0.9769, -0.9990,  0.9897, -0.1939,  0.9039, -0.9999,  1.0000,  0.9996,\n",
       "          0.8430,  0.5315, -0.1414,  0.4149,  0.9893, -0.9955,  0.9985, -0.9998,\n",
       "          0.8975,  0.9991,  1.0000,  0.9868,  0.3208, -0.9335,  0.9995, -0.9989,\n",
       "          0.9994, -1.0000,  1.0000, -0.9997,  0.8746, -0.9834, -0.9979,  1.0000,\n",
       "          0.9926, -0.9770,  0.9999, -0.8931,  0.9742,  0.9996,  0.9928,  0.9978,\n",
       "          0.9850,  1.0000, -0.9993, -0.9907, -0.9923, -0.9974, -0.9975, -1.0000,\n",
       "          0.4436, -1.0000, -0.9882, -0.9703,  0.9198,  0.2743, -0.6266,  0.3111,\n",
       "          0.3086,  0.4562, -0.9981,  0.1838, -0.1098, -0.9868, -0.9953, -1.0000,\n",
       "         -0.9984,  0.9426,  1.0000, -1.0000,  0.9999, -1.0000, -0.9929,  0.9920,\n",
       "          0.4623,  0.4120,  0.9999, -1.0000,  0.6884,  0.9999,  1.0000,  0.9971,\n",
       "          0.9998, -0.9029, -1.0000, -0.9999, -1.0000, -1.0000, -0.9999,  0.8299,\n",
       "          0.8746, -1.0000,  0.6857,  0.8167,  1.0000,  0.9843, -0.9986,  0.6651,\n",
       "         -0.9999,  0.1387,  0.9997, -0.8381, -0.9999,  0.9953,  0.1102,  0.9999,\n",
       "         -0.9288,  0.9696,  0.9680,  0.7198,  0.9970, -1.0000,  0.9359,  1.0000,\n",
       "         -0.4057, -1.0000, -0.9893, -0.8902, -1.0000, -0.1799,  0.8824,  0.9999,\n",
       "         -1.0000, -0.9748, -0.9891,  0.7139,  0.9777,  0.9999,  0.9972,  0.9868,\n",
       "          0.8126,  0.9786, -0.0429,  1.0000,  0.6781, -0.9981,  0.9991, -0.3622,\n",
       "          0.2483, -1.0000,  0.9998, -0.7315,  1.0000,  0.9820, -0.8251, -0.9572,\n",
       "         -0.9882,  0.9951,  1.0000, -0.2685, -0.5616, -0.9992, -1.0000, -0.9974,\n",
       "          0.4538, -0.3495, -0.9666, -0.9999,  0.7218,  0.2307,  1.0000,  1.0000,\n",
       "          0.9993, -0.9451, -0.9719,  0.9970, -0.8050,  0.9988, -0.8898, -1.0000,\n",
       "         -0.9680, -1.0000,  0.9999, -0.9884, -0.9613, -0.9865, -0.7220,  0.1987,\n",
       "         -1.0000, -0.4702, -0.9981,  0.9791,  1.0000, -0.9997,  0.9865, -0.9987,\n",
       "          0.4586,  0.8345,  0.9401,  0.9905, -0.6663, -0.5311, -0.4764, -0.9152,\n",
       "          0.9791,  0.9972, -0.9939, -0.6357,  0.9999,  0.0665,  0.9990,  0.5756,\n",
       "          0.6408,  0.7963,  1.0000,  0.8529,  1.0000,  0.9591,  1.0000,  0.9997,\n",
       "         -0.9786,  0.7649,  0.7720, -0.9328,  0.7567,  0.9743,  0.9997,  0.7648,\n",
       "         -0.9935, -0.9979,  0.9992,  1.0000,  1.0000, -0.2828,  0.9712, -0.7205,\n",
       "          0.9797,  0.7427,  0.9923, -0.3301,  0.1327,  0.9912,  0.9997, -1.0000,\n",
       "         -1.0000, -1.0000,  1.0000,  0.9995, -0.5934, -1.0000,  0.9998, -0.1446,\n",
       "          0.8538,  0.9962,  0.8618, -0.9634,  0.5338, -0.9995,  0.0059,  0.9770,\n",
       "          0.9604,  0.7619,  0.9998, -0.9977,  0.2957,  1.0000, -0.0509,  1.0000,\n",
       "         -0.0818, -0.9961,  0.9978, -0.9969, -0.9998, -0.9456,  0.9999,  0.9986,\n",
       "         -0.8621, -0.9619,  0.9997, -0.9999,  0.9999, -0.9992,  0.9291, -0.9963,\n",
       "          0.9999, -0.9839, -0.9989,  0.2370,  0.9802,  0.9554, -0.9553,  1.0000,\n",
       "         -0.9811, -0.9453,  0.9361, -0.9635, -0.9959, -0.9489, -0.8201, -1.0000,\n",
       "          0.8508, -0.2446, -0.1463, -0.9993, -1.0000,  1.0000, -0.9938, -0.9420,\n",
       "          0.9999, -0.9983, -1.0000,  0.9921, -0.9999, -0.7295,  0.7488,  0.0109,\n",
       "          0.6090, -1.0000,  0.5628,  1.0000, -0.9996, -0.8930, -0.9803, -0.9127,\n",
       "         -0.1761,  0.8842,  0.9782, -0.9193,  0.8943,  0.9700,  0.9355, -0.1486,\n",
       "         -0.5998, -0.9992, -0.9606, -0.7243, -0.9995, -1.0000, -0.9999,  1.0000,\n",
       "          1.0000,  1.0000, -0.6652,  0.5267,  0.6701,  0.9887, -0.9998,  0.8237,\n",
       "          0.6136,  0.9919, -0.6469, -0.9999, -0.9167, -1.0000,  0.5655,  0.1916,\n",
       "         -0.8672,  0.9920,  1.0000,  0.9998, -0.9997, -0.9994, -0.9995, -0.9990,\n",
       "          0.9999,  0.9997,  0.9999, -0.9785, -0.4907,  0.9997,  0.9238,  0.4152,\n",
       "         -0.9999, -0.9974, -1.0000,  0.7109, -0.9638, -0.9998,  0.9997,  1.0000,\n",
       "          0.5260, -0.9999, -0.9564,  1.0000,  0.9997,  1.0000, -0.0463,  0.9999,\n",
       "         -0.9901,  0.9977, -0.9994,  1.0000, -1.0000,  1.0000,  0.9999,  0.9997,\n",
       "          0.9976, -0.9895,  0.9706, -0.9989, -0.7258,  0.9973, -0.4013, -0.9899,\n",
       "          0.9931,  0.9993, -0.9399,  1.0000,  0.9571,  0.3304,  0.5215,  0.8508,\n",
       "          0.8760, -0.7202, -0.9999,  0.1561,  0.9990,  0.9858,  1.0000,  0.9755,\n",
       "          1.0000, -0.9856, -0.9997,  0.9983, -0.6808,  0.0311, -0.9999,  1.0000,\n",
       "          1.0000, -1.0000, -0.9871,  0.6541, -0.1459,  1.0000,  0.9996,  0.9998,\n",
       "          0.9616,  0.6005,  0.9981, -0.9998,  0.2359, -0.9978, -0.9988,  1.0000,\n",
       "         -0.9917,  0.9999, -0.9967,  1.0000, -0.9640,  0.9853,  0.9951,  0.9141,\n",
       "         -0.9986,  1.0000,  0.6972, -0.9961, -0.2598, -0.9045, -0.9999,  0.0485]],\n",
       "       grad_fn=<TanhBackward0>), hidden_states=None, past_key_values=None, attentions=None, cross_attentions=None)"
      ]
     },
     "execution_count": 79,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "output #此时bert输出的是对输入文字的编码表示（隐藏状态与池化向量），注意大部分时候这不是一个生成式的结果"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "07413dba-2d11-49d1-b796-dcc471be07ad",
   "metadata": {},
   "source": [
    "通常来说，当我们运行上述代码时，transformers会自动从Huggingface Hub下载对应的模型。在国内网络环境下，**该下载过程可能需要配置网络代理才能完成**。另外注意，trust_remote_code参数只在配合Auto类（如AutoModel）时才生效，传给BertTokenizer、BertModel等具体类时会被忽略并触发上面的警告。我们也可以通过git或手动将模型下载到本地，这样同样可以通过上述的代码来加载和运行预训练模型。"
   ]
  },
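  {
   "cell_type": "markdown",
   "id": "9e4b7a1d-5c2f-4d8e-a3b6-8f1c2d0e4a56",
   "metadata": {},
   "source": [
    "将模型下载到本地之后，只需把from_pretrained的第一个参数从模型名换成本地路径即可。下面是一个假设性的示例，其中./bert-base-chinese是假设的本地目录（需要事先通过git clone或在模型页面手动下载得到）：\n",
    "\n",
    "```python\n",
    "import os\n",
    "from transformers import BertTokenizer, BertModel\n",
    "\n",
    "#假设的本地目录：可事先通过 git clone https://huggingface.co/bert-base-chinese 下载\n",
    "local_dir = './bert-base-chinese'\n",
    "\n",
    "if os.path.isdir(local_dir):\n",
    "    #传入本地路径时，from_pretrained直接从磁盘加载，不访问网络\n",
    "    tokenizer = BertTokenizer.from_pretrained(local_dir)\n",
    "    model = BertModel.from_pretrained(local_dir)\n",
    "else:\n",
    "    print('本地目录不存在，请先下载模型文件到', local_dir)\n",
    "```\n",
    "\n",
    "注意本地目录中需要包含config.json、词表文件以及权重文件，即模型页面Files and versions下的全部必要文件。"
   ]
  },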
  {
   "cell_type": "markdown",
   "id": "d74ad72d-53a1-49c8-817a-b45a642d7333",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/27.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "61e588c1-fc9c-4130-8f6c-1c0d9ecfa32d",
   "metadata": {},
   "source": [
    "- **尝试调用GPT**"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "2a33b570-e047-44b6-96b8-5da24ac1b901",
   "metadata": {},
   "source": [
    "目前为止，Huggingface上能直接调用的OpenAI GPT系列开源模型仍然只到GPT-2；如果要调用GPT-3.5甚至GPT-4，则需要直接使用OpenAI所提供的API。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 80,
   "id": "189095be-45ed-4434-8d30-2e1f2bd8d8e6",
   "metadata": {},
   "outputs": [],
   "source": [
    "from transformers import GPT2Tokenizer, GPT2Model\n",
    "\n",
    "tokenizer = GPT2Tokenizer.from_pretrained('gpt2')\n",
    "model = GPT2Model.from_pretrained('gpt2')\n",
    "\n",
    "text = \"Replace me by any text you'd like.\"\n",
    "encoded_input = tokenizer(text, return_tensors='pt')\n",
    "output = model(**encoded_input)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 81,
   "id": "9b46417c-baa6-4df9-b57a-cd020ea0f786",
   "metadata": {
    "collapsed": true,
    "jupyter": {
     "outputs_hidden": true
    },
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "BaseModelOutputWithPastAndCrossAttentions(last_hidden_state=tensor([[[ 0.1629, -0.2166, -0.1410,  ..., -0.2619, -0.0819,  0.0092],\n",
       "         [ 0.4628,  0.0248, -0.0785,  ..., -0.0859,  0.5122, -0.3939],\n",
       "         [-0.0644,  0.1551, -0.6306,  ...,  0.2488,  0.3691,  0.0833],\n",
       "         ...,\n",
       "         [-0.5591, -0.4490, -1.4540,  ...,  0.1650, -0.1302, -0.3740],\n",
       "         [ 0.1400, -0.3875, -0.7916,  ..., -0.1780,  0.1824,  0.2185],\n",
       "         [ 0.1721, -0.2420, -0.1124,  ..., -0.1068,  0.1205, -0.3213]]],\n",
       "       grad_fn=<ViewBackward0>), past_key_values=((tensor([[[[-1.0719,  2.4170,  0.9660,  ..., -0.4787, -0.3316,  1.7925],\n",
       "          [-2.2897,  2.5424,  0.8317,  ..., -0.5299, -2.4828,  1.3537],\n",
       "          [-2.2856,  2.7125,  2.4725,  ..., -1.4911, -1.8427,  1.6493],\n",
       "          ...,\n",
       "          [-3.3203,  2.3325,  2.7061,  ..., -1.1569, -1.5586,  2.4076],\n",
       "          [-2.9917,  2.2701,  2.1742,  ..., -0.8670, -1.6410,  1.9237],\n",
       "          [-2.5066,  2.6139,  2.1347,  ..., -0.0627, -2.0542,  1.6568]],\n",
       "\n",
       "         [[ 0.4796, -0.1131, -1.4854,  ...,  1.1607,  1.8412,  1.3682],\n",
       "          [-0.7273, -1.1362, -1.0850,  ..., -0.6736,  3.2618,  0.2099],\n",
       "          [-1.4441, -3.0647, -4.1612,  ..., -1.4788,  3.2718, -0.2803],\n",
       "          ...,\n",
       "          [ 0.8515, -0.1599,  0.1157,  ..., -0.8959,  4.1178,  0.7133],\n",
       "          [-0.0769, -1.7673, -1.1207,  ..., -1.6276,  3.1095,  1.0237],\n",
       "          [-0.9118, -0.3267, -2.0409,  ..., -0.3527,  1.1626,  0.3733]],\n",
       "\n",
       "         [[-0.2338, -0.8688,  1.6542,  ..., -1.5964, -1.5636,  1.0931],\n",
       "          [ 0.3698,  0.4929,  1.4155,  ..., -2.0162, -1.0246,  1.9822],\n",
       "          [ 0.4509,  1.0144,  0.1189,  ..., -3.1880,  0.4529,  1.3746],\n",
       "          ...,\n",
       "          [ 0.3303,  0.8695, -0.6507,  ..., -2.7196,  0.2950,  1.9827],\n",
       "          [ 0.5777,  0.4363,  1.1029,  ..., -3.2317,  0.9627,  2.1703],\n",
       "          [-0.2861, -0.2032,  0.7289,  ..., -2.3039,  1.2637,  1.9519]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 0.4012, -0.0278, -0.1031,  ...,  0.2614,  0.9767,  0.5994],\n",
       "          [ 0.2222,  0.3167, -0.2024,  ...,  0.9616,  0.3658,  1.0162],\n",
       "          [ 0.3276,  0.0629,  0.1905,  ...,  1.0855,  0.8707,  0.0940],\n",
       "          ...,\n",
       "          [ 0.2403, -0.0951,  0.1646,  ...,  0.3345,  0.2687,  0.2159],\n",
       "          [ 0.2873,  0.0887, -0.0544,  ...,  1.0306,  0.3196,  0.5268],\n",
       "          [-0.0552, -0.0461,  0.0765,  ...,  1.0465,  0.2690,  0.4687]],\n",
       "\n",
       "         [[ 0.9759,  1.3121, -0.6612,  ..., -0.3228,  1.1476, -1.2349],\n",
       "          [ 1.0862,  0.3406, -0.6767,  ..., -1.0748,  1.4611,  0.6789],\n",
       "          [ 0.6566,  0.1325, -0.5036,  ..., -1.9292,  1.4180,  0.0719],\n",
       "          ...,\n",
       "          [ 1.1746, -0.0249, -1.0666,  ..., -0.9283,  1.2044, -0.7485],\n",
       "          [ 1.2952,  0.0145, -0.4903,  ..., -1.0618,  0.9241,  0.0928],\n",
       "          [ 0.9810,  0.0274, -0.2624,  ..., -0.8447,  0.3484, -0.2251]],\n",
       "\n",
       "         [[ 0.6922,  0.4421,  0.2786,  ..., -0.2213,  0.2488,  1.8778],\n",
       "          [-0.1203, -0.2795, -0.0287,  ..., -0.2255,  0.5681,  1.2821],\n",
       "          [ 0.3923,  0.6569,  0.0967,  ..., -0.0928,  0.2676,  2.2244],\n",
       "          ...,\n",
       "          [ 0.4983, -0.2781,  0.9789,  ...,  0.5424,  1.0169,  1.1159],\n",
       "          [-0.8550,  0.5215, -0.2168,  ..., -0.1893,  0.9473,  0.7673],\n",
       "          [-0.9623,  0.1968,  1.1720,  ..., -0.4878,  0.9685, -0.6823]]]],\n",
       "       grad_fn=<PermuteBackward0>), tensor([[[[ 3.5844e-02,  4.5047e-02, -3.2349e-02,  ...,  1.1302e-01,\n",
       "            3.4111e-03, -7.3823e-02],\n",
       "          [ 2.4216e-02, -2.3168e-01,  7.9895e-02,  ..., -3.9604e-02,\n",
       "            1.3466e-01, -8.8950e-02],\n",
       "          [ 1.3182e-01,  4.2661e-02,  7.7161e-03,  ...,  1.1645e-01,\n",
       "            1.4362e-01, -3.0297e-02],\n",
       "          ...,\n",
       "          [ 5.1271e-02, -1.0683e-02,  1.3832e-01,  ...,  4.6392e-02,\n",
       "           -7.1929e-02,  3.3192e-01],\n",
       "          [ 4.0427e-02,  3.1326e-02,  7.5803e-03,  ...,  6.6739e-02,\n",
       "           -9.5121e-02,  2.2703e-02],\n",
       "          [-2.5431e-01,  9.7892e-02, -3.1401e-01,  ..., -5.9719e-02,\n",
       "            8.7119e-02, -1.5292e-01]],\n",
       "\n",
       "         [[ 4.6511e-01,  2.9299e-01, -2.6004e-01,  ..., -4.8896e-01,\n",
       "           -3.9832e-01,  5.1992e-02],\n",
       "          [ 3.6216e-01, -2.1008e-01,  2.3111e-01,  ...,  1.2695e-02,\n",
       "           -9.8508e-03, -1.9051e-01],\n",
       "          [ 5.0594e-01, -2.8700e-01, -3.7256e-02,  ...,  1.3379e-01,\n",
       "            1.4818e-01, -7.0381e-02],\n",
       "          ...,\n",
       "          [ 4.5519e-01,  2.2482e-01,  2.7037e-02,  ..., -7.6203e-02,\n",
       "            1.3018e-01,  1.1114e-01],\n",
       "          [ 4.3946e-01,  5.6357e-02, -2.8075e-01,  ...,  3.8331e-02,\n",
       "            2.8041e-01, -1.0264e-01],\n",
       "          [ 5.5593e-01, -7.1407e-02,  8.1585e-03,  ...,  7.4966e-02,\n",
       "            5.5887e-01, -1.0753e-01]],\n",
       "\n",
       "         [[-3.5264e-02,  5.7019e-02, -7.3887e-02,  ..., -1.2185e-02,\n",
       "           -8.9059e-02, -1.0759e-01],\n",
       "          [-1.4517e-01, -1.1093e-01, -3.1237e-01,  ...,  7.9633e-03,\n",
       "            1.0515e-01, -6.8205e-02],\n",
       "          [-2.9723e-01, -1.0871e-01, -3.7647e-01,  ..., -4.4998e-01,\n",
       "           -3.9353e-01, -5.8729e-02],\n",
       "          ...,\n",
       "          [ 3.6986e-01,  3.6383e-01,  8.8384e-02,  ..., -2.8474e-01,\n",
       "           -2.1211e-01, -4.2789e-01],\n",
       "          [ 5.1485e-01,  1.4955e-02,  1.9848e-01,  ...,  1.8763e-03,\n",
       "           -4.4840e-02, -1.9523e-01],\n",
       "          [-3.1557e-01, -7.7205e-02,  1.2237e-01,  ..., -9.9082e-03,\n",
       "           -9.1467e-02, -1.1786e-01]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-1.6929e-01, -1.6863e-01, -1.3075e-01,  ..., -4.5507e-02,\n",
       "            1.6489e-02, -4.9244e-03],\n",
       "          [-1.8289e-01, -7.3011e-02,  8.9083e-02,  ...,  2.8579e-01,\n",
       "            1.6520e-01,  3.9901e-01],\n",
       "          [ 1.8149e-02,  1.4864e-01,  1.3995e-01,  ..., -2.0187e-01,\n",
       "           -3.1164e-01,  1.8832e-01],\n",
       "          ...,\n",
       "          [ 1.8829e-01,  3.7086e-01,  2.2190e-02,  ..., -4.3816e-01,\n",
       "           -4.5873e-02, -2.3247e-01],\n",
       "          [-6.9325e-02,  2.2089e-01, -5.2158e-02,  ..., -5.8068e-05,\n",
       "           -4.4377e-02, -2.3630e-02],\n",
       "          [-1.4491e-01, -7.6687e-01, -1.0080e-02,  ...,  1.4490e-01,\n",
       "           -1.4273e-01,  1.1995e-02]],\n",
       "\n",
       "         [[ 1.1663e-01, -9.5174e-02, -8.0097e-02,  ...,  1.1799e-01,\n",
       "            1.4442e-01,  8.0563e-02],\n",
       "          [-4.0979e-01,  1.7985e-01,  6.2052e-02,  ..., -4.6011e-01,\n",
       "           -1.5909e-01,  1.6538e-01],\n",
       "          [ 1.0707e-01, -1.4439e-01, -3.8615e-02,  ..., -3.1468e-01,\n",
       "           -1.1422e-01,  1.1694e-01],\n",
       "          ...,\n",
       "          [-8.9047e-02, -5.7536e-02, -1.4755e-01,  ..., -4.0699e-01,\n",
       "           -1.5711e-01, -2.4647e-01],\n",
       "          [-5.0915e-02,  8.4827e-02,  5.0269e-02,  ..., -2.7516e-02,\n",
       "           -2.4682e-01, -1.2400e-01],\n",
       "          [-4.0786e-02, -6.3005e-02, -5.9178e-02,  ...,  1.9072e-01,\n",
       "            2.5695e-01,  1.3483e-01]],\n",
       "\n",
       "         [[-1.4032e-01, -2.1724e-01,  2.1163e-01,  ...,  1.1325e-02,\n",
       "           -1.6940e-01, -5.6700e-02],\n",
       "          [ 1.6501e-01,  1.4751e-01,  1.2316e-01,  ...,  5.2810e-01,\n",
       "            3.7520e-01,  5.6596e-02],\n",
       "          [-1.6167e-01, -9.5173e-02, -3.0007e-01,  ..., -1.3899e-01,\n",
       "            7.6265e-02,  1.5000e-01],\n",
       "          ...,\n",
       "          [ 1.9036e-02, -4.1325e-01, -5.3456e-02,  ...,  1.0850e-01,\n",
       "            3.1322e-01,  7.4104e-02],\n",
       "          [ 4.7434e-02,  8.3881e-02, -5.8630e-02,  ...,  5.6656e-02,\n",
       "           -1.4553e-01,  1.3967e-01],\n",
       "          [ 1.1568e-01, -1.1512e-01, -3.7184e-02,  ...,  9.3072e-02,\n",
       "            9.4980e-02,  5.2496e-03]]]], grad_fn=<PermuteBackward0>)), (tensor([[[[-2.3234e-01,  1.7469e+00, -1.3506e+00,  ...,  1.3035e+00,\n",
       "           -1.1436e+00,  1.3027e+00],\n",
       "          [ 3.7144e-01,  1.5962e+00, -9.8959e-01,  ..., -3.7702e-01,\n",
       "           -1.6238e+00,  4.9728e-01],\n",
       "          [ 1.5475e+00,  1.7371e+00, -1.1542e+00,  ...,  2.6362e-02,\n",
       "           -2.4185e+00,  6.0651e-02],\n",
       "          ...,\n",
       "          [ 1.1640e+00,  2.7594e-01, -3.8811e-01,  ...,  9.9817e-01,\n",
       "           -2.4461e-01, -2.2513e+00],\n",
       "          [ 9.1529e-02,  1.3589e+00,  5.4384e-02,  ...,  9.0259e-01,\n",
       "           -2.1145e+00, -1.0515e+00],\n",
       "          [-2.3758e-01,  4.2933e-01,  2.5327e-01,  ...,  6.8149e-01,\n",
       "           -7.0189e-01, -1.3681e-01]],\n",
       "\n",
       "         [[-1.4716e+00, -7.0062e-01, -8.4053e-01,  ..., -3.4795e-01,\n",
       "            9.0868e-01, -5.7943e-01],\n",
       "          [-6.7887e-01, -8.5541e-01, -1.9801e+00,  ..., -1.3131e+00,\n",
       "           -2.7207e-01,  1.9432e-01],\n",
       "          [-8.6449e-01,  1.6246e-02, -1.5927e+00,  ..., -1.5577e-01,\n",
       "            4.3638e-01,  3.4525e-01],\n",
       "          ...,\n",
       "          [-8.9806e-01,  3.8957e-01, -1.8324e+00,  ..., -3.9667e-01,\n",
       "           -9.1773e-01, -5.6276e-01],\n",
       "          [-5.7540e-01,  9.9297e-01, -1.6385e+00,  ..., -2.2806e-01,\n",
       "            3.7440e-01, -1.2218e+00],\n",
       "          [-5.3996e-01,  1.2337e+00, -1.4777e+00,  ..., -1.9485e-01,\n",
       "           -7.1104e-01, -6.1031e-01]],\n",
       "\n",
       "         [[ 3.3298e-01,  1.1988e-01, -2.9469e-02,  ..., -1.2560e+00,\n",
       "            1.5321e-01, -2.1866e-01],\n",
       "          [ 3.3311e-01,  6.0988e-01, -3.2191e-01,  ..., -1.2439e+00,\n",
       "           -7.9251e-02,  7.9439e-02],\n",
       "          [-1.3959e-01,  2.9871e-01, -1.1090e-01,  ..., -8.7817e-01,\n",
       "           -2.1968e-01,  5.7344e-01],\n",
       "          ...,\n",
       "          [-1.1917e-01, -1.2050e-01,  8.8956e-02,  ..., -1.0531e+00,\n",
       "            2.7097e-01,  2.8109e-01],\n",
       "          [-1.6320e-01, -2.1190e-01, -2.8066e-01,  ..., -1.1110e+00,\n",
       "           -1.1413e-01,  3.4103e-01],\n",
       "          [-1.9088e-01, -2.2637e-01, -1.9520e-01,  ..., -1.3609e+00,\n",
       "            7.7478e-03,  5.4753e-02]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 4.1961e-01, -5.4149e-01, -6.5129e-01,  ..., -1.3049e-01,\n",
       "            6.4293e-01, -1.0648e+00],\n",
       "          [-1.1139e+00,  1.7472e+00,  1.9236e+00,  ..., -5.2888e-01,\n",
       "           -8.8113e-01, -1.1274e+00],\n",
       "          [-3.5029e-02,  1.6498e+00,  1.5704e+00,  ...,  1.8705e+00,\n",
       "            4.4842e-01,  1.7473e-01],\n",
       "          ...,\n",
       "          [ 2.8044e-02,  1.5038e+00,  1.7172e+00,  ..., -1.5435e-01,\n",
       "           -7.6511e-01, -8.1489e-01],\n",
       "          [-6.5316e-01,  1.2453e+00,  1.9217e+00,  ...,  6.9396e-01,\n",
       "           -1.3474e+00, -1.1424e+00],\n",
       "          [-4.9458e-01,  1.4317e+00,  1.6529e+00,  ...,  5.9429e-01,\n",
       "           -1.7269e+00, -3.5413e-01]],\n",
       "\n",
       "         [[-1.2016e+00, -2.7812e+00,  1.2903e-01,  ...,  1.6795e+00,\n",
       "            1.5993e+00, -1.5680e+00],\n",
       "          [-2.4824e-01,  9.6243e-01, -6.0818e-01,  ..., -6.4028e-01,\n",
       "            7.3015e-01,  1.6934e-03],\n",
       "          [ 3.7342e-01,  7.0263e-01, -5.0011e-01,  ..., -6.0884e-01,\n",
       "            4.8429e-01, -9.1718e-02],\n",
       "          ...,\n",
       "          [-2.1023e-01, -1.1352e-01, -8.6533e-01,  ..., -4.3357e-01,\n",
       "            1.0432e+00,  2.3951e-01],\n",
       "          [-3.5406e-01,  3.6508e-01, -7.0875e-01,  ..., -1.5284e-01,\n",
       "            9.8004e-01,  3.0699e-01],\n",
       "          [-3.1061e-01,  2.6206e-01, -7.0317e-01,  ..., -3.8614e-01,\n",
       "            7.4612e-01, -1.0789e-01]],\n",
       "\n",
       "          ... (lengthy printed tensor values elided for brevity) ...\n",
       "\n",
       "         [[-6.7567e-01,  2.5791e-01, -4.4622e-02,  ...,  1.8188e-01,\n",
       "            3.5410e-02, -2.7977e-01],\n",
       "          [ 7.9076e-01, -6.6335e-01, -1.7585e+00,  ..., -1.6750e+00,\n",
       "           -1.0765e+00, -1.3595e+00],\n",
       "          [ 2.1679e+00, -6.9918e-02,  4.3477e-01,  ...,  1.2724e-01,\n",
       "           -4.9424e-01, -1.1511e+00],\n",
       "          ...,\n",
       "          [ 1.8131e+00, -2.6902e-01,  1.9496e-01,  ..., -7.7713e-02,\n",
       "           -1.8967e+00,  9.0249e-02],\n",
       "          [ 1.1712e+00, -1.0126e+00, -8.9854e-01,  ..., -8.4999e-01,\n",
       "           -1.0316e+00, -2.0615e+00],\n",
       "          [ 6.7570e-01,  3.8597e-01, -3.9110e-01,  ..., -8.8803e-02,\n",
       "           -3.7925e-01,  9.4492e-01]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-2.6750e-02,  1.1317e-01,  1.4411e-01,  ..., -9.5337e-02,\n",
       "            2.8758e-02,  1.7906e-01],\n",
       "          [ 5.2997e-01, -2.4245e-01, -8.6448e-03,  ...,  1.7977e+00,\n",
       "           -1.2685e+00,  6.6170e-01],\n",
       "          [ 1.3688e+00, -5.8446e-01, -1.0249e+00,  ...,  6.1781e-01,\n",
       "           -1.2555e+00,  1.8549e-01],\n",
       "          ...,\n",
       "          [-4.0208e-01, -3.9943e-01,  9.0350e-01,  ..., -8.4164e-02,\n",
       "            2.9849e-01,  1.3064e+00],\n",
       "          [ 8.7521e-01, -5.4618e-01,  6.0006e-01,  ..., -4.2599e-01,\n",
       "            9.5319e-01,  4.6232e-02],\n",
       "          [ 6.7025e-01, -2.4473e-02,  6.4640e-01,  ...,  9.7068e-01,\n",
       "            5.1276e-01,  1.2734e-01]],\n",
       "\n",
       "         [[-3.0175e+00,  3.9929e-01, -3.6378e-02,  ..., -4.7122e-01,\n",
       "           -3.4804e-01,  1.2437e+00],\n",
       "          [ 5.1598e+00, -6.4113e-01, -1.2113e+00,  ..., -5.4299e-01,\n",
       "           -6.3423e-01, -2.7112e+00],\n",
       "          [ 3.9647e+00, -4.5005e-01, -8.1767e-01,  ...,  1.3302e+00,\n",
       "            1.0222e+00, -3.6523e-01],\n",
       "          ...,\n",
       "          [ 6.9698e+00,  1.0402e-01, -1.2673e+00,  ...,  1.4771e+00,\n",
       "           -3.9454e-01, -1.3275e+00],\n",
       "          [ 5.9740e+00, -3.1729e-01, -3.0471e-01,  ..., -1.6971e+00,\n",
       "            1.9964e+00, -5.9249e-01],\n",
       "          [ 4.3904e+00,  3.9197e-01, -6.1715e-01,  ..., -1.7761e+00,\n",
       "            3.8099e-01,  9.8360e-02]],\n",
       "\n",
       "         [[-1.4068e-02, -2.3846e-01,  2.5290e-02,  ..., -1.6074e-01,\n",
       "            3.3649e-01,  8.6691e-02],\n",
       "          [ 2.8667e-02, -2.1868e+00,  1.7570e-01,  ...,  3.2765e-01,\n",
       "            3.8849e-01, -1.7206e+00],\n",
       "          [-2.3775e-02, -1.9333e+00, -1.1854e-01,  ..., -5.0054e-01,\n",
       "            1.4996e+00, -2.0566e-01],\n",
       "          ...,\n",
       "          [-1.6754e-01, -7.2157e-01,  6.8358e-01,  ...,  2.4385e-01,\n",
       "            1.1955e+00, -2.1362e+00],\n",
       "          [-1.7204e-01, -1.5373e+00,  4.9301e-01,  ...,  1.1557e-01,\n",
       "            1.8178e+00, -1.0164e+00],\n",
       "          [ 2.1786e-01, -1.8928e+00,  4.4174e-01,  ...,  4.2819e-01,\n",
       "            1.5182e-01, -2.0864e-01]]]], grad_fn=<PermuteBackward0>), tensor([[[[-0.0283, -0.0237, -0.0065,  ..., -0.0032, -0.0248,  0.3515],\n",
       "          [ 0.8264,  0.2939, -0.5176,  ..., -0.2927, -0.2171,  0.0181],\n",
       "          [ 1.9226, -0.7101,  0.3985,  ...,  0.1594,  0.7347, -0.2097],\n",
       "          ...,\n",
       "          [ 0.3205, -0.6643,  0.0532,  ..., -0.3513, -0.3939, -1.5316],\n",
       "          [ 0.0561, -0.2875, -0.6101,  ...,  1.2166,  0.5898, -1.2971],\n",
       "          [ 0.1016,  0.4254, -0.1121,  ...,  0.1045,  0.3800, -1.0232]],\n",
       "\n",
       "         [[ 0.0052, -0.0295,  0.0151,  ..., -0.0267,  0.0140,  0.0122],\n",
       "          [-0.5628, -0.9254,  0.9794,  ..., -0.2008,  1.3605, -0.3241],\n",
       "          [ 0.7993,  0.5138, -0.2952,  ..., -0.7700,  0.4493, -0.8436],\n",
       "          ...,\n",
       "          [-0.0907,  0.4751,  1.4920,  ..., -0.6118,  0.6339, -0.4834],\n",
       "          [ 1.1835, -0.9603,  1.0771,  ..., -0.7290,  0.1659, -1.8319],\n",
       "          [-0.7705,  0.0071, -1.1086,  ..., -0.2116,  1.5475, -1.0844]],\n",
       "\n",
       "         [[-0.0587,  0.0065, -0.0385,  ..., -0.0367,  0.0108, -0.0737],\n",
       "          [-0.3782,  0.4603, -0.0197,  ..., -0.0948, -0.1734, -0.0478],\n",
       "          [-1.0508, -0.9034, -0.1986,  ...,  0.7021,  0.1026,  0.6465],\n",
       "          ...,\n",
       "          [ 0.6918, -0.7151,  0.4399,  ..., -0.5258,  0.4609,  0.3265],\n",
       "          [-0.5982, -0.3842, -0.2050,  ..., -0.3919, -0.2597, -0.3368],\n",
       "          [-0.2127,  0.2818,  1.1315,  ..., -0.4063,  0.3917,  0.4088]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-0.3268, -0.1926, -0.0725,  ..., -0.4884,  0.2164,  0.0924],\n",
       "          [ 2.4077, -2.1106,  0.0814,  ...,  2.1166,  0.7840, -0.3542],\n",
       "          [ 0.7914, -2.0034, -1.2725,  ...,  0.1668,  0.2048, -0.0149],\n",
       "          ...,\n",
       "          [ 0.1624,  0.0656,  0.3339,  ...,  1.6880,  0.5858, -0.4577],\n",
       "          [ 0.3699, -2.1781, -0.6363,  ...,  1.6692, -0.0237,  0.6235],\n",
       "          [ 1.1835, -1.2606, -0.3178,  ...,  0.8860, -0.2557, -1.0126]],\n",
       "\n",
       "         [[-0.0743, -0.1376, -0.0477,  ..., -0.1892, -0.1411,  0.1259],\n",
       "          [-0.8208, -0.8386,  0.6731,  ...,  0.6332,  0.0199, -0.7615],\n",
       "          [ 0.2827,  0.3740, -0.2432,  ...,  0.1827, -0.3376,  0.5142],\n",
       "          ...,\n",
       "          [ 0.9158,  0.1738,  0.1931,  ..., -0.3132, -0.0828,  0.6322],\n",
       "          [-1.2725,  0.6167, -0.2177,  ..., -0.5662, -0.9483,  0.7020],\n",
       "          [-0.4237, -0.4396, -0.7444,  ..., -0.1120, -0.1108,  0.0457]],\n",
       "\n",
       "         [[-0.0376, -0.0407,  0.1040,  ...,  0.0808, -0.0425,  0.0137],\n",
       "          [ 0.1663, -0.5093,  0.4754,  ..., -0.8001, -0.2647, -0.0260],\n",
       "          [ 0.8517, -0.0424,  0.2084,  ..., -0.4614, -0.3543, -0.4421],\n",
       "          ...,\n",
       "          [ 0.8837, -0.2089, -1.3849,  ...,  0.0321,  0.3742,  0.3700],\n",
       "          [ 1.5021,  0.3024, -1.6816,  ...,  0.2755, -0.4314,  1.2684],\n",
       "          [ 0.8986, -0.4031, -0.7551,  ...,  0.2049,  0.5642,  0.3766]]]],\n",
       "       grad_fn=<PermuteBackward0>)), (tensor([[[[-3.4242e-01,  8.8585e-01, -1.5804e-01,  ...,  1.1261e+00,\n",
       "           -1.9518e-01,  1.4177e-01],\n",
       "          [-1.9220e+00, -3.5766e+00,  1.0735e+00,  ..., -2.9989e+00,\n",
       "            2.4456e-01,  1.6964e+00],\n",
       "          [ 2.0634e-01, -4.5545e+00, -3.9479e-01,  ..., -2.3682e+00,\n",
       "            7.5262e-01,  1.9643e+00],\n",
       "          ...,\n",
       "          [ 4.1849e-02, -3.6175e+00, -3.7537e-01,  ..., -3.7409e+00,\n",
       "           -2.5598e+00,  1.3602e+00],\n",
       "          [ 9.7942e-01, -4.3081e+00, -8.1837e-01,  ..., -3.9368e+00,\n",
       "           -3.5577e-01,  6.4094e-01],\n",
       "          [-1.1241e+00, -4.2882e+00,  9.7364e-01,  ..., -3.9289e+00,\n",
       "            3.2482e-01,  1.0470e+00]],\n",
       "\n",
       "         [[ 4.3837e-02,  8.5526e-01, -6.5018e-01,  ..., -4.8066e-02,\n",
       "            2.9846e-01,  1.3144e-02],\n",
       "          [ 1.8581e+00, -8.6762e-01,  1.1911e-01,  ...,  1.4102e+00,\n",
       "           -1.5516e+00, -8.9818e-01],\n",
       "          [ 3.9744e-01, -7.4263e-01, -8.7026e-01,  ...,  2.1461e+00,\n",
       "            6.7020e-01, -5.7750e-01],\n",
       "          ...,\n",
       "          [-1.8549e+00,  4.0110e-01,  1.0519e+00,  ..., -6.1304e-01,\n",
       "            1.0524e+00, -1.9586e+00],\n",
       "          [-1.6497e+00,  1.7422e+00,  2.8445e+00,  ..., -4.2254e-01,\n",
       "            1.4167e+00,  2.2797e-02],\n",
       "          [ 3.9260e-01,  5.7385e-01,  6.0347e-01,  ...,  1.8517e+00,\n",
       "            1.6710e-01, -8.4129e-03]],\n",
       "\n",
       "         [[-3.0691e-01,  1.4184e-01, -9.7711e-01,  ..., -3.5239e-01,\n",
       "           -6.5329e-02, -1.5691e-01],\n",
       "          [ 5.4449e-01,  2.6089e-01,  3.3700e+00,  ...,  5.8992e-01,\n",
       "            3.0752e-01,  1.6272e-01],\n",
       "          [ 1.8183e-01, -4.2966e-01,  3.6656e+00,  ...,  6.5348e-01,\n",
       "            1.6247e-01,  1.1807e+00],\n",
       "          ...,\n",
       "          [ 6.0342e-01, -2.8372e-02,  4.0457e+00,  ...,  1.4485e-01,\n",
       "           -3.0530e-01, -1.8585e-02],\n",
       "          [ 8.4563e-01, -6.5224e-02,  3.8245e+00,  ...,  4.2784e-02,\n",
       "            8.7349e-01,  8.6994e-01],\n",
       "          [-4.1459e-01,  9.7909e-01,  2.9803e+00,  ..., -3.9050e-01,\n",
       "           -1.2682e-01,  4.7121e-02]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 3.7623e-01,  8.8415e-02, -6.9547e-02,  ..., -4.7736e-02,\n",
       "            2.2840e-01,  1.9032e-02],\n",
       "          [ 7.1144e-01,  1.8763e-01, -3.9937e-01,  ..., -4.9091e-01,\n",
       "           -1.2776e+00,  6.4044e-01],\n",
       "          [-3.3112e-01,  1.5726e+00, -1.1475e-01,  ...,  1.2293e+00,\n",
       "           -3.2792e-01,  1.3483e+00],\n",
       "          ...,\n",
       "          [-1.2218e+00,  1.0106e+00,  1.5841e+00,  ...,  2.4095e-01,\n",
       "            1.8157e+00,  2.5420e-01],\n",
       "          [-7.0075e-01, -2.0352e+00,  3.5377e-01,  ..., -2.1764e-01,\n",
       "            1.2933e+00,  3.9608e-01],\n",
       "          [-1.2331e+00, -2.5181e-01,  9.9036e-01,  ..., -7.5638e-02,\n",
       "            1.3286e+00,  6.3743e-01]],\n",
       "\n",
       "         [[ 1.9031e-01,  5.9438e-02,  3.3524e-01,  ...,  4.2277e-01,\n",
       "            1.9400e-02,  2.2399e-01],\n",
       "          [ 5.9327e-01,  2.8811e-01, -1.4083e-01,  ..., -1.5039e+00,\n",
       "           -2.3323e-01,  3.1527e-01],\n",
       "          [ 1.1468e+00, -2.1674e-01,  1.2189e+00,  ..., -4.9551e-01,\n",
       "            2.9952e-01,  4.2660e-01],\n",
       "          ...,\n",
       "          [ 1.2327e+00,  1.9132e-01,  9.7667e-01,  ..., -3.9745e-01,\n",
       "           -8.6595e-01,  1.6085e-01],\n",
       "          [ 1.5555e+00,  1.2641e-01,  1.5087e+00,  ..., -2.6383e-01,\n",
       "           -1.4203e+00, -3.3963e-01],\n",
       "          [ 2.5104e+00, -4.2117e-01,  9.7204e-01,  ...,  7.9408e-02,\n",
       "           -1.9639e+00,  1.2759e+00]],\n",
       "\n",
       "         [[-3.0269e+00,  5.5620e-01,  5.7562e-01,  ..., -9.3499e-01,\n",
       "            3.1438e-01,  1.8537e-01],\n",
       "          [ 6.4035e+00, -2.6481e-01, -1.7831e+00,  ...,  9.5200e-01,\n",
       "           -1.3967e+00, -1.9981e-02],\n",
       "          [ 6.5658e+00, -5.8532e-01, -3.0053e+00,  ...,  1.1486e+00,\n",
       "           -8.4129e-01, -1.1981e+00],\n",
       "          ...,\n",
       "          [ 8.7470e+00,  4.9102e-01, -8.1829e-01,  ...,  9.2144e-01,\n",
       "           -1.9827e-01, -2.1748e+00],\n",
       "          [ 8.8249e+00,  7.7651e-01, -9.9607e-01,  ...,  1.0463e-01,\n",
       "           -8.5053e-01, -1.0812e+00],\n",
       "          [ 9.4365e+00, -1.1869e+00, -5.3457e-01,  ...,  2.5683e+00,\n",
       "            4.1110e-01, -7.9529e-01]]]], grad_fn=<PermuteBackward0>), tensor([[[[ 4.2588e-02, -5.0356e-02,  8.1064e-03,  ..., -7.2504e-02,\n",
       "            9.6369e-03, -9.2134e-02],\n",
       "          [-4.3344e-01,  1.3342e-01, -4.7552e-02,  ...,  4.4079e-01,\n",
       "            2.6153e-01,  1.0634e+00],\n",
       "          [ 3.0987e-01, -1.2253e-01, -8.2554e-01,  ..., -5.1116e-01,\n",
       "           -7.1061e-01,  9.2578e-01],\n",
       "          ...,\n",
       "          [-8.3976e-01,  2.3753e-01,  2.7464e-01,  ...,  2.1087e-01,\n",
       "            2.5150e-01,  4.4114e-01],\n",
       "          [-1.1380e+00,  2.7989e-01, -2.8601e-01,  ..., -3.8905e-02,\n",
       "            6.6291e-01, -1.5787e-01],\n",
       "          [ 2.5615e-01, -1.8164e-01,  3.2558e-01,  ..., -7.4514e-02,\n",
       "            1.2012e-01, -5.7117e-01]],\n",
       "\n",
       "         [[ 7.9494e-02,  1.9629e-02, -1.4942e-02,  ..., -3.9482e-02,\n",
       "            2.1056e-02, -1.7007e-02],\n",
       "          [ 4.6345e-01, -1.4725e-01,  8.2258e-01,  ...,  3.4143e-01,\n",
       "            6.6488e-02, -1.0046e+00],\n",
       "          [ 4.0347e-01, -8.7195e-01, -1.1793e+00,  ...,  1.0580e+00,\n",
       "            5.7153e-01,  5.3592e-01],\n",
       "          ...,\n",
       "          [-6.6896e-01,  1.4602e-01,  1.4269e+00,  ..., -1.1985e-01,\n",
       "            1.1612e+00,  2.0802e+00],\n",
       "          [-2.0435e-01,  2.5427e-01,  8.5200e-01,  ..., -8.8234e-01,\n",
       "            9.5405e-01,  4.8340e-01],\n",
       "          [ 8.2470e-01,  1.2348e-01, -1.8963e-02,  ..., -2.7635e-01,\n",
       "            3.8721e-01,  7.9089e-01]],\n",
       "\n",
       "         [[ 6.5061e-02,  1.7394e-02, -1.3197e-02,  ...,  1.9595e-02,\n",
       "           -6.2144e-02, -6.3763e-02],\n",
       "          [-1.1359e-02,  6.8293e-01, -8.8521e-01,  ..., -8.8195e-01,\n",
       "           -9.2533e-01,  1.0995e+00],\n",
       "          [ 7.2526e-01, -6.4550e-01, -1.0321e-01,  ..., -6.4715e-01,\n",
       "           -5.5848e-01,  1.3314e+00],\n",
       "          ...,\n",
       "          [-6.7675e-01, -1.5009e-01, -6.6685e-01,  ..., -4.3112e-01,\n",
       "           -8.0608e-01, -2.0868e-01],\n",
       "          [-6.1433e-01, -2.4866e-01, -3.9539e-01,  ..., -1.8268e+00,\n",
       "            3.9165e-01, -1.5418e+00],\n",
       "          [-3.6172e-01, -6.2750e-01, -3.1816e-01,  ..., -9.2306e-02,\n",
       "            2.7719e-01,  3.6235e-02]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-1.2369e-03,  2.0590e-02,  1.8059e-02,  ..., -6.7566e-02,\n",
       "           -2.1130e-02,  1.0344e-02],\n",
       "          [-2.6505e-01,  6.8229e-02, -5.8064e-01,  ...,  2.0919e+00,\n",
       "           -2.0487e-01, -6.2631e-02],\n",
       "          [-1.1090e+00, -4.0114e-01,  6.4974e-01,  ...,  1.8533e+00,\n",
       "           -5.9498e-01, -1.5189e+00],\n",
       "          ...,\n",
       "          [-1.7999e-01, -1.3203e+00, -4.0722e-01,  ...,  5.0378e-01,\n",
       "           -6.4837e-01,  2.0433e-01],\n",
       "          [ 1.6615e-01, -6.4841e-02,  4.4674e-01,  ..., -1.3760e+00,\n",
       "           -1.5886e+00,  1.6740e+00],\n",
       "          [-2.3218e-01, -1.0700e+00, -2.0870e-01,  ...,  4.2789e-01,\n",
       "            1.3457e-01,  4.9070e-01]],\n",
       "\n",
       "         [[ 4.5884e-02, -1.3652e-02,  3.3538e-02,  ...,  4.2297e-02,\n",
       "           -1.0650e-02,  1.3725e-02],\n",
       "          [ 6.7968e-01, -6.1345e-01, -5.1771e-01,  ...,  3.9566e-01,\n",
       "           -6.3267e-01, -7.4087e-01],\n",
       "          [ 3.2293e-01, -5.0785e-01, -2.0838e+00,  ...,  1.2406e+00,\n",
       "            3.0909e-01,  7.6786e-01],\n",
       "          ...,\n",
       "          [ 6.6215e-01,  1.4883e+00, -1.3764e+00,  ..., -8.5867e-01,\n",
       "           -9.7664e-01,  7.5939e-02],\n",
       "          [-3.7327e-01,  1.0781e+00, -1.1749e+00,  ...,  1.5870e-02,\n",
       "           -8.2584e-01, -8.5889e-01],\n",
       "          [-1.0721e-01, -2.9212e-01, -1.0636e+00,  ...,  4.9721e-01,\n",
       "           -3.1658e-02, -7.9182e-02]],\n",
       "\n",
       "         [[ 7.6791e-02, -2.0113e-01, -8.1943e-02,  ..., -2.3764e-02,\n",
       "            2.0464e-01, -5.6837e-02],\n",
       "          [ 9.9647e-02, -2.1344e+00,  4.2548e-02,  ...,  4.3812e-01,\n",
       "           -7.9336e-01,  4.1660e-01],\n",
       "          [-1.0473e-01, -8.4764e-01, -1.0118e-01,  ...,  7.0098e-01,\n",
       "           -6.8365e-02, -1.1925e-01],\n",
       "          ...,\n",
       "          [ 8.6354e-02,  2.0148e+00, -7.8215e-01,  ...,  1.2433e+00,\n",
       "           -1.2465e+00,  8.4485e-02],\n",
       "          [ 2.1874e-01,  1.7080e+00, -7.7510e-01,  ...,  2.0696e-01,\n",
       "            1.1354e-02,  1.2248e-01],\n",
       "          [-9.0595e-01, -7.8929e-01,  2.8048e-02,  ...,  1.5975e-01,\n",
       "           -3.9888e-01,  8.6554e-01]]]], grad_fn=<PermuteBackward0>)), (tensor([[[[ 1.0446, -0.2861, -0.1417,  ...,  0.6163,  0.7392, -0.3206],\n",
       "          [-3.9079, -2.3097,  2.2869,  ..., -1.0490, -3.9810,  0.3166],\n",
       "          [-4.8613, -0.9516,  2.6062,  ..., -0.0303, -3.9882, -0.5383],\n",
       "          ...,\n",
       "          [-3.5566, -2.6193,  0.7828,  ...,  0.6872, -4.4898, -0.6281],\n",
       "          [-4.2590, -1.7552,  2.8268,  ..., -0.6523, -4.0929, -0.2955],\n",
       "          [-4.9762, -1.9708,  0.0080,  ..., -0.3365, -3.7900, -0.1139]],\n",
       "\n",
       "         [[-0.1387, -0.0569,  0.1582,  ..., -0.0465, -0.8847, -0.2043],\n",
       "          [-0.5542, -0.0729, -0.3130,  ..., -0.0560, -0.7031,  1.6099],\n",
       "          [-1.3839,  1.4181, -0.2936,  ...,  1.3373, -1.0466,  1.9511],\n",
       "          ...,\n",
       "          [-1.8806,  0.3803, -1.2217,  ...,  1.7803,  0.9103,  0.4689],\n",
       "          [-1.2040,  1.6865, -0.4972,  ...,  0.5542, -0.4544,  1.0700],\n",
       "          [-2.2268,  0.3460, -1.5116,  ...,  0.5357, -0.8997,  0.5252]],\n",
       "\n",
       "         [[ 0.1924,  0.3075,  1.1360,  ..., -0.4682,  0.4283, -0.5042],\n",
       "          [ 1.1163, -0.8514, -1.3363,  ..., -0.5048, -2.3173,  1.4079],\n",
       "          [-0.4352, -0.6518, -1.4657,  ...,  0.3762, -3.2430,  2.4583],\n",
       "          ...,\n",
       "          [-1.2546, -1.4138, -1.6076,  ..., -2.2605, -1.6244, -0.5993],\n",
       "          [-0.4396, -2.4026, -0.8960,  ..., -1.1734, -1.3114,  0.8956],\n",
       "          [ 0.6041, -1.4925, -0.8934,  ..., -1.4043, -2.3182,  1.5456]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-0.1552,  0.0680, -0.2229,  ..., -0.0112,  0.1523,  0.0379],\n",
       "          [-1.4715, -0.2699, -1.0010,  ...,  0.9816, -0.0131, -0.5940],\n",
       "          [-2.8031, -0.0749, -0.5765,  ...,  1.4030,  0.5713, -0.3813],\n",
       "          ...,\n",
       "          [-0.7421,  0.1682,  1.1091,  ...,  0.2662, -1.0974, -1.7564],\n",
       "          [ 0.5485,  0.4741,  1.4970,  ..., -0.3822, -0.6439, -1.4745],\n",
       "          [-1.9331,  1.2431, -0.7142,  ...,  0.8943, -0.3290, -0.2887]],\n",
       "\n",
       "         [[-0.3520, -2.1820,  0.1305,  ..., -0.0781, -0.0452,  0.9126],\n",
       "          [-0.1676,  0.7424,  0.8457,  ..., -0.0430, -0.6897,  0.9972],\n",
       "          [-1.0619,  3.2376,  2.5831,  ...,  0.4896,  0.5569,  1.1650],\n",
       "          ...,\n",
       "          [ 0.1921,  4.7669, -0.9619,  ...,  0.1920,  0.0281, -1.6348],\n",
       "          [-0.3346,  4.0043,  0.8320,  ...,  1.9739,  0.4835,  0.2861],\n",
       "          [-0.4076,  1.6655, -1.2985,  ...,  1.3795, -0.0910,  1.3841]],\n",
       "\n",
       "         [[ 0.3705,  0.0804, -0.1422,  ...,  0.6539,  0.1286,  0.2599],\n",
       "          [ 0.2785, -0.1845, -0.7767,  ..., -0.6425,  0.7165, -1.3394],\n",
       "          [-0.0426, -1.2866, -0.2084,  ...,  0.4318,  1.4718,  0.1445],\n",
       "          ...,\n",
       "          [-2.9565, -2.3736, -1.0456,  ...,  0.5869, -0.2452, -0.1996],\n",
       "          [-2.0673, -1.9636,  0.6297,  ..., -1.2112, -0.6369,  2.2856],\n",
       "          [-1.5174, -0.1974,  1.0155,  ...,  0.1546,  0.5188,  1.0438]]]],\n",
       "       grad_fn=<PermuteBackward0>), tensor([[[[-3.2128e-02,  4.2769e-02, -6.2959e-02,  ..., -2.4990e-02,\n",
       "           -5.5760e-04,  1.9665e-02],\n",
       "          [ 5.3965e-01,  1.5681e-02, -8.6195e-01,  ..., -6.3801e-01,\n",
       "            3.3483e-02,  2.0060e-01],\n",
       "          [-2.2823e-02,  1.7671e-01, -1.9057e-01,  ..., -6.8474e-01,\n",
       "           -7.3559e-01,  5.6330e-01],\n",
       "          ...,\n",
       "          [-5.3655e-01, -6.8140e-01, -2.7338e-01,  ..., -8.9929e-01,\n",
       "            6.7783e-01, -7.7594e-02],\n",
       "          [-6.6500e-02, -2.1919e-01,  7.2967e-01,  ...,  3.1051e-01,\n",
       "           -9.1984e-01,  1.6034e+00],\n",
       "          [-4.6006e-02,  1.7540e-01, -1.4991e-01,  ..., -3.8299e-01,\n",
       "           -1.2224e-01,  3.0644e-01]],\n",
       "\n",
       "         [[ 8.2082e-03, -2.1715e-02,  3.5759e-02,  ...,  2.5118e-02,\n",
       "           -4.6394e-02,  1.3449e-02],\n",
       "          [-5.6012e-01, -4.7913e-01, -7.3465e-01,  ...,  2.4955e-01,\n",
       "           -2.2302e-01,  1.7229e-02],\n",
       "          [-2.6669e-01, -5.7561e-02, -1.0933e+00,  ...,  6.9103e-01,\n",
       "            8.8651e-01,  2.3362e-01],\n",
       "          ...,\n",
       "          [ 1.4549e+00,  6.2701e-01,  3.4483e-01,  ..., -2.6119e-01,\n",
       "           -1.8188e-01,  2.0104e-01],\n",
       "          [ 9.1125e-01, -5.0901e-01,  1.7518e-01,  ...,  4.0395e-02,\n",
       "           -8.2197e-03, -6.1320e-01],\n",
       "          [-3.7897e-01, -1.3483e-01,  9.1874e-02,  ...,  3.8443e-01,\n",
       "            4.2697e-01, -3.3593e-01]],\n",
       "\n",
       "         [[ 5.1188e-02, -3.5194e-02,  2.7533e-02,  ...,  1.4181e-02,\n",
       "           -6.4940e-03,  2.2787e-02],\n",
       "          [ 5.1446e-01, -1.3208e+00, -1.0446e+00,  ...,  1.7527e-02,\n",
       "            1.7641e+00, -3.5737e-01],\n",
       "          [-2.7716e-01,  6.2415e-02, -1.8526e+00,  ..., -4.1313e-01,\n",
       "           -5.6179e-01,  1.0078e+00],\n",
       "          ...,\n",
       "          [ 4.1809e-01, -5.0762e-02, -3.4584e-01,  ..., -1.2960e+00,\n",
       "           -5.4343e-01, -1.4812e-01],\n",
       "          [ 1.5966e-01, -1.5909e-01, -2.6502e-01,  ..., -2.6891e-01,\n",
       "            1.3304e+00, -6.3225e-01],\n",
       "          [ 4.6884e-01, -2.5218e-01, -6.3545e-01,  ..., -2.6028e+00,\n",
       "            3.5557e-02, -2.5807e-01]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-1.8817e-01,  7.5630e-02,  5.1482e-02,  ...,  6.3044e-02,\n",
       "            4.5983e-02, -1.2272e-01],\n",
       "          [-4.8166e-01, -8.0081e-01,  4.7553e-01,  ...,  9.2072e-01,\n",
       "            1.4889e+00, -4.9100e-01],\n",
       "          [ 3.1190e-01, -1.7164e+00, -6.4195e-01,  ..., -3.0160e-01,\n",
       "            9.8333e-01, -1.3910e+00],\n",
       "          ...,\n",
       "          [ 2.6294e-01,  1.7654e-01, -2.2927e-01,  ..., -5.6709e-01,\n",
       "           -4.2602e-01,  1.8268e-01],\n",
       "          [-1.2287e+00, -8.8129e-01, -4.9618e-01,  ...,  5.7656e-01,\n",
       "           -9.3970e-02,  9.1813e-01],\n",
       "          [-5.2435e-01, -5.0482e-01, -6.5543e-01,  ..., -9.4713e-02,\n",
       "            2.7462e-01, -4.3347e-01]],\n",
       "\n",
       "         [[-5.8365e-01, -2.0280e-02,  4.3795e-02,  ..., -1.2728e-02,\n",
       "            1.4847e-02, -1.1488e-02],\n",
       "          [ 2.1189e-01,  6.3309e-01, -8.5212e-01,  ...,  1.7114e-02,\n",
       "           -1.6814e-01,  3.3735e-01],\n",
       "          [-5.0398e-01,  1.2201e-01, -9.9508e-01,  ...,  1.0425e+00,\n",
       "            5.1320e-01,  8.6971e-01],\n",
       "          ...,\n",
       "          [-1.4073e+00,  4.5373e-01, -1.8495e-01,  ...,  1.0106e-01,\n",
       "            1.1485e+00, -5.4966e-02],\n",
       "          [-7.8669e-01, -4.6794e-01,  8.1497e-02,  ..., -6.6191e-01,\n",
       "            9.3902e-01, -4.7145e-01],\n",
       "          [-2.6530e+00, -1.3969e-01,  2.4383e-01,  ..., -2.2311e-01,\n",
       "            4.6453e-01,  3.6028e-01]],\n",
       "\n",
       "         [[ 1.8879e-02,  8.3848e-02, -5.0716e-02,  ...,  5.5664e-02,\n",
       "            2.9355e-02, -4.1400e-02],\n",
       "          [-3.1204e-01,  3.7567e-01,  3.4224e-01,  ...,  7.1520e-01,\n",
       "            7.5320e-01, -1.1299e-01],\n",
       "          [-1.0119e+00,  3.3448e-01, -1.2705e+00,  ..., -8.8392e-01,\n",
       "           -2.3355e+00, -2.2953e-01],\n",
       "          ...,\n",
       "          [ 1.9422e-01,  1.2625e+00,  6.1021e-01,  ...,  1.0141e+00,\n",
       "           -1.3892e-01, -3.8717e-01],\n",
       "          [-1.6888e+00,  1.0561e+00,  6.2911e-01,  ..., -7.9248e-01,\n",
       "           -1.2793e+00,  2.5450e-02],\n",
       "          [-6.8985e-01,  3.8036e-01,  1.1172e+00,  ..., -3.2168e-01,\n",
       "           -9.4463e-01,  8.1670e-01]]]], grad_fn=<PermuteBackward0>)), (tensor([[[[-3.0739e-02, -2.3328e+00,  1.6444e-01,  ..., -2.4874e-01,\n",
       "           -2.1955e-01,  6.7818e-02],\n",
       "          [ 6.6502e-01,  3.8897e+00,  2.5626e-01,  ..., -1.2447e+00,\n",
       "           -5.1476e-01,  1.7492e-01],\n",
       "          [ 2.3555e-01,  4.3144e+00,  1.5203e+00,  ..., -1.0305e+00,\n",
       "           -2.7401e-01, -2.9700e-01],\n",
       "          ...,\n",
       "          [-7.4795e-01,  6.0315e+00,  1.4481e+00,  ..., -1.7219e+00,\n",
       "            1.5043e-01,  3.6440e-02],\n",
       "          [-1.1906e+00,  6.0815e+00,  7.5815e-01,  ...,  7.6884e-03,\n",
       "            1.6461e+00, -4.0052e-01],\n",
       "          [-7.7746e-01,  4.0435e+00,  8.6778e-01,  ..., -1.0730e+00,\n",
       "            3.5681e-01, -7.8314e-01]],\n",
       "\n",
       "         [[-8.1230e-01,  2.0313e-01,  4.6825e-01,  ..., -5.1517e-01,\n",
       "            1.0709e+00,  1.1254e+00],\n",
       "          [ 1.9943e-01,  7.3408e-02,  1.8181e+00,  ..., -3.0854e-01,\n",
       "            1.3158e+00,  6.8891e-02],\n",
       "          [ 3.2874e-01, -6.4358e-01,  2.3493e+00,  ..., -1.5555e-01,\n",
       "            2.4417e+00,  1.8911e-01],\n",
       "          ...,\n",
       "          [ 6.9652e-01,  1.3289e+00, -4.7835e-01,  ..., -1.1337e+00,\n",
       "            6.5288e-01, -2.0380e+00],\n",
       "          [ 2.1073e-01,  1.0003e+00,  1.2394e+00,  ..., -6.3918e-01,\n",
       "            8.0475e-01, -1.5733e+00],\n",
       "          [ 3.1132e-01,  2.8403e-01,  2.9093e-01,  ..., -1.5865e+00,\n",
       "            3.1738e+00, -7.4550e-01]],\n",
       "\n",
       "         [[-8.3052e-01,  4.9202e-01,  1.5700e-02,  ...,  5.0540e-01,\n",
       "           -2.3221e-01,  1.1760e+00],\n",
       "          [ 1.8285e+00, -1.1218e+00, -6.0110e-01,  ...,  1.7362e-01,\n",
       "            1.3459e+00,  1.1774e-01],\n",
       "          [ 1.6429e+00, -2.1772e+00,  1.7248e+00,  ...,  3.0036e-01,\n",
       "            7.7876e-01,  9.9057e-01],\n",
       "          ...,\n",
       "          [ 1.0010e+00, -1.7313e+00,  1.9045e+00,  ...,  8.0702e-01,\n",
       "            5.2974e-01, -6.8052e-01],\n",
       "          [ 1.5992e+00, -2.7915e+00,  1.0877e+00,  ...,  7.4601e-01,\n",
       "            3.0331e-02, -1.3237e+00],\n",
       "          [ 1.6438e+00, -9.9109e-01,  9.2324e-01,  ...,  1.1409e+00,\n",
       "            5.9684e-01, -3.3008e-03]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-3.0107e-01, -1.1655e-01,  1.3441e-01,  ...,  1.8965e-01,\n",
       "            1.7320e+00, -2.8680e+00],\n",
       "          [ 1.9154e+00,  1.2091e+00,  2.8646e-01,  ...,  1.3748e-01,\n",
       "           -3.8644e+00,  3.9812e+00],\n",
       "          [ 9.2973e-01, -1.0160e-01, -1.3099e-02,  ..., -3.6517e-01,\n",
       "           -4.1451e+00,  4.7426e+00],\n",
       "          ...,\n",
       "          [ 5.7876e-01, -5.6903e-01, -9.7980e-01,  ..., -1.5678e+00,\n",
       "           -5.4191e+00,  5.7578e+00],\n",
       "          [-1.9895e-01, -5.5637e-01, -4.0058e-01,  ..., -1.3523e+00,\n",
       "           -5.3991e+00,  5.7263e+00],\n",
       "          [ 1.0074e-01,  9.1534e-01,  5.5960e-01,  ..., -1.0116e+00,\n",
       "           -5.7379e+00,  5.5124e+00]],\n",
       "\n",
       "         [[ 1.8309e-01,  3.5898e-01,  2.1865e-01,  ..., -2.1976e-01,\n",
       "            3.2432e-02, -1.3621e-01],\n",
       "          [-1.1902e+00,  2.5967e-01,  1.3770e+00,  ...,  1.6082e+00,\n",
       "            5.3296e-01,  1.0141e-01],\n",
       "          [-1.6201e+00,  3.1865e-01,  5.4789e-01,  ...,  7.9333e-01,\n",
       "            4.1223e-01, -9.2892e-01],\n",
       "          ...,\n",
       "          [-1.7144e+00,  2.9198e-01, -6.5764e-01,  ...,  1.4415e+00,\n",
       "           -4.7544e-02,  6.2226e-01],\n",
       "          [-1.0669e+00,  9.0313e-01, -9.0604e-01,  ...,  1.4476e+00,\n",
       "           -3.0205e-01, -8.2349e-01],\n",
       "          [-6.1682e-01,  4.0207e-01, -2.7279e-01,  ...,  1.9482e+00,\n",
       "           -5.3192e-01, -1.6915e-01]],\n",
       "\n",
       "         [[ 3.8544e-01,  1.1505e-01,  6.3900e-01,  ...,  5.3104e-01,\n",
       "            5.7931e-01, -3.3185e-01],\n",
       "          [ 4.0666e-01, -9.3643e-01, -1.0314e+00,  ..., -6.5106e-01,\n",
       "           -3.1251e+00, -3.6746e-01],\n",
       "          [ 9.7926e-01, -9.1059e-01,  2.0969e-01,  ..., -1.8782e+00,\n",
       "           -3.4683e+00,  2.0600e-01],\n",
       "          ...,\n",
       "          [-1.4226e-01, -9.7476e-01, -5.4240e-01,  ..., -2.8039e+00,\n",
       "           -4.1771e+00,  1.9208e+00],\n",
       "          [-7.5886e-01, -6.7513e-01, -4.4396e-02,  ..., -2.4649e+00,\n",
       "           -4.5963e+00,  6.7639e-01],\n",
       "          [-6.9450e-01,  4.6577e-01, -7.3801e-01,  ..., -2.1795e+00,\n",
       "           -4.2150e+00,  2.8132e-01]]]], grad_fn=<PermuteBackward0>), tensor([[[[ 5.3734e-02, -1.4238e-02, -3.5810e-02,  ...,  1.2174e-01,\n",
       "           -5.9948e-02, -4.3049e-02],\n",
       "          [-4.0712e-01,  2.3952e-01, -1.4630e-01,  ..., -4.0966e-01,\n",
       "            8.4789e-01, -2.2644e-01],\n",
       "          [-1.4474e+00, -8.4849e-01, -9.8838e-02,  ..., -2.0224e-01,\n",
       "            6.6301e-01,  7.1867e-01],\n",
       "          ...,\n",
       "          [-6.3025e-01, -9.1939e-01,  1.5612e+00,  ...,  6.4965e-01,\n",
       "            2.6919e+00,  8.4095e-01],\n",
       "          [-5.2483e-01, -1.3637e+00,  1.8403e+00,  ...,  3.8219e-01,\n",
       "            1.4497e+00,  1.0675e-03],\n",
       "          [-2.9676e-02,  1.7135e-01, -7.6390e-01,  ..., -3.9354e-01,\n",
       "            4.4974e-01, -9.9768e-01]],\n",
       "\n",
       "         [[ 1.7320e-02,  2.3816e-02,  4.7974e-02,  ..., -4.7309e-03,\n",
       "           -6.8244e-03,  3.2884e-03],\n",
       "          [ 5.4060e-01,  1.3387e+00,  1.0772e+00,  ..., -4.1182e-01,\n",
       "           -7.2695e-02,  7.3047e-01],\n",
       "          [ 1.6604e+00, -1.0405e+00, -4.6419e-02,  ..., -1.5549e-01,\n",
       "            4.8931e-01,  2.6706e-01],\n",
       "          ...,\n",
       "          [-2.6609e-02,  1.4448e+00, -9.6233e-02,  ...,  1.4043e+00,\n",
       "           -2.1171e-01,  6.2546e-01],\n",
       "          [ 1.7877e+00, -7.9523e-01,  3.4070e-01,  ..., -1.0215e+00,\n",
       "            5.2032e-01, -7.5460e-01],\n",
       "          [-6.6356e-01,  7.8232e-01,  8.1055e-01,  ..., -6.7723e-01,\n",
       "           -5.3983e-01, -7.7323e-01]],\n",
       "\n",
       "         [[ 5.1727e-02, -3.2729e-02,  6.2676e-02,  ...,  3.5751e-02,\n",
       "           -7.1613e-02, -5.9328e-02],\n",
       "          [-3.5343e-01,  7.5885e-01, -4.1292e-02,  ...,  2.4424e-01,\n",
       "           -6.9912e-01, -5.4976e-01],\n",
       "          [ 7.8512e-02, -4.0856e-01,  5.3512e-01,  ..., -8.2215e-01,\n",
       "            1.6117e-03, -5.0301e-01],\n",
       "          ...,\n",
       "          [-7.4352e-01,  2.1682e-01,  5.4941e-01,  ..., -7.3409e-01,\n",
       "            5.1859e-01,  4.1630e-02],\n",
       "          [-3.8653e-01,  1.1920e+00, -2.0321e-01,  ..., -1.3420e-01,\n",
       "           -2.3149e-01, -1.5669e-01],\n",
       "          [-6.5035e-01,  1.6409e-01, -2.7168e-01,  ..., -5.6858e-02,\n",
       "            3.0423e-01,  6.7128e-02]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-9.5711e-02, -3.3712e-02,  3.5806e-02,  ..., -9.8101e-02,\n",
       "            3.3037e-02,  2.1276e-02],\n",
       "          [-1.0548e+00,  1.0086e+00,  3.5336e-01,  ...,  1.4173e-01,\n",
       "            1.1533e+00, -5.7750e-01],\n",
       "          [ 3.0501e-01,  3.8879e-01,  1.4593e+00,  ..., -1.5761e+00,\n",
       "            6.6029e-01, -2.4784e+00],\n",
       "          ...,\n",
       "          [ 2.1433e-01,  3.0902e-01, -1.5677e-01,  ..., -1.4376e+00,\n",
       "            3.1234e-01, -1.5930e-01],\n",
       "          [ 9.2111e-02,  1.3849e+00, -3.5478e-01,  ..., -2.0249e+00,\n",
       "            1.7484e+00,  1.5088e+00],\n",
       "          [ 1.0588e+00,  5.0692e-02,  1.6492e-02,  ..., -3.3868e-01,\n",
       "            6.8473e-01,  6.6577e-02]],\n",
       "\n",
       "         [[ 1.5089e-01, -6.4248e-02,  1.3481e-01,  ...,  7.3110e-02,\n",
       "            4.8974e-03, -1.2889e-01],\n",
       "          [ 6.8720e-01,  5.7469e-02, -1.7870e-01,  ..., -1.1156e+00,\n",
       "           -1.2297e+00,  1.5925e-02],\n",
       "          [ 4.6848e-01,  6.8089e-01,  4.5960e-01,  ..., -3.4666e-01,\n",
       "           -8.2155e-01,  7.9260e-01],\n",
       "          ...,\n",
       "          [-6.8295e-01, -4.1632e-01, -3.1648e-01,  ..., -1.0120e+00,\n",
       "           -1.2630e+00,  4.1550e-01],\n",
       "          [ 3.2676e-02, -5.2766e-01, -1.6868e+00,  ..., -9.0995e-01,\n",
       "           -2.2853e-01,  1.4612e+00],\n",
       "          [ 2.0993e-01, -2.2173e-01,  1.0833e+00,  ..., -6.0767e-01,\n",
       "            6.2882e-01,  8.7863e-02]],\n",
       "\n",
       "         [[ 2.1724e-01, -6.2232e-02, -5.5712e-02,  ...,  2.1461e-02,\n",
       "            4.3089e-02,  3.7415e-02],\n",
       "          [ 1.3125e+00, -4.4910e-01, -7.5738e-01,  ...,  3.2747e-01,\n",
       "           -9.8980e-02, -8.0641e-02],\n",
       "          [-1.6063e-01, -7.7957e-01, -1.6312e-01,  ..., -4.8402e-01,\n",
       "            8.7137e-01, -2.5966e-01],\n",
       "          ...,\n",
       "          [-3.6798e-01,  5.0565e-01,  2.8998e-01,  ...,  2.5920e-01,\n",
       "           -1.9344e+00,  7.5729e-01],\n",
       "          [ 4.9461e-02, -4.9568e-01,  8.4527e-01,  ..., -3.3127e-01,\n",
       "            6.4244e-01, -3.5335e-01],\n",
       "          [ 1.1826e-01, -1.5026e-01,  9.7509e-01,  ...,  4.7500e-01,\n",
       "           -8.4105e-01,  6.5077e-01]]]], grad_fn=<PermuteBackward0>)), (tensor([[[[ 3.3000e-02, -2.4427e-01, -4.4500e-01,  ...,  3.0083e-01,\n",
       "            3.2684e-01,  3.6586e-01],\n",
       "          [-3.1921e-01,  1.1035e+00, -8.1620e-01,  ..., -1.0129e+00,\n",
       "           -1.0356e-01, -3.4720e-01],\n",
       "          [ 1.2050e-01,  8.5493e-01, -4.7644e-01,  ...,  2.5663e-01,\n",
       "           -7.2198e-01,  7.0494e-01],\n",
       "          ...,\n",
       "          [-3.0341e-01,  2.8498e-03,  4.0952e-01,  ...,  2.3109e+00,\n",
       "           -8.2102e-01, -3.8004e-01],\n",
       "          [-3.2066e-03,  2.9617e-01, -1.1834e-01,  ...,  3.1173e+00,\n",
       "            4.8562e-01,  1.0006e+00],\n",
       "          [ 4.6373e-01, -9.1021e-02, -9.6269e-01,  ...,  6.1472e-01,\n",
       "           -6.0768e-01,  9.6790e-01]],\n",
       "\n",
       "         [[-2.7947e-01,  1.5264e-01,  1.2265e-01,  ...,  6.3867e-02,\n",
       "           -1.1406e+00, -1.3803e-01],\n",
       "          [ 7.5798e-01,  7.3406e-01,  2.4232e+00,  ..., -1.7083e-01,\n",
       "           -1.2876e+00,  6.4982e-01],\n",
       "          [ 5.9163e-01, -2.5393e-01,  9.1644e-01,  ...,  1.4073e+00,\n",
       "           -9.9605e-01,  1.3304e+00],\n",
       "          ...,\n",
       "          [ 2.9274e-01, -3.5706e-01,  1.8072e+00,  ...,  5.2217e-01,\n",
       "           -1.9019e+00, -1.5490e+00],\n",
       "          [-3.1272e-02,  1.4249e-01,  1.4840e+00,  ...,  4.6309e-01,\n",
       "           -1.1398e+00, -3.6930e-01],\n",
       "          [-1.5090e-01,  1.7000e+00,  1.0501e+00,  ...,  4.5167e-01,\n",
       "            8.9898e-01,  2.7553e-01]],\n",
       "\n",
       "         [[-1.2364e+00, -1.0861e-01,  5.7612e-01,  ..., -6.5710e-01,\n",
       "            4.6449e-01, -2.9151e-01],\n",
       "          [ 5.4201e-01,  4.5983e-01, -1.3218e-01,  ...,  6.8797e-01,\n",
       "            3.7853e-01,  7.0102e-01],\n",
       "          [ 1.0557e+00,  3.7951e-01,  9.9992e-02,  ...,  1.0149e+00,\n",
       "           -8.0010e-02,  3.1117e-01],\n",
       "          ...,\n",
       "          [ 2.7617e+00,  9.2056e-01,  2.5572e+00,  ..., -2.7046e-01,\n",
       "           -2.6543e-01,  4.2675e-01],\n",
       "          [ 3.2526e+00,  2.6459e+00,  1.0609e+00,  ...,  4.3606e-01,\n",
       "           -1.4836e-01,  5.3355e-01],\n",
       "          [ 1.2066e+00,  2.9331e+00,  7.7196e-03,  ...,  6.1325e-01,\n",
       "           -1.2444e+00,  4.3294e-01]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 7.8954e-01, -9.0881e-01, -3.9374e-01,  ..., -1.0515e+00,\n",
       "           -4.3844e-01,  4.8402e-01],\n",
       "          [ 1.0609e+00, -1.1448e+00, -7.7389e-01,  ..., -4.1185e-01,\n",
       "           -8.2883e-01, -6.0351e-01],\n",
       "          [ 5.0185e-01, -6.5958e-01, -5.7257e-02,  ..., -1.2351e+00,\n",
       "           -6.1601e-01, -5.3572e-01],\n",
       "          ...,\n",
       "          [ 2.8482e-01,  9.0139e-01, -4.6546e-01,  ..., -1.0758e+00,\n",
       "            5.9336e-01, -1.4398e+00],\n",
       "          [-1.4410e+00,  1.8157e+00, -1.4097e-01,  ..., -1.1282e+00,\n",
       "            1.1474e+00, -1.1473e+00],\n",
       "          [ 9.0526e-01,  2.4593e-01, -1.1596e+00,  ...,  3.7836e-01,\n",
       "           -1.2631e+00, -4.5477e-01]],\n",
       "\n",
       "         [[-9.2652e-01,  2.5459e+00,  3.0600e-01,  ...,  3.4490e-01,\n",
       "            1.9552e+00, -5.4894e-01],\n",
       "          [ 4.5858e-01, -2.1932e+00,  1.7457e-01,  ...,  1.7436e+00,\n",
       "           -2.5849e+00,  2.2585e+00],\n",
       "          [ 1.7932e-02, -2.7316e+00,  2.1564e-01,  ...,  6.8611e-01,\n",
       "           -4.1920e+00,  1.6138e+00],\n",
       "          ...,\n",
       "          [ 2.3517e-01, -6.8795e+00,  2.4382e+00,  ...,  1.0712e+00,\n",
       "           -3.7377e+00,  2.1150e+00],\n",
       "          [-5.6485e-01, -5.2606e+00,  1.9043e+00,  ...,  6.5478e-01,\n",
       "           -4.0746e+00,  7.2067e-01],\n",
       "          [-4.5099e-02, -3.6478e+00,  1.7280e-01,  ..., -7.5855e-01,\n",
       "           -4.0225e+00,  1.6237e+00]],\n",
       "\n",
       "         [[-2.0252e+00, -3.5049e-01, -1.1153e+00,  ..., -3.9788e-01,\n",
       "            6.0882e-02,  2.5314e-01],\n",
       "          [ 2.0696e+00, -4.9512e-01,  1.0584e+00,  ...,  7.4010e-01,\n",
       "            4.9070e-01,  5.7455e-01],\n",
       "          [ 2.5638e+00,  1.1148e+00, -5.5302e-01,  ...,  1.2067e-01,\n",
       "            3.6513e-01, -1.7264e-01],\n",
       "          ...,\n",
       "          [ 4.7496e+00, -1.1327e+00,  3.4154e+00,  ...,  1.0882e+00,\n",
       "            5.4831e-01, -2.3920e-01],\n",
       "          [ 3.7342e+00, -7.3702e-01,  2.6068e+00,  ...,  3.8731e-01,\n",
       "           -4.0681e-01, -8.3097e-01],\n",
       "          [ 2.0753e+00,  1.0187e+00,  2.0027e+00,  ...,  5.9609e-01,\n",
       "           -1.2866e-01, -3.3631e-01]]]], grad_fn=<PermuteBackward0>), tensor([[[[-5.9407e-02, -8.5679e-02,  1.7006e-02,  ...,  1.0696e-01,\n",
       "           -2.7256e-02,  1.2712e-02],\n",
       "          [-1.1028e-01, -2.6695e-01, -2.7972e-01,  ...,  7.3319e-02,\n",
       "            8.1303e-02,  9.9563e-01],\n",
       "          [ 2.2058e-01,  5.5391e-01, -1.4877e+00,  ..., -2.8705e-01,\n",
       "            3.1862e-01,  8.9564e-01],\n",
       "          ...,\n",
       "          [-2.2597e+00,  4.0054e-01,  3.5313e-01,  ..., -3.1097e-01,\n",
       "           -6.2333e-01,  6.2798e-02],\n",
       "          [-4.0290e-01,  4.4471e-01, -5.3869e-01,  ...,  3.8806e-01,\n",
       "            7.7051e-01, -8.9605e-02],\n",
       "          [-6.9788e-02,  9.6001e-01, -1.9557e-01,  ...,  1.2244e+00,\n",
       "           -4.5771e-01, -4.4116e-01]],\n",
       "\n",
       "         [[ 1.9984e-02,  9.9456e-03, -1.8535e-02,  ...,  2.8180e-02,\n",
       "            3.0411e-02,  5.3314e-02],\n",
       "          [ 5.9644e-01,  5.1996e-01, -1.2417e+00,  ..., -7.3600e-02,\n",
       "            1.1116e+00,  4.7235e-01],\n",
       "          [ 9.7821e-01, -1.1090e+00, -1.6003e-01,  ...,  1.1247e+00,\n",
       "           -1.9147e-01,  2.4299e-01],\n",
       "          ...,\n",
       "          [-2.4673e+00,  2.1871e+00,  1.5703e+00,  ..., -4.3630e-01,\n",
       "           -1.2542e-01, -6.9803e-01],\n",
       "          [ 2.0128e-01,  7.3901e-01, -9.9507e-01,  ..., -1.5172e+00,\n",
       "           -4.1022e-01, -5.7266e-01],\n",
       "          [ 2.5452e-02,  8.2747e-01, -5.4466e-01,  ..., -5.8290e-01,\n",
       "           -9.6744e-01, -4.3227e-01]],\n",
       "\n",
       "         [[ 3.0771e-02,  3.3925e-02, -9.5088e-02,  ...,  2.0078e-02,\n",
       "           -1.8055e-03,  3.6985e-03],\n",
       "          [-7.0360e-01,  1.2550e-01,  2.7324e-01,  ...,  2.4933e-01,\n",
       "           -1.0440e+00,  2.5170e+00],\n",
       "          [-3.4458e-01, -8.2069e-01, -4.0843e-01,  ..., -7.7839e-01,\n",
       "           -7.3081e-01, -2.8720e-01],\n",
       "          ...,\n",
       "          [-6.4247e-01,  2.9498e-01, -1.6238e+00,  ...,  1.0017e+00,\n",
       "           -4.7254e-01,  3.8370e-01],\n",
       "          [ 1.0751e-01,  5.6388e-01, -1.1467e+00,  ...,  4.9151e-01,\n",
       "            4.4171e-01,  4.7236e-01],\n",
       "          [-5.2674e-01,  6.2729e-01, -7.3631e-01,  ..., -9.5472e-02,\n",
       "           -3.0964e-01, -9.4149e-02]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 3.2988e-02,  1.8975e-02, -2.9906e-03,  ..., -2.4181e-03,\n",
       "           -7.7690e-04,  3.6506e-02],\n",
       "          [ 1.0896e+00, -5.7907e-01,  2.7873e-01,  ...,  5.9871e-01,\n",
       "            2.0423e+00, -1.8797e-01],\n",
       "          [ 1.1956e+00, -2.4500e-01,  9.7125e-01,  ...,  7.4840e-01,\n",
       "           -1.6449e-01, -7.1359e-01],\n",
       "          ...,\n",
       "          [-1.2320e+00,  2.2416e+00,  1.4954e+00,  ...,  3.9171e-01,\n",
       "           -3.1156e-01, -1.0704e+00],\n",
       "          [-9.0054e-01,  2.1070e-01, -2.4055e-01,  ..., -2.8735e-01,\n",
       "           -2.5012e-01, -2.8542e-01],\n",
       "          [ 1.7677e-01,  9.9594e-01, -1.2952e-01,  ...,  3.1505e-01,\n",
       "            1.0217e+00, -2.8011e-01]],\n",
       "\n",
       "         [[-6.6889e-02, -4.6806e-02,  1.3993e-02,  ...,  1.3800e-02,\n",
       "           -6.0449e-02, -1.0146e-01],\n",
       "          [-1.2984e-01, -3.9512e-01,  9.2745e-02,  ...,  2.9087e-01,\n",
       "            1.8776e-01, -2.5012e-02],\n",
       "          [ 1.4572e+00, -1.4012e-01,  1.1444e-01,  ...,  2.8305e-01,\n",
       "           -3.2233e-01, -1.2569e-01],\n",
       "          ...,\n",
       "          [ 7.1413e-01, -1.7208e+00,  6.2937e-02,  ..., -6.0423e-01,\n",
       "            3.3152e-01, -1.3289e+00],\n",
       "          [ 8.2658e-02, -8.4584e-02, -1.6153e+00,  ..., -2.1428e-01,\n",
       "           -1.4150e-01,  1.0627e+00],\n",
       "          [ 6.7371e-01, -6.1994e-01, -2.5474e-01,  ...,  3.9961e-01,\n",
       "            4.5265e-01,  4.9732e-01]],\n",
       "\n",
       "         [[-1.5357e-02,  3.5532e-02, -6.0959e-02,  ..., -2.2750e-02,\n",
       "            4.1217e-03,  3.3249e-03],\n",
       "          [-4.9697e-01, -9.3557e-01, -6.3136e-02,  ...,  6.6151e-01,\n",
       "            4.0965e-01, -7.5030e-02],\n",
       "          [-4.4717e-02, -6.6199e-01, -9.2862e-01,  ...,  5.9766e-01,\n",
       "            2.3181e-01, -2.1284e-01],\n",
       "          ...,\n",
       "          [ 9.3537e-01,  2.7058e-01,  1.1165e+00,  ..., -6.2206e-01,\n",
       "            2.9264e-01,  1.3320e+00],\n",
       "          [ 7.5008e-01,  1.0221e+00, -3.0356e-02,  ...,  1.3531e-01,\n",
       "            1.1982e+00,  1.2875e-02],\n",
       "          [-1.1730e+00, -6.0364e-01,  1.8613e-01,  ...,  4.8842e-01,\n",
       "           -8.2276e-02,  3.5773e-01]]]], grad_fn=<PermuteBackward0>)), (tensor([[[[-0.5168,  0.5003, -0.8527,  ..., -1.0179, -1.3143,  0.2035],\n",
       "          [ 0.5839,  0.4706, -0.0539,  ...,  0.7804, -0.5336, -0.4840],\n",
       "          [ 0.6194, -0.1866,  0.5777,  ...,  1.4978, -0.3987, -1.0123],\n",
       "          ...,\n",
       "          [ 1.3470,  0.3720, -0.5412,  ...,  1.6907, -0.3261, -0.6066],\n",
       "          [ 1.6338, -0.9301,  0.1403,  ...,  1.6979,  0.9465, -1.1397],\n",
       "          [ 1.0263, -0.0312,  0.4805,  ...,  1.9947, -0.4926, -1.0369]],\n",
       "\n",
       "         [[ 0.8600, -2.0708,  0.1513,  ...,  0.2453, -2.4785, -0.4229],\n",
       "          [ 1.3937, -0.5899,  1.3122,  ...,  1.7829, -0.9209,  0.0461],\n",
       "          [ 1.2642,  1.6078,  0.4370,  ...,  0.8510,  0.5742, -0.4311],\n",
       "          ...,\n",
       "          [-0.2522,  3.1486, -0.0085,  ...,  0.1856,  3.6584, -0.8100],\n",
       "          [ 0.7939,  1.4203,  0.2039,  ...,  0.5636,  2.3544, -1.3126],\n",
       "          [ 0.5796,  1.1935, -0.4128,  ..., -0.0242,  2.9575, -1.5629]],\n",
       "\n",
       "         [[ 1.0033,  0.3818, -0.1770,  ..., -0.8112, -1.4041, -0.3646],\n",
       "          [-0.8060, -0.2143, -1.7062,  ...,  1.3577, -0.0913,  0.3456],\n",
       "          [-0.1113, -0.4196, -1.3489,  ...,  1.2174,  0.0079,  0.0536],\n",
       "          ...,\n",
       "          [-3.3497, -0.2660,  0.2152,  ...,  1.4347,  0.1310,  1.4777],\n",
       "          [-1.0710, -0.9403, -2.2382,  ...,  1.8840, -0.8573,  0.6312],\n",
       "          [ 0.1759, -0.5456, -0.7188,  ...,  1.3544, -0.2407, -0.2945]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 0.2209, -0.5819,  0.4830,  ..., -0.6500,  1.0846,  0.2841],\n",
       "          [-0.4176, -0.2444, -0.1033,  ..., -1.9155, -1.5817,  0.6707],\n",
       "          [-1.1385,  1.7267, -2.5488,  ..., -2.3843, -1.3253,  0.6908],\n",
       "          ...,\n",
       "          [-1.5273,  1.4584, -2.7749,  ..., -1.0319, -2.3548,  1.2913],\n",
       "          [-0.4061,  2.5040, -2.4008,  ..., -1.3577, -1.8418,  1.6940],\n",
       "          [-0.3213,  0.4250, -1.2700,  ..., -3.8195, -0.4979,  1.6445]],\n",
       "\n",
       "         [[ 0.2557,  0.5515,  0.5423,  ...,  0.7245,  0.0758,  0.8386],\n",
       "          [ 0.0373,  1.1293, -0.8812,  ...,  1.3053,  0.1991,  1.0668],\n",
       "          [ 0.1955,  1.5928, -0.9004,  ...,  0.8965, -0.3034, -0.3748],\n",
       "          ...,\n",
       "          [-0.0899,  2.2438, -0.5928,  ...,  0.3604, -1.3797, -0.0256],\n",
       "          [ 1.0178,  2.0922, -0.1509,  ...,  1.3384, -1.0962, -0.1963],\n",
       "          [ 1.6399,  1.9349,  1.2059,  ..., -0.5345, -1.0859,  0.4052]],\n",
       "\n",
       "         [[-0.7103,  0.3011, -1.5822,  ..., -0.4242,  0.2360, -1.3157],\n",
       "          [ 0.0054,  1.0721,  1.2615,  ...,  0.6028, -0.2910,  0.4983],\n",
       "          [-0.3964,  0.6106, -0.1719,  ...,  1.1167, -0.9742,  1.4335],\n",
       "          ...,\n",
       "          [ 1.7506,  0.9744,  1.3645,  ...,  1.5455,  0.3070,  1.9734],\n",
       "          [-0.9912,  0.5391,  1.5399,  ...,  0.9224,  0.0180,  3.2051],\n",
       "          [-0.3301,  0.1262,  2.0273,  ...,  2.3883,  0.8853, -0.5951]]]],\n",
       "       grad_fn=<PermuteBackward0>), tensor([[[[-6.0669e-03,  4.1580e-02, -6.0121e-02,  ...,  5.8650e-02,\n",
       "           -5.0545e-02, -7.4066e-02],\n",
       "          [-1.3468e+00,  1.2899e-01,  1.0814e+00,  ..., -3.7751e-01,\n",
       "           -3.3316e-01, -2.1958e-01],\n",
       "          [-5.5962e-01,  5.0996e-01,  1.2739e+00,  ..., -3.0219e-01,\n",
       "            1.1735e-01,  8.0562e-01],\n",
       "          ...,\n",
       "          [ 4.0354e-01, -1.3779e+00, -1.7089e-01,  ..., -7.0931e-01,\n",
       "           -1.5366e+00,  4.9889e-01],\n",
       "          [-1.0464e-01,  6.3718e-01,  1.1747e+00,  ...,  3.1930e-01,\n",
       "           -1.4733e+00,  2.9720e-01],\n",
       "          [ 4.8980e-01, -3.8612e-02,  1.0899e+00,  ...,  3.5262e-01,\n",
       "           -1.5532e+00, -7.8569e-01]],\n",
       "\n",
       "         [[ 5.0298e-02, -3.1197e-03,  2.3202e-02,  ..., -2.5890e-02,\n",
       "           -1.5499e-02, -2.3228e-02],\n",
       "          [-1.0113e+00,  5.7627e-01,  4.9574e-01,  ..., -3.2675e-01,\n",
       "            1.8656e-01,  8.2123e-01],\n",
       "          [ 1.4849e-01,  7.7403e-01, -1.0155e+00,  ...,  3.1667e-01,\n",
       "           -1.8597e-01,  4.8591e-01],\n",
       "          ...,\n",
       "          [-1.2798e-01, -5.1137e-01,  1.0950e+00,  ...,  3.6580e+00,\n",
       "            2.7943e-02, -6.0400e-01],\n",
       "          [ 5.0271e-01, -1.6136e+00, -1.8023e-01,  ..., -5.1400e-02,\n",
       "            9.7688e-01, -1.7705e-01],\n",
       "          [-5.7576e-01,  2.3451e-01, -5.6929e-02,  ...,  1.7267e-01,\n",
       "            5.6935e-02,  1.5519e+00]],\n",
       "\n",
       "         [[-2.1214e-02,  3.3352e-02, -8.7513e-02,  ..., -8.9315e-03,\n",
       "            3.0822e-02,  3.8786e-02],\n",
       "          [-5.2402e-01, -2.6213e-01, -2.3345e-01,  ...,  3.4806e-01,\n",
       "            1.7069e-01, -2.4676e-04],\n",
       "          [-4.9385e-01,  4.8865e-01, -4.3231e-01,  ...,  7.6064e-01,\n",
       "           -2.9906e-02,  1.8609e-01],\n",
       "          ...,\n",
       "          [-1.5463e-01,  3.1104e+00, -7.4308e-01,  ..., -2.0623e-01,\n",
       "           -7.7929e-01,  9.4195e-01],\n",
       "          [-7.9531e-01,  1.3734e+00, -4.3312e-01,  ..., -4.4867e-01,\n",
       "            1.8048e-01,  6.7478e-02],\n",
       "          [-7.1460e-01,  5.9888e-01,  1.0618e+00,  ...,  5.3553e-01,\n",
       "           -7.6815e-01, -2.0368e-02]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-1.0483e-01,  3.7457e-02, -1.1568e-03,  ..., -4.4699e-02,\n",
       "           -8.8917e-03,  1.7837e-02],\n",
       "          [ 8.6269e-01, -1.4697e-01,  3.6745e-01,  ...,  3.1546e-01,\n",
       "            3.5175e-02, -3.0443e-01],\n",
       "          [ 5.1551e-01,  2.5198e-01,  1.5730e-01,  ...,  1.4892e+00,\n",
       "           -5.0922e-01,  2.0167e+00],\n",
       "          ...,\n",
       "          [-8.4730e-01, -4.3078e-01,  5.2054e-01,  ..., -9.0684e-02,\n",
       "            2.3514e-01, -9.9466e-02],\n",
       "          [-6.5804e-01, -7.2872e-01, -3.5989e-02,  ...,  1.0229e+00,\n",
       "           -7.1775e-01,  9.4541e-01],\n",
       "          [-8.0607e-01, -7.0931e-01,  7.0981e-01,  ...,  9.3561e-02,\n",
       "           -8.5871e-01,  1.3432e-01]],\n",
       "\n",
       "         [[ 9.2671e-02,  1.3452e-02,  5.3226e-02,  ...,  3.1511e-03,\n",
       "            4.3145e-02,  1.1899e-02],\n",
       "          [-1.5642e-01,  1.9026e-01,  7.6595e-01,  ..., -1.3407e-01,\n",
       "            2.6676e-01, -5.0297e-01],\n",
       "          [ 6.1654e-01,  8.3766e-01,  1.0002e+00,  ...,  1.2987e-01,\n",
       "            1.4224e+00, -4.2693e-01],\n",
       "          ...,\n",
       "          [-3.0884e-02,  1.5424e+00,  6.3994e-01,  ..., -7.0803e-01,\n",
       "            2.7368e+00,  1.4347e+00],\n",
       "          [ 1.9046e-01,  2.8000e+00, -1.0230e+00,  ..., -9.5444e-02,\n",
       "            6.7177e-01,  5.6665e-02],\n",
       "          [-2.2475e-01,  9.7459e-01,  2.7916e+00,  ..., -2.6344e-01,\n",
       "            6.6847e-01, -1.8378e+00]],\n",
       "\n",
       "         [[-1.0774e-01,  2.8406e-02, -5.5795e-02,  ..., -8.4050e-02,\n",
       "            6.1122e-02, -6.3834e-03],\n",
       "          [ 2.9482e-01, -4.6036e-01, -5.0067e-01,  ..., -4.2501e-01,\n",
       "            1.4048e+00,  3.5767e-02],\n",
       "          [-7.1401e-01, -6.3282e-01, -6.3516e-01,  ..., -1.0530e+00,\n",
       "            4.5487e-01, -1.2223e+00],\n",
       "          ...,\n",
       "          [-3.1073e-01, -1.3114e-01,  6.1908e-01,  ..., -1.1282e+00,\n",
       "            9.6892e-01,  1.1984e-01],\n",
       "          [-9.2515e-01,  1.1405e-01,  1.4297e-01,  ...,  2.4874e-01,\n",
       "           -2.8435e-01,  3.1334e-01],\n",
       "          [-4.7070e-01, -2.1283e-01,  1.8143e-01,  ..., -6.6503e-01,\n",
       "           -9.3209e-01,  8.6368e-02]]]], grad_fn=<PermuteBackward0>)), (tensor([[[[-1.7151, -0.3351, -0.2937,  ...,  0.1779,  0.3182, -0.4845],\n",
       "          [-0.0573, -0.2968, -0.2792,  ...,  1.8097, -0.9933, -0.0485],\n",
       "          [ 0.4551,  0.2350, -0.5202,  ...,  1.4073, -1.1556, -0.5031],\n",
       "          ...,\n",
       "          [ 0.4398,  0.2872, -1.3292,  ...,  1.0791, -0.5067, -0.6926],\n",
       "          [ 1.4281,  0.7992, -0.5960,  ...,  0.2981, -0.7968, -0.0723],\n",
       "          [ 1.4787,  0.0107, -0.4716,  ...,  0.5184, -2.1060, -0.5924]],\n",
       "\n",
       "         [[ 0.1182, -0.0641,  2.3043,  ...,  0.2435,  0.0934, -0.1985],\n",
       "          [ 0.3078, -0.7078, -0.0229,  ..., -0.1684,  0.3255,  0.4997],\n",
       "          [-0.0072, -1.4351, -0.3262,  ..., -0.0116, -0.4237, -1.0236],\n",
       "          ...,\n",
       "          [ 0.0789,  1.0277, -0.1959,  ..., -0.3314, -0.0123,  0.1554],\n",
       "          [-0.1221, -0.4109, -0.2510,  ..., -0.1642,  0.4183, -1.1042],\n",
       "          [ 0.3249, -0.0160, -1.2671,  ...,  0.3571,  0.7415, -0.0222]],\n",
       "\n",
       "         [[-0.1897,  1.0560,  0.4685,  ..., -0.5601,  0.3216, -0.1095],\n",
       "          [-1.2898,  0.1961,  0.3851,  ...,  0.0611, -0.4468, -0.4500],\n",
       "          [-1.2418, -0.5390, -0.3550,  ...,  1.1175,  0.2285, -0.6764],\n",
       "          ...,\n",
       "          [-1.7289, -1.8670, -0.2395,  ...,  0.8597,  0.1823,  0.1264],\n",
       "          [-1.4579, -1.1564,  0.1421,  ...,  1.1056, -0.7289, -0.9327],\n",
       "          [-0.5909,  0.2044, -0.4299,  ...,  1.1715,  0.5954, -0.7546]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[ 0.5702,  0.9629, -0.8822,  ..., -0.7408,  0.6916,  0.8577],\n",
       "          [ 1.1006,  1.7073, -1.6259,  ..., -0.9801, -0.5523,  0.2173],\n",
       "          [ 0.6847,  1.8472, -0.7955,  ..., -0.5540, -0.5932,  0.3027],\n",
       "          ...,\n",
       "          [-1.6889,  0.1444,  0.2351,  ..., -0.8126, -1.4681, -0.2635],\n",
       "          [ 0.2845,  1.1499, -0.4750,  ..., -0.8172, -0.8146,  0.3588],\n",
       "          [ 0.0874, -0.2214, -0.0856,  ..., -0.7394, -1.2066,  0.1715]],\n",
       "\n",
       "         [[-0.4207,  0.3802,  0.3173,  ...,  0.7072,  0.0623, -0.0753],\n",
       "          [-0.9885,  1.0626, -0.9565,  ..., -0.0961, -0.5346, -1.2537],\n",
       "          [-0.8144,  1.0409, -0.3393,  ...,  0.7575, -0.7873, -0.3033],\n",
       "          ...,\n",
       "          [ 0.7162,  0.5353,  0.5786,  ...,  1.2553, -0.3931,  0.5719],\n",
       "          [-0.4955,  0.3584, -0.4676,  ...,  0.7423, -0.1553,  0.4724],\n",
       "          [-1.1989,  0.5740, -1.4242,  ..., -0.1654, -0.2418, -1.2209]],\n",
       "\n",
       "         [[-0.7356, -0.0250,  0.4452,  ..., -0.0942,  0.0330, -0.0571],\n",
       "          [ 0.1039, -0.4520,  1.6473,  ...,  0.0384, -0.3122,  0.2989],\n",
       "          [ 0.0984, -1.1736,  1.5161,  ...,  0.2563, -0.7524, -0.1577],\n",
       "          ...,\n",
       "          [ 0.3700,  0.0524,  0.3323,  ..., -0.1997, -1.8872,  0.6791],\n",
       "          [ 0.1144, -0.8560,  1.3444,  ..., -0.9688, -0.0111,  0.6124],\n",
       "          [ 0.3719, -1.1888,  2.9866,  ...,  1.5564,  1.0569,  1.0004]]]],\n",
       "       grad_fn=<PermuteBackward0>), tensor([[[[ 0.0627, -0.1041, -0.1861,  ..., -0.2973,  0.2617, -0.1300],\n",
       "          [ 0.9190, -0.4094,  0.9732,  ...,  0.7065, -1.3206,  1.6103],\n",
       "          [-0.3314,  0.7099,  0.8130,  ...,  1.2446, -1.0029,  1.6824],\n",
       "          ...,\n",
       "          [ 0.3073, -0.1798,  2.0175,  ...,  3.8588, -1.2389,  0.9518],\n",
       "          [-1.0282,  0.0810,  1.8490,  ...,  2.1276, -0.6254,  0.2580],\n",
       "          [-2.1301,  0.1848,  0.6389,  ...,  0.8497, -2.1894,  2.4372]],\n",
       "\n",
       "         [[ 0.0701, -0.0366,  0.0433,  ..., -0.0205, -0.1226,  0.1881],\n",
       "          [-0.4828,  0.0397,  0.1133,  ...,  0.6646, -0.4122, -0.4976],\n",
       "          [-0.0560,  0.5185, -0.3796,  ..., -0.0358, -1.7324, -0.5987],\n",
       "          ...,\n",
       "          [ 0.3621, -1.0770,  0.8416,  ..., -1.0863, -1.4621,  1.3165],\n",
       "          [-0.4170, -0.1856, -0.2208,  ...,  0.6679,  0.2648, -0.7330],\n",
       "          [ 0.9327,  0.1231, -0.2568,  ...,  0.0206, -0.5632, -0.0348]],\n",
       "\n",
       "         [[ 0.0102,  0.0407, -0.0427,  ...,  0.0176,  0.0324,  0.0545],\n",
       "          [-0.8474, -0.1339,  0.6198,  ..., -1.1320, -0.1028,  0.0237],\n",
       "          [-0.3179,  0.2617, -0.2293,  ..., -0.3102, -0.2109,  0.9155],\n",
       "          ...,\n",
       "          [-1.1459,  0.6247, -0.4677,  ...,  0.4716,  0.2258,  1.9537],\n",
       "          [ 0.3159, -0.4951,  0.3948,  ...,  0.0583,  0.3305,  2.0492],\n",
       "          [ 0.5957, -0.2422, -0.1601,  ...,  0.0871,  0.6242,  0.0631]],\n",
       "\n",
       "         ...,\n",
       "\n",
       "         [[-0.0194, -0.0276,  0.0886,  ...,  0.0796, -0.0209,  0.0248],\n",
       "          [-0.7218,  0.9741,  0.7868,  ..., -0.1377, -0.3252, -1.0529],\n",
       "          [ 0.0102,  0.0130,  0.1943,  ...,  1.0051,  0.9481, -0.4571],\n",
       "          ...,\n",
       "          [-1.2697,  1.1965,  1.8222,  ...,  1.2815,  1.1525, -0.2608],\n",
       "          [-0.9059, -0.1876, -0.2131,  ...,  0.1001,  0.5176, -0.7554],\n",
       "          [-0.2481,  0.0416, -0.7926,  ...,  0.2645, -0.6107, -0.3649]],\n",
       "\n",
       "         [[-0.1515, -0.0920,  0.0492,  ..., -0.0616,  0.0336, -0.0914],\n",
       "          [ 0.1947,  0.3574,  0.4865,  ..., -0.0827, -0.0695,  0.1024],\n",
       "          [ 0.0617, -0.4696,  0.1419,  ..., -0.5913, -0.3143,  0.7776],\n",
       "          ...,\n",
       "          [-0.4784,  1.0185, -0.0705,  ...,  0.2748, -0.4973,  1.3698],\n",
       "          [-0.8326,  0.6881,  0.1242,  ..., -0.5708,  0.6708,  0.8386],\n",
       "          [-1.0543, -0.0815,  0.9794,  ...,  0.3561,  0.6065,  0.8012]],\n",
       "\n",
       "         [[ 0.1127, -0.1414,  0.0995,  ..., -0.1078,  0.0248, -0.1947],\n",
       "          [ 0.3453, -0.7535,  0.9195,  ...,  0.1146, -0.0401,  0.5830],\n",
       "          [-0.6160, -0.7786,  1.2499,  ..., -1.0763,  0.0126, -0.6472],\n",
       "          ...,\n",
       "          [-1.0815,  0.2212,  0.6810,  ..., -1.4694, -0.5813,  0.5124],\n",
       "          [-0.5673, -0.7975, -0.1831,  ...,  0.2839,  0.3034,  0.0535],\n",
       "          [-0.9700,  0.6699, -0.1582,  ...,  0.8679, -0.3234,  1.0039]]]],\n",
       "       grad_fn=<PermuteBackward0>))), hidden_states=None, attentions=None, cross_attentions=None)"
      ]
     },
     "execution_count": 81,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "output"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "5756662c-df25-497e-9529-16696279f90e",
   "metadata": {},
   "source": [
     "Have you noticed? Every time you switch models, you also have to change the imported classes and the surrounding code, which quickly becomes tedious. For this reason, Huggingface additionally provides a new family of classes, `AutoClasses`:"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "eb270356-e869-4042-954d-94d7f046a140",
   "metadata": {},
   "source": [
    "> **AutoClasses**\n",
    "\n",
     "Given just a model name, an Auto class resolves directly to the correct model implementation; it is a generic utility class that works for most models."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 82,
   "id": "308fb9c6-5f7d-4404-9059-f0556f12de4b",
   "metadata": {},
   "outputs": [],
   "source": [
    "from transformers import AutoModel, AutoTokenizer\n",
    "\n",
     "# For BERT\n",
    "bert_model = AutoModel.from_pretrained('bert-base-chinese')\n",
    "bert_tokenizer = AutoTokenizer.from_pretrained('bert-base-chinese')\n",
    "\n",
     "# For GPT-2\n",
    "gpt2_model = AutoModel.from_pretrained('gpt2')\n",
    "gpt2_tokenizer = AutoTokenizer.from_pretrained('gpt2')"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "40502f8b-d33a-42ad-af71-8e76e6da1365",
   "metadata": {},
   "source": [
     "Now you only need to change the model name string to load a different model, which is far more convenient than using the model-specific Models classes."
   ]
  },
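   {
    "cell_type": "markdown",
    "id": "autoclass-dispatch-sketch-0001",
    "metadata": {},
    "source": [
     "To build intuition for how name-based dispatch can work, here is a toy sketch. Note this is NOT the real Auto class internals (in reality the Auto classes inspect the checkpoint's config rather than matching on the name); `REGISTRY` and `resolve` are made-up names for illustration only:\n",
     "\n",
     "```python\n",
     "# Toy sketch of name-based dispatch -- NOT the real Auto class internals,\n",
     "# which read the checkpoint's config file to pick the architecture.\n",
     "REGISTRY = {'bert': 'BertModel', 'gpt2': 'GPT2Model'}\n",
     "\n",
     "def resolve(checkpoint_name):\n",
     "    # map a checkpoint name to the concrete model class it should use\n",
     "    for key, cls_name in REGISTRY.items():\n",
     "        if key in checkpoint_name:\n",
     "            return cls_name\n",
     "    raise ValueError('unknown architecture: ' + checkpoint_name)\n",
     "\n",
     "print(resolve('bert-base-chinese'))  # -> BertModel\n",
     "print(resolve('gpt2'))               # -> GPT2Model\n",
     "```\n",
     "\n",
     "The point of the pattern is that user code only ever mentions the checkpoint name; the dispatch table (or, in the real library, the config) decides which concrete class to instantiate."
    ]
   },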
  {
   "cell_type": "markdown",
   "id": "ce10ce52-d64a-4897-8bd6-1d639f176952",
   "metadata": {},
   "source": [
    "> **Pipelines**"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "efff96fc-872e-45ac-afb3-1b0cceb1c3bb",
   "metadata": {},
   "source": [
     "Pipelines are a high-level API for easily completing all kinds of NLP tasks, such as text classification, information extraction, question answering, summarization, and translation. A pipeline bundles the model together with its preprocessing, so users do not need to manually handle text encoding and decoding, model loading, or output parsing."
   ]
  },
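   {
    "cell_type": "markdown",
    "id": "pipeline-concept-sketch-0001",
    "metadata": {},
    "source": [
     "Conceptually, a pipeline is exactly this aggregation: encode the input, run the model, decode the output. The toy sketch below illustrates the idea with dummy components (`ToyPipeline`, `vocab` and the lambdas are made up for illustration; they are not part of the transformers library):\n",
     "\n",
     "```python\n",
     "# Conceptual sketch of what a pipeline bundles together --\n",
     "# NOT the real Huggingface internals.\n",
     "class ToyPipeline:\n",
     "    def __init__(self, tokenize, model, decode):\n",
     "        self.tokenize = tokenize   # text -> token ids\n",
     "        self.model = model         # token ids -> token ids\n",
     "        self.decode = decode       # token ids -> text\n",
     "\n",
     "    def __call__(self, text):\n",
     "        # preprocessing -> model call -> postprocessing, in one call\n",
     "        return self.decode(self.model(self.tokenize(text)))\n",
     "\n",
     "# dummy components: the 'model' here just reverses the token sequence\n",
     "vocab = {'hello': 0, 'world': 1}\n",
     "inv_vocab = {v: k for k, v in vocab.items()}\n",
     "nlp = ToyPipeline(\n",
     "    tokenize=lambda t: [vocab[w] for w in t.split()],\n",
     "    model=lambda ids: ids[::-1],\n",
     "    decode=lambda ids: ' '.join(inv_vocab[i] for i in ids),\n",
     ")\n",
     "print(nlp('hello world'))  # -> world hello\n",
     "```\n",
     "\n",
     "The real `pipeline` performs this same bundling, with a pretrained tokenizer and model in place of the dummies."
    ]
   },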
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "52d3cfdb-79f6-4221-a109-7aca408bf2f7",
   "metadata": {},
   "outputs": [],
   "source": [
    "from transformers import pipeline\n",
    "\n",
     "# Create a pipeline; the pretrained model and matching preprocessing are loaded automatically\n",
     "# nlp = pipeline(\"task-name\")\n",
     "\n",
     "# Run the pipeline on a piece of text to get the result\n",
     "# result = nlp(\"Some input text\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 84,
   "id": "f9bb6950-aa04-4565-8dce-3dae4212df91",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "[{'generated_text': '你好呀bert！的、备決脑八于。这样,'},\n",
       " {'generated_text': '你好呀bert！ 東常 情铮真森。真�'},\n",
       " {'generated_text': '你好呀bert！Ｑ ﴐ\\uf6f0女ﴆ ﴏ�'},\n",
       " {'generated_text': '你好呀bert！\\uff00Ｆ\\uff00３Ｆ Ｆ �'},\n",
       " {'generated_text': '你好呀bert！＄\\n\\nQ: What was his most impressive role or accomplishment?\\n\\nB:'}]"
      ]
     },
     "execution_count": 84,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
     "from transformers import pipeline, set_seed\n",
     "\n",
     "# Just provide the model name and the task to run, and the pipeline executes end to end\n",
     "# Note: gpt2 is trained mostly on English text, so a Chinese prompt yields largely garbled output\n",
     "generator = pipeline('text-generation', model='gpt2')\n",
     "set_seed(42)\n",
     "generator(\"你好呀bert！\", max_length=30, num_return_sequences=5)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "60c67aad-0b82-4b52-83a4-65c0b4d3fc20",
   "metadata": {},
   "source": [
     "**The tasks that can be selected in pipelines include**:"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "919fb83f-1a92-46e0-a118-1aafe3b94f7d",
   "metadata": {},
   "source": [
     "\"sentiment-analysis\": sentiment classification of text.\n",
     "\n",
     "\"text-generation\": text generation, such as automatic writing or text completion.\n",
     "\n",
     "\"ner\" (named entity recognition): identifying entities in text, such as person names, places, and organizations.\n",
     "\n",
     "\"question-answering\": answering questions about a given passage of text.\n",
     "\n",
     "\"fill-mask\": filling in masked tokens in a piece of text.\n",
     "\n",
     "\"summarization\": automatic text summarization.\n",
     "\n",
     "\"translation_xx_to_yy\": translation, where xx and yy are language codes, e.g. \"translation_en_to_fr\".\n",
     "\n",
     "\"text2text-generation\": transforming text into another form or language.\n",
     "\n",
     "\"zero-shot-classification\": classification without task-specific training; can assign multiple labels to a text.\n",
     "\n",
     "\"conversational\": dialogue models that respond to new input based on the conversation history.\n",
     "\n",
     "\"feature-extraction\": extracting feature vectors from text.\n",
     "\n",
     "\"text-classification\": text classification, also known as topic classification.\n",
     "\n",
     "\"token-classification\": token-level classification, such as part-of-speech tagging.\n",
     "\n",
     "\"table-question-answering\": question answering over structured tabular data.\n",
     "\n",
     "\"translation\": automatic translation, usually requiring the source and target languages to be specified.\n",
     "\n",
     "\"automatic-speech-recognition\": transcribing speech into text.\n",
     "\n",
     "\"image-classification\": image classification.\n",
     "\n",
     "\"object-detection\": detecting multiple objects and their locations in an image.\n",
     "\n",
     "\"text-to-speech\": converting text into speech."
   ]
  },
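  {
   "cell_type": "markdown",
   "id": "f1a2b3c4-1111-4111-8111-000000000001",
   "metadata": {},
   "source": [
    "Any task name from the list above can be passed straight to `pipeline()`. A minimal sketch (kept commented out, like the template cell earlier, because each call downloads a default checkpoint on first run; which checkpoint is chosen depends on your transformers version):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f1a2b3c4-1111-4111-8111-000000000002",
   "metadata": {},
   "outputs": [],
   "source": [
    "from transformers import pipeline\n",
    "\n",
    "# Sentiment classification with the task's default checkpoint\n",
    "#classifier = pipeline(\"sentiment-analysis\")\n",
    "#classifier(\"I love machine learning!\")  # a list of dicts with a label and a score\n",
    "\n",
    "# Filling in a masked token (the mask token depends on the underlying model)\n",
    "#unmasker = pipeline(\"fill-mask\")\n",
    "#unmasker(\"Paris is the capital of <mask>.\")"
   ]
  },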
  {
   "cell_type": "markdown",
   "id": "eb7baae4-0e6e-457a-8513-eb2539f7722f",
   "metadata": {},
   "source": [
     "**You can also instantiate the wrapped pipeline classes directly**"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "c8bd6ccd-5154-4671-8fdc-4e587978e996",
   "metadata": {},
   "source": [
    "![](https://skojiangdoc.oss-cn-beijing.aliyuncs.com/2023DL/transformer/24.png)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 85,
   "id": "5d12a68e-c20d-4848-a709-222a29311e93",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Some weights of GPT2ForSequenceClassification were not initialized from the model checkpoint at gpt2 and are newly initialized: ['score.weight']\n",
      "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[[{'label': 'LABEL_0', 'score': 0.9996962547302246}, {'label': 'LABEL_1', 'score': 0.00030382093973457813}]]\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "D:\\ProgramData\\Anaconda3\\lib\\site-packages\\transformers\\pipelines\\text_classification.py:105: UserWarning: `return_all_scores` is now deprecated,  if want a similar functionality use `top_k=None` instead of `return_all_scores=True` or `top_k=1` instead of `return_all_scores=False`.\n",
      "  warnings.warn(\n"
     ]
    }
   ],
   "source": [
     "from transformers import AutoModelForSequenceClassification, AutoTokenizer, TextClassificationPipeline\n",
     "\n",
     "# Specify the model and the tokenizer\n",
     "model_name = \"gpt2\"\n",
     "model = AutoModelForSequenceClassification.from_pretrained(model_name)\n",
     "tokenizer = AutoTokenizer.from_pretrained(model_name)\n",
     "\n",
     "# Instantiate the TextClassificationPipeline class directly\n",
     "text_classifier = TextClassificationPipeline(model=model, tokenizer=tokenizer, return_all_scores=True)\n",
     "\n",
     "# Classify a single sentence\n",
     "result = text_classifier(\"I love machine learning!\")\n",
     "\n",
     "# Print the result\n",
     "print(result)"
   ]
  },
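  {
   "cell_type": "markdown",
   "id": "f1a2b3c4-1111-4111-8111-000000000003",
   "metadata": {},
   "source": [
    "Two things to note in the output above. The stderr warning about `score.weight` means the classification head on top of gpt2 is randomly initialized, so the scores themselves are not meaningful until the model is fine-tuned. The `UserWarning` points out that `return_all_scores` is deprecated; in newer versions of transformers the same behavior is requested with `top_k=None` instead (a sketch, reusing the `model` and `tokenizer` objects from the cell above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f1a2b3c4-1111-4111-8111-000000000004",
   "metadata": {},
   "outputs": [],
   "source": [
    "# top_k=None returns the scores for all labels, like return_all_scores=True did\n",
    "#text_classifier = TextClassificationPipeline(model=model, tokenizer=tokenizer, top_k=None)\n",
    "#text_classifier(\"I love machine learning!\")"
   ]
  },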
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cbcaa88b-accb-47ca-afcb-6abe5008c86b",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.7"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
