{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# BERTモデルの実装\n",
    "本ファイルでは、BERTの基本モデル、BERTのMasked Language Modelタスク、単語のベクトル表現の確認を実装します。\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "※　本章のファイルはすべてUbuntuでの動作を前提としています。Windowsなど文字コードが違う環境での動作にはご注意下さい。\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 8.2 学習目標\n",
    "\n",
    "1.\tBERTのEmbeddingsモジュールの動作を理解し、実装できる\n",
    "2.\tBERTのSelf-Attentionを活用したTransformer部分であるBertLayerモジュールの動作を理解し、実装できる\n",
    "3.\tBERTのPoolerモジュールの動作を理解し、実装できる\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 8.3 学習目標\n",
    "\n",
    "1.\tBERTの学習済みモデルを自分の実装モデルにロードできる\n",
    "2.\tBERT用の単語分割クラスなど、言語データの前処理部分を実装できる\n",
    "3.\tBERTで単語ベクトルを取り出して確認する内容を実装できる\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 事前準備\n",
    "\n",
    "- 書籍の指示に従い、本章で使用するデータを用意します\n",
    "- pip install attrdict\n",
    "\n",
    "でパッケージattrdictを入れておきます"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "import math\n",
    "import numpy as np\n",
    "\n",
    "import torch\n",
    "from torch import nn\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 8.2 BERTの実装"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## BERT_Baseのネットワークの設定ファイルの読み込み"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{'attention_probs_dropout_prob': 0.1,\n",
       " 'hidden_act': 'gelu',\n",
       " 'hidden_dropout_prob': 0.1,\n",
       " 'hidden_size': 768,\n",
       " 'initializer_range': 0.02,\n",
       " 'intermediate_size': 3072,\n",
       " 'max_position_embeddings': 512,\n",
       " 'num_attention_heads': 12,\n",
       " 'num_hidden_layers': 12,\n",
       " 'type_vocab_size': 2,\n",
       " 'vocab_size': 30522}"
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 設定をconfig.jsonから読み込み、JSONの辞書変数をオブジェクト変数に変換\n",
    "import json\n",
    "\n",
    "config_file = \"./weights/bert_config.json\"\n",
    "\n",
    "# ファイルを開き、JSONとして読み込む\n",
    "json_file = open(config_file, 'r')\n",
    "config = json.load(json_file)\n",
    "\n",
    "# 出力確認\n",
    "config\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "768"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# こう書くのは面倒・・・\n",
    "config['hidden_size']\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "768"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 辞書変数をオブジェクト変数に\n",
    "from attrdict import AttrDict\n",
    "\n",
    "config = AttrDict(config)\n",
    "config.hidden_size\n"
   ]
  },
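  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "(Note) The attrdict package is unmaintained and breaks on Python 3.10+, where the `collections.Mapping` alias it relies on was removed. As a hedged alternative sketch, the standard library's `types.SimpleNamespace` gives the same attribute-style access (the small `config_dict` below is a stand-in; the real config is loaded from bert_config.json):\n",
    "\n",
    "```python\n",
    "from types import SimpleNamespace\n",
    "\n",
    "# Stand-in for the dictionary loaded from bert_config.json\n",
    "config_dict = {'hidden_size': 768, 'vocab_size': 30522}\n",
    "\n",
    "# Attribute access without any third-party package\n",
    "config_ns = SimpleNamespace(**config_dict)\n",
    "print(config_ns.hidden_size)  # → 768\n",
    "```\n"
   ]
  },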
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## BERT用にLayerNormalization層を定義"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [],
   "source": [
    "# BERT用にLayerNormalization層を定義します。\n",
    "# 実装の細かな点をTensorFlowに合わせています。\n",
    "\n",
    "\n",
    "class BertLayerNorm(nn.Module):\n",
    "    \"\"\"LayerNormalization層 \"\"\"\n",
    "\n",
    "    def __init__(self, hidden_size, eps=1e-12):\n",
    "        super(BertLayerNorm, self).__init__()\n",
    "        self.gamma = nn.Parameter(torch.ones(hidden_size))  # weightのこと\n",
    "        self.beta = nn.Parameter(torch.zeros(hidden_size))  # biasのこと\n",
    "        self.variance_epsilon = eps\n",
    "\n",
    "    def forward(self, x):\n",
    "        u = x.mean(-1, keepdim=True)\n",
    "        s = (x - u).pow(2).mean(-1, keepdim=True)\n",
    "        x = (x - u) / torch.sqrt(s + self.variance_epsilon)\n",
    "        return self.gamma * x + self.beta\n"
   ]
  },
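  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity-check sketch: the normalization above (biased variance, eps added inside the square root) is the same computation that `torch.nn.LayerNorm` performs, so with the default gamma=1 and beta=0 the two agree to within floating-point tolerance:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "from torch import nn\n",
    "\n",
    "torch.manual_seed(0)\n",
    "x = torch.randn(2, 5, 768)\n",
    "\n",
    "# Manual normalization, exactly as in BertLayerNorm.forward\n",
    "u = x.mean(-1, keepdim=True)\n",
    "s = (x - u).pow(2).mean(-1, keepdim=True)\n",
    "manual = (x - u) / torch.sqrt(s + 1e-12)\n",
    "\n",
    "# PyTorch's built-in LayerNorm (weight=1, bias=0 by default)\n",
    "builtin = nn.LayerNorm(768, eps=1e-12)(x)\n",
    "\n",
    "print(torch.allclose(manual, builtin, atol=1e-5))  # → True\n",
    "```\n"
   ]
  },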
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Embeddingsモジュールの実装"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "# BERTのEmbeddingsモジュールです\n",
    "\n",
    "\n",
    "class BertEmbeddings(nn.Module):\n",
    "    \"\"\"文章の単語ID列と、1文目か2文目かの情報を、埋め込みベクトルに変換する\n",
    "    \"\"\"\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertEmbeddings, self).__init__()\n",
    "\n",
    "        # 3つのベクトル表現の埋め込み\n",
    "\n",
    "        # Token Embedding：単語IDを単語ベクトルに変換、\n",
    "        # vocab_size = 30522でBERTの学習済みモデルで使用したボキャブラリーの量\n",
    "        # hidden_size = 768 で特徴量ベクトルの長さは768\n",
    "        self.word_embeddings = nn.Embedding(\n",
    "            config.vocab_size, config.hidden_size, padding_idx=0)\n",
    "        # （注釈）padding_idx=0はidx=0の単語のベクトルは0にする。BERTのボキャブラリーのidx=0が[PAD]である。\n",
    "\n",
    "        # Transformer Positional Embedding：位置情報テンソルをベクトルに変換\n",
    "        # Transformerの場合はsin、cosからなる固定値だったが、BERTは学習させる\n",
    "        # max_position_embeddings = 512　で文の長さは512単語\n",
    "        self.position_embeddings = nn.Embedding(\n",
    "            config.max_position_embeddings, config.hidden_size)\n",
    "\n",
    "        # Sentence Embedding：文章の1文目、2文目の情報をベクトルに変換\n",
    "        # type_vocab_size = 2\n",
    "        self.token_type_embeddings = nn.Embedding(\n",
    "            config.type_vocab_size, config.hidden_size)\n",
    "\n",
    "        # 作成したLayerNormalization層\n",
    "        self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12)\n",
    "\n",
    "        # Dropout　'hidden_dropout_prob': 0.1\n",
    "        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n",
    "\n",
    "    def forward(self, input_ids, token_type_ids=None):\n",
    "        '''\n",
    "        input_ids： [batch_size, seq_len]の文章の単語IDの羅列\n",
    "        token_type_ids：[batch_size, seq_len]の各単語が1文目なのか、2文目なのかを示すid\n",
    "        '''\n",
    "\n",
    "        # 1. Token Embeddings\n",
    "        # 単語IDを単語ベクトルに変換\n",
    "        words_embeddings = self.word_embeddings(input_ids)\n",
    "\n",
    "        # 2. Sentence Embedding\n",
    "        # token_type_idsがない場合は文章の全単語を1文目として、0にする\n",
    "        # そこで、input_idsと同じサイズのゼロテンソルを作成\n",
    "        if token_type_ids is None:\n",
    "            token_type_ids = torch.zeros_like(input_ids)\n",
    "        token_type_embeddings = self.token_type_embeddings(token_type_ids)\n",
    "\n",
    "        # 3. Transformer Positional Embedding：\n",
    "        # [0, 1, 2 ・・・]と文章の長さだけ、数字が1つずつ昇順に入った\n",
    "        # [batch_size, seq_len]のテンソルposition_idsを作成\n",
    "        # position_idsを入力して、position_embeddings層から768次元のテンソルを取り出す\n",
    "        seq_length = input_ids.size(1)  # 文章の長さ\n",
    "        position_ids = torch.arange(\n",
    "            seq_length, dtype=torch.long, device=input_ids.device)\n",
    "        position_ids = position_ids.unsqueeze(0).expand_as(input_ids)\n",
    "        position_embeddings = self.position_embeddings(position_ids)\n",
    "\n",
    "        # 3つの埋め込みテンソルを足し合わせる [batch_size, seq_len, hidden_size]\n",
    "        embeddings = words_embeddings + position_embeddings + token_type_embeddings\n",
    "\n",
    "        # LayerNormalizationとDropoutを実行\n",
    "        embeddings = self.LayerNorm(embeddings)\n",
    "        embeddings = self.dropout(embeddings)\n",
    "\n",
    "        return embeddings\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## BertLayerモジュール\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertLayer(nn.Module):\n",
    "    '''BERTのBertLayerモジュールです。Transformerになります'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertLayer, self).__init__()\n",
    "\n",
    "        # Self-Attention部分\n",
    "        self.attention = BertAttention(config)\n",
    "\n",
    "        # Self-Attentionの出力を処理する全結合層\n",
    "        self.intermediate = BertIntermediate(config)\n",
    "\n",
    "        # Self-Attentionによる特徴量とBertLayerへの元の入力を足し算する層\n",
    "        self.output = BertOutput(config)\n",
    "\n",
    "    def forward(self, hidden_states, attention_mask, attention_show_flg=False):\n",
    "        '''\n",
    "        hidden_states：Embedderモジュールの出力テンソル[batch_size, seq_len, hidden_size]\n",
    "        attention_mask：Transformerのマスクと同じ働きのマスキング\n",
    "        attention_show_flg：Self-Attentionの重みを返すかのフラグ\n",
    "        '''\n",
    "        if attention_show_flg == True:\n",
    "            '''attention_showのときは、attention_probsもリターンする'''\n",
    "            attention_output, attention_probs = self.attention(\n",
    "                hidden_states, attention_mask, attention_show_flg)\n",
    "            intermediate_output = self.intermediate(attention_output)\n",
    "            layer_output = self.output(intermediate_output, attention_output)\n",
    "            return layer_output, attention_probs\n",
    "\n",
    "        elif attention_show_flg == False:\n",
    "            attention_output = self.attention(\n",
    "                hidden_states, attention_mask, attention_show_flg)\n",
    "            intermediate_output = self.intermediate(attention_output)\n",
    "            layer_output = self.output(intermediate_output, attention_output)\n",
    "\n",
    "            return layer_output  # [batch_size, seq_length, hidden_size]\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertAttention(nn.Module):\n",
    "    '''BertLayerモジュールのSelf-Attention部分です'''\n",
    "    def __init__(self, config):\n",
    "        super(BertAttention, self).__init__()\n",
    "        self.selfattn = BertSelfAttention(config)\n",
    "        self.output = BertSelfOutput(config)\n",
    "\n",
    "    def forward(self, input_tensor, attention_mask, attention_show_flg=False):\n",
    "        '''\n",
    "        input_tensor：Embeddingsモジュールもしくは前段のBertLayerからの出力\n",
    "        attention_mask：Transformerのマスクと同じ働きのマスキングです\n",
    "        attention_show_flg：Self-Attentionの重みを返すかのフラグ\n",
    "        '''\n",
    "        if attention_show_flg == True:\n",
    "            '''attention_showのときは、attention_probsもリターンする'''\n",
    "            self_output, attention_probs = self.selfattn(input_tensor, attention_mask, attention_show_flg)\n",
    "            attention_output = self.output(self_output, input_tensor)\n",
    "            return attention_output, attention_probs\n",
    "        \n",
    "        elif attention_show_flg == False:\n",
    "            self_output = self.selfattn(input_tensor, attention_mask, attention_show_flg)\n",
    "            attention_output = self.output(self_output, input_tensor)\n",
    "            return attention_output"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertSelfAttention(nn.Module):\n",
    "    '''BertAttentionのSelf-Attentionです'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertSelfAttention, self).__init__()\n",
    "\n",
    "        self.num_attention_heads = config.num_attention_heads\n",
    "        # num_attention_heads': 12\n",
    "\n",
    "        self.attention_head_size = int(\n",
    "            config.hidden_size / config.num_attention_heads)  # 768/12=64\n",
    "        self.all_head_size = self.num_attention_heads * \\\n",
    "            self.attention_head_size  # = 'hidden_size': 768\n",
    "\n",
    "        # Self-Attentionの特徴量を作成する全結合層\n",
    "        self.query = nn.Linear(config.hidden_size, self.all_head_size)\n",
    "        self.key = nn.Linear(config.hidden_size, self.all_head_size)\n",
    "        self.value = nn.Linear(config.hidden_size, self.all_head_size)\n",
    "\n",
    "        # Dropout\n",
    "        self.dropout = nn.Dropout(config.attention_probs_dropout_prob)\n",
    "\n",
    "    def transpose_for_scores(self, x):\n",
    "        '''multi-head Attention用にテンソルの形を変換する\n",
    "        [batch_size, seq_len, hidden] → [batch_size, 12, seq_len, hidden/12] \n",
    "        '''\n",
    "        new_x_shape = x.size()[\n",
    "            :-1] + (self.num_attention_heads, self.attention_head_size)\n",
    "        x = x.view(*new_x_shape)\n",
    "        return x.permute(0, 2, 1, 3)\n",
    "\n",
    "    def forward(self, hidden_states, attention_mask, attention_show_flg=False):\n",
    "        '''\n",
    "        hidden_states：Embeddingsモジュールもしくは前段のBertLayerからの出力\n",
    "        attention_mask：Transformerのマスクと同じ働きのマスキングです\n",
    "        attention_show_flg：Self-Attentionの重みを返すかのフラグ\n",
    "        '''\n",
    "        # 入力を全結合層で特徴量変換（注意、multi-head Attentionの全部をまとめて変換しています）\n",
    "        mixed_query_layer = self.query(hidden_states)\n",
    "        mixed_key_layer = self.key(hidden_states)\n",
    "        mixed_value_layer = self.value(hidden_states)\n",
    "\n",
    "        # multi-head Attention用にテンソルの形を変換\n",
    "        query_layer = self.transpose_for_scores(mixed_query_layer)\n",
    "        key_layer = self.transpose_for_scores(mixed_key_layer)\n",
    "        value_layer = self.transpose_for_scores(mixed_value_layer)\n",
    "\n",
    "        # 特徴量同士を掛け算して似ている度合をAttention_scoresとして求める\n",
    "        attention_scores = torch.matmul(\n",
    "            query_layer, key_layer.transpose(-1, -2))\n",
    "        attention_scores = attention_scores / \\\n",
    "            math.sqrt(self.attention_head_size)\n",
    "\n",
    "        # マスクがある部分にはマスクをかけます\n",
    "        attention_scores = attention_scores + attention_mask\n",
    "        # （備考）\n",
    "        # マスクが掛け算でなく足し算なのが直感的でないですが、このあとSoftmaxで正規化するので、\n",
    "        # マスクされた部分は-infにしたいです。 attention_maskには、0か-infが\n",
    "        # もともと入っているので足し算にしています。\n",
    "\n",
    "        # Attentionを正規化する\n",
    "        attention_probs = nn.Softmax(dim=-1)(attention_scores)\n",
    "\n",
    "        # ドロップアウトします\n",
    "        attention_probs = self.dropout(attention_probs)\n",
    "\n",
    "        # Attention Mapを掛け算します\n",
    "        context_layer = torch.matmul(attention_probs, value_layer)\n",
    "\n",
    "        # multi-head Attentionのテンソルの形をもとに戻す\n",
    "        context_layer = context_layer.permute(0, 2, 1, 3).contiguous()\n",
    "        new_context_layer_shape = context_layer.size()[\n",
    "            :-2] + (self.all_head_size,)\n",
    "        context_layer = context_layer.view(*new_context_layer_shape)\n",
    "\n",
    "        # attention_showのときは、attention_probsもリターンする\n",
    "        if attention_show_flg == True:\n",
    "            return context_layer, attention_probs\n",
    "        elif attention_show_flg == False:\n",
    "            return context_layer\n"
   ]
  },
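  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal numeric sketch of the additive-mask remark above: adding a large negative value (here -10000, standing in for -inf) to a score before the softmax drives that position's probability to effectively zero, while the unmasked positions still sum to 1:\n",
    "\n",
    "```python\n",
    "import torch\n",
    "\n",
    "# One row of attention scores for a sequence of length 5\n",
    "scores = torch.tensor([[2.0, 1.0, 0.5, 3.0, 1.5]])\n",
    "\n",
    "# 0 for positions to keep, -10000 for positions to mask (the last two)\n",
    "mask = torch.tensor([[0.0, 0.0, 0.0, -10000.0, -10000.0]])\n",
    "\n",
    "probs = torch.softmax(scores + mask, dim=-1)\n",
    "print(probs)        # masked positions get probability ~0\n",
    "print(probs.sum())  # the row still sums to 1\n",
    "```\n"
   ]
  },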
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertSelfOutput(nn.Module):\n",
    "    '''BertSelfAttentionの出力を処理する全結合層です'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertSelfOutput, self).__init__()\n",
    "\n",
    "        self.dense = nn.Linear(config.hidden_size, config.hidden_size)\n",
    "        self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12)\n",
    "        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n",
    "        # 'hidden_dropout_prob': 0.1\n",
    "\n",
    "    def forward(self, hidden_states, input_tensor):\n",
    "        '''\n",
    "        hidden_states：BertSelfAttentionの出力テンソル\n",
    "        input_tensor：Embeddingsモジュールもしくは前段のBertLayerからの出力\n",
    "        '''\n",
    "        hidden_states = self.dense(hidden_states)\n",
    "        hidden_states = self.dropout(hidden_states)\n",
    "        hidden_states = self.LayerNorm(hidden_states + input_tensor)\n",
    "        return hidden_states\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [],
   "source": [
    "def gelu(x):\n",
    "    '''Gaussian Error Linear Unitという活性化関数です。\n",
    "    LeLUが0でカクっと不連続なので、そこを連続になるように滑らかにした形のLeLUです。\n",
    "    '''\n",
    "    return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))\n",
    "\n",
    "\n",
    "class BertIntermediate(nn.Module):\n",
    "    '''BERTのTransformerBlockモジュールのFeedForwardです'''\n",
    "    def __init__(self, config):\n",
    "        super(BertIntermediate, self).__init__()\n",
    "        \n",
    "        # 全結合層：'hidden_size': 768、'intermediate_size': 3072\n",
    "        self.dense = nn.Linear(config.hidden_size, config.intermediate_size)\n",
    "        \n",
    "        # 活性化関数gelu\n",
    "        self.intermediate_act_fn = gelu\n",
    "            \n",
    "    def forward(self, hidden_states):\n",
    "        '''\n",
    "        hidden_states： BertAttentionの出力テンソル\n",
    "        '''\n",
    "        hidden_states = self.dense(hidden_states)\n",
    "        hidden_states = self.intermediate_act_fn(hidden_states)  # GELUによる活性化\n",
    "        return hidden_states"
   ]
  },
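  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A small sketch of the GELU defined above: it passes through 0, closely tracks ReLU for large positive inputs, and, unlike ReLU, is smooth (and slightly negative) just below 0:\n",
    "\n",
    "```python\n",
    "import math\n",
    "import torch\n",
    "\n",
    "def gelu(x):\n",
    "    return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))\n",
    "\n",
    "x = torch.tensor([-3.0, -1.0, 0.0, 1.0, 3.0])\n",
    "print(gelu(x))\n",
    "# gelu(0) = 0, gelu(3) is close to 3, and gelu(-1) is small but nonzero,\n",
    "# whereas relu(-1) would be exactly 0\n",
    "```\n"
   ]
  },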
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertOutput(nn.Module):\n",
    "    '''BERTのTransformerBlockモジュールのFeedForwardです'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertOutput, self).__init__()\n",
    "\n",
    "        # 全結合層：'intermediate_size': 3072、'hidden_size': 768\n",
    "        self.dense = nn.Linear(config.intermediate_size, config.hidden_size)\n",
    "\n",
    "        self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12)\n",
    "\n",
    "        # 'hidden_dropout_prob': 0.1\n",
    "        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n",
    "\n",
    "    def forward(self, hidden_states, input_tensor):\n",
    "        '''\n",
    "        hidden_states： BertIntermediateの出力テンソル\n",
    "        input_tensor：BertAttentionの出力テンソル\n",
    "        '''\n",
    "        hidden_states = self.dense(hidden_states)\n",
    "        hidden_states = self.dropout(hidden_states)\n",
    "        hidden_states = self.LayerNorm(hidden_states + input_tensor)\n",
    "        return hidden_states\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## BertLayerモジュールの繰り返し部分"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [],
   "source": [
    "# BertLayerモジュールの繰り返し部分モジュールの繰り返し部分です\n",
    "\n",
    "\n",
    "class BertEncoder(nn.Module):\n",
    "    def __init__(self, config):\n",
    "        '''BertLayerモジュールの繰り返し部分モジュールの繰り返し部分です'''\n",
    "        super(BertEncoder, self).__init__()\n",
    "\n",
    "        # config.num_hidden_layers の値、すなわち12 個のBertLayerモジュールを作ります\n",
    "        self.layer = nn.ModuleList([BertLayer(config)\n",
    "                                    for _ in range(config.num_hidden_layers)])\n",
    "\n",
    "    def forward(self, hidden_states, attention_mask, output_all_encoded_layers=True, attention_show_flg=False):\n",
    "        '''\n",
    "        hidden_states：Embeddingsモジュールの出力\n",
    "        attention_mask：Transformerのマスクと同じ働きのマスキングです\n",
    "        output_all_encoded_layers：返り値を全TransformerBlockモジュールの出力にするか、\n",
    "        それとも、最終層だけにするかのフラグ。\n",
    "        attention_show_flg：Self-Attentionの重みを返すかのフラグ\n",
    "        '''\n",
    "\n",
    "        # 返り値として使うリスト\n",
    "        all_encoder_layers = []\n",
    "\n",
    "        # BertLayerモジュールの処理を繰り返す\n",
    "        for layer_module in self.layer:\n",
    "\n",
    "            if attention_show_flg == True:\n",
    "                '''attention_showのときは、attention_probsもリターンする'''\n",
    "                hidden_states, attention_probs = layer_module(\n",
    "                    hidden_states, attention_mask, attention_show_flg)\n",
    "            elif attention_show_flg == False:\n",
    "                hidden_states = layer_module(\n",
    "                    hidden_states, attention_mask, attention_show_flg)\n",
    "\n",
    "            # 返り値にBertLayerから出力された特徴量を12層分、すべて使用する場合の処理\n",
    "            if output_all_encoded_layers:\n",
    "                all_encoder_layers.append(hidden_states)\n",
    "\n",
    "        # 返り値に最後のBertLayerから出力された特徴量だけを使う場合の処理\n",
    "        if not output_all_encoded_layers:\n",
    "            all_encoder_layers.append(hidden_states)\n",
    "\n",
    "        # attention_showのときは、attention_probs（最後の12段目）もリターンする\n",
    "        if attention_show_flg == True:\n",
    "            return all_encoder_layers, attention_probs\n",
    "        elif attention_show_flg == False:\n",
    "            return all_encoder_layers\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## BertPoolerモジュール\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertPooler(nn.Module):\n",
    "    '''入力文章の1単語目[cls]の特徴量を変換して保持するためのモジュール'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertPooler, self).__init__()\n",
    "\n",
    "        # 全結合層、'hidden_size': 768\n",
    "        self.dense = nn.Linear(config.hidden_size, config.hidden_size)\n",
    "        self.activation = nn.Tanh()\n",
    "\n",
    "    def forward(self, hidden_states):\n",
    "        # 1単語目の特徴量を取得\n",
    "        first_token_tensor = hidden_states[:, 0]\n",
    "\n",
    "        # 全結合層で特徴量変換\n",
    "        pooled_output = self.dense(first_token_tensor)\n",
    "\n",
    "        # 活性化関数Tanhを計算\n",
    "        pooled_output = self.activation(pooled_output)\n",
    "\n",
    "        return pooled_output\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 動作確認"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "入力の単語ID列のテンソルサイズ： torch.Size([2, 5])\n",
      "入力のマスクのテンソルサイズ： torch.Size([2, 5])\n",
      "入力の文章IDのテンソルサイズ： torch.Size([2, 5])\n",
      "拡張したマスクのテンソルサイズ： torch.Size([2, 1, 1, 5])\n",
      "BertEmbeddingsの出力テンソルサイズ： torch.Size([2, 5, 768])\n",
      "BertEncoderの最終層の出力テンソルサイズ： torch.Size([2, 5, 768])\n",
      "BertPoolerの出力テンソルサイズ： torch.Size([2, 768])\n"
     ]
    }
   ],
   "source": [
    "# 動作確認\n",
    "\n",
    "# 入力の単語ID列、batch_sizeは2つ\n",
    "input_ids = torch.LongTensor([[31, 51, 12, 23, 99], [15, 5, 1, 0, 0]])\n",
    "print(\"入力の単語ID列のテンソルサイズ：\", input_ids.shape)\n",
    "\n",
    "# マスク\n",
    "attention_mask = torch.LongTensor([[1, 1, 1, 1, 1], [1, 1, 1, 0, 0]])\n",
    "print(\"入力のマスクのテンソルサイズ：\", attention_mask.shape)\n",
    "\n",
    "# 文章のID。2つのミニバッチそれぞれについて、0が1文目、1が2文目を示す\n",
    "token_type_ids = torch.LongTensor([[0, 0, 1, 1, 1], [0, 1, 1, 1, 1]])\n",
    "print(\"入力の文章IDのテンソルサイズ：\", token_type_ids.shape)\n",
    "\n",
    "\n",
    "# BERTの各モジュールを用意\n",
    "embeddings = BertEmbeddings(config)\n",
    "encoder = BertEncoder(config)\n",
    "pooler = BertPooler(config)\n",
    "\n",
    "# マスクの変形　[batch_size, 1, 1, seq_length]にする\n",
    "# Attentionをかけない部分はマイナス無限にしたいので、代わりに-10000をかけ算しています\n",
    "extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)\n",
    "extended_attention_mask = extended_attention_mask.to(dtype=torch.float32)\n",
    "extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0\n",
    "print(\"拡張したマスクのテンソルサイズ：\", extended_attention_mask.shape)\n",
    "\n",
    "# 順伝搬する\n",
    "out1 = embeddings(input_ids, token_type_ids)\n",
    "print(\"BertEmbeddingsの出力テンソルサイズ：\", out1.shape)\n",
    "\n",
    "out2 = encoder(out1, extended_attention_mask)\n",
    "# out2は、[minibatch, seq_length, embedding_dim]が12個のリスト\n",
    "print(\"BertEncoderの最終層の出力テンソルサイズ：\", out2[0].shape)\n",
    "\n",
    "out3 = pooler(out2[-1])  # out2は12層の特徴量のリストになっているので一番最後を使用\n",
    "print(\"BertPoolerの出力テンソルサイズ：\", out3.shape)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 全部をつなげてBERTモデルにする"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertModel(nn.Module):\n",
    "    '''モジュールを全部つなげたBERTモデル'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertModel, self).__init__()\n",
    "\n",
    "        # 3つのモジュールを作成\n",
    "        self.embeddings = BertEmbeddings(config)\n",
    "        self.encoder = BertEncoder(config)\n",
    "        self.pooler = BertPooler(config)\n",
    "\n",
    "    def forward(self, input_ids, token_type_ids=None, attention_mask=None, output_all_encoded_layers=True, attention_show_flg=False):\n",
    "        '''\n",
    "        input_ids： [batch_size, sequence_length]の文章の単語IDの羅列\n",
    "        token_type_ids： [batch_size, sequence_length]の、各単語が1文目なのか、2文目なのかを示すid\n",
    "        attention_mask：Transformerのマスクと同じ働きのマスキングです\n",
    "        output_all_encoded_layers：最終出力に12段のTransformerの全部をリストで返すか、最後だけかを指定\n",
    "        attention_show_flg：Self-Attentionの重みを返すかのフラグ\n",
    "        '''\n",
    "\n",
    "        # Attentionのマスクと文の1文目、2文目のidが無ければ作成する\n",
    "        if attention_mask is None:\n",
    "            attention_mask = torch.ones_like(input_ids)\n",
    "        if token_type_ids is None:\n",
    "            token_type_ids = torch.zeros_like(input_ids)\n",
    "\n",
    "        # マスクの変形　[minibatch, 1, 1, seq_length]にする\n",
    "        # 後ほどmulti-head Attentionで使用できる形にしたいので\n",
    "        extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)\n",
    "\n",
    "        # マスクは0、1だがソフトマックスを計算したときにマスクになるように、0と-infにする\n",
    "        # -infの代わりに-10000にしておく\n",
    "        extended_attention_mask = extended_attention_mask.to(\n",
    "            dtype=torch.float32)\n",
    "        extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0\n",
    "\n",
    "        # 順伝搬させる\n",
    "        # BertEmbeddinsモジュール\n",
    "        embedding_output = self.embeddings(input_ids, token_type_ids)\n",
    "\n",
    "        # BertLayerモジュール（Transformer）を繰り返すBertEncoderモジュール\n",
    "        if attention_show_flg == True:\n",
    "            '''attention_showのときは、attention_probsもリターンする'''\n",
    "\n",
    "            encoded_layers, attention_probs = self.encoder(embedding_output,\n",
    "                                                           extended_attention_mask,\n",
    "                                                           output_all_encoded_layers, attention_show_flg)\n",
    "\n",
    "        elif attention_show_flg == False:\n",
    "            encoded_layers = self.encoder(embedding_output,\n",
    "                                          extended_attention_mask,\n",
    "                                          output_all_encoded_layers, attention_show_flg)\n",
    "\n",
    "        # BertPoolerモジュール\n",
    "        # encoderの一番最後のBertLayerから出力された特徴量を使う\n",
    "        pooled_output = self.pooler(encoded_layers[-1])\n",
    "\n",
    "        # output_all_encoded_layersがFalseの場合はリストではなく、テンソルを返す\n",
    "        if not output_all_encoded_layers:\n",
    "            encoded_layers = encoded_layers[-1]\n",
    "\n",
    "        # attention_showのときは、attention_probs（1番最後の）もリターンする\n",
    "        if attention_show_flg == True:\n",
    "            return encoded_layers, pooled_output, attention_probs\n",
    "        elif attention_show_flg == False:\n",
    "            return encoded_layers, pooled_output\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "encoded_layersのテンソルサイズ： torch.Size([2, 5, 768])\n",
      "pooled_outputのテンソルサイズ： torch.Size([2, 768])\n",
      "attention_probsのテンソルサイズ： torch.Size([2, 12, 5, 5])\n"
     ]
    }
   ],
   "source": [
    "# 動作確認\n",
    "# 入力の用意\n",
    "input_ids = torch.LongTensor([[31, 51, 12, 23, 99], [15, 5, 1, 0, 0]])\n",
    "attention_mask = torch.LongTensor([[1, 1, 1, 1, 1], [1, 1, 1, 0, 0]])\n",
    "token_type_ids = torch.LongTensor([[0, 0, 1, 1, 1], [0, 1, 1, 1, 1]])\n",
    "\n",
    "# BERTモデルを作る\n",
    "net = BertModel(config)\n",
    "\n",
    "# 順伝搬させる\n",
    "encoded_layers, pooled_output, attention_probs = net(\n",
    "    input_ids, token_type_ids, attention_mask, output_all_encoded_layers=False, attention_show_flg=True)\n",
    "\n",
    "print(\"encoded_layersのテンソルサイズ：\", encoded_layers.shape)\n",
    "print(\"pooled_outputのテンソルサイズ：\", pooled_output.shape)\n",
    "print(\"attention_probsのテンソルサイズ：\", attention_probs.shape)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 8.3 BERTを用いたbank（銀行）とbank（土手）の単語ベクトル表現の比較"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 学習済みモデルのロード"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "bert.embeddings.word_embeddings.weight\n",
      "bert.embeddings.position_embeddings.weight\n",
      "bert.embeddings.token_type_embeddings.weight\n",
      "bert.embeddings.LayerNorm.gamma\n",
      "bert.embeddings.LayerNorm.beta\n",
      "bert.encoder.layer.0.attention.self.query.weight\n",
      "bert.encoder.layer.0.attention.self.query.bias\n",
      "bert.encoder.layer.0.attention.self.key.weight\n",
      "bert.encoder.layer.0.attention.self.key.bias\n",
      "bert.encoder.layer.0.attention.self.value.weight\n",
      "bert.encoder.layer.0.attention.self.value.bias\n",
      "bert.encoder.layer.0.attention.output.dense.weight\n",
      "bert.encoder.layer.0.attention.output.dense.bias\n",
      "bert.encoder.layer.0.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.0.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.0.intermediate.dense.weight\n",
      "bert.encoder.layer.0.intermediate.dense.bias\n",
      "bert.encoder.layer.0.output.dense.weight\n",
      "bert.encoder.layer.0.output.dense.bias\n",
      "bert.encoder.layer.0.output.LayerNorm.gamma\n",
      "bert.encoder.layer.0.output.LayerNorm.beta\n",
      "bert.encoder.layer.1.attention.self.query.weight\n",
      "bert.encoder.layer.1.attention.self.query.bias\n",
      "bert.encoder.layer.1.attention.self.key.weight\n",
      "bert.encoder.layer.1.attention.self.key.bias\n",
      "bert.encoder.layer.1.attention.self.value.weight\n",
      "bert.encoder.layer.1.attention.self.value.bias\n",
      "bert.encoder.layer.1.attention.output.dense.weight\n",
      "bert.encoder.layer.1.attention.output.dense.bias\n",
      "bert.encoder.layer.1.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.1.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.1.intermediate.dense.weight\n",
      "bert.encoder.layer.1.intermediate.dense.bias\n",
      "bert.encoder.layer.1.output.dense.weight\n",
      "bert.encoder.layer.1.output.dense.bias\n",
      "bert.encoder.layer.1.output.LayerNorm.gamma\n",
      "bert.encoder.layer.1.output.LayerNorm.beta\n",
      "bert.encoder.layer.2.attention.self.query.weight\n",
      "bert.encoder.layer.2.attention.self.query.bias\n",
      "bert.encoder.layer.2.attention.self.key.weight\n",
      "bert.encoder.layer.2.attention.self.key.bias\n",
      "bert.encoder.layer.2.attention.self.value.weight\n",
      "bert.encoder.layer.2.attention.self.value.bias\n",
      "bert.encoder.layer.2.attention.output.dense.weight\n",
      "bert.encoder.layer.2.attention.output.dense.bias\n",
      "bert.encoder.layer.2.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.2.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.2.intermediate.dense.weight\n",
      "bert.encoder.layer.2.intermediate.dense.bias\n",
      "bert.encoder.layer.2.output.dense.weight\n",
      "bert.encoder.layer.2.output.dense.bias\n",
      "bert.encoder.layer.2.output.LayerNorm.gamma\n",
      "bert.encoder.layer.2.output.LayerNorm.beta\n",
      "bert.encoder.layer.3.attention.self.query.weight\n",
      "bert.encoder.layer.3.attention.self.query.bias\n",
      "bert.encoder.layer.3.attention.self.key.weight\n",
      "bert.encoder.layer.3.attention.self.key.bias\n",
      "bert.encoder.layer.3.attention.self.value.weight\n",
      "bert.encoder.layer.3.attention.self.value.bias\n",
      "bert.encoder.layer.3.attention.output.dense.weight\n",
      "bert.encoder.layer.3.attention.output.dense.bias\n",
      "bert.encoder.layer.3.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.3.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.3.intermediate.dense.weight\n",
      "bert.encoder.layer.3.intermediate.dense.bias\n",
      "bert.encoder.layer.3.output.dense.weight\n",
      "bert.encoder.layer.3.output.dense.bias\n",
      "bert.encoder.layer.3.output.LayerNorm.gamma\n",
      "bert.encoder.layer.3.output.LayerNorm.beta\n",
      "bert.encoder.layer.4.attention.self.query.weight\n",
      "bert.encoder.layer.4.attention.self.query.bias\n",
      "bert.encoder.layer.4.attention.self.key.weight\n",
      "bert.encoder.layer.4.attention.self.key.bias\n",
      "bert.encoder.layer.4.attention.self.value.weight\n",
      "bert.encoder.layer.4.attention.self.value.bias\n",
      "bert.encoder.layer.4.attention.output.dense.weight\n",
      "bert.encoder.layer.4.attention.output.dense.bias\n",
      "bert.encoder.layer.4.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.4.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.4.intermediate.dense.weight\n",
      "bert.encoder.layer.4.intermediate.dense.bias\n",
      "bert.encoder.layer.4.output.dense.weight\n",
      "bert.encoder.layer.4.output.dense.bias\n",
      "bert.encoder.layer.4.output.LayerNorm.gamma\n",
      "bert.encoder.layer.4.output.LayerNorm.beta\n",
      "bert.encoder.layer.5.attention.self.query.weight\n",
      "bert.encoder.layer.5.attention.self.query.bias\n",
      "bert.encoder.layer.5.attention.self.key.weight\n",
      "bert.encoder.layer.5.attention.self.key.bias\n",
      "bert.encoder.layer.5.attention.self.value.weight\n",
      "bert.encoder.layer.5.attention.self.value.bias\n",
      "bert.encoder.layer.5.attention.output.dense.weight\n",
      "bert.encoder.layer.5.attention.output.dense.bias\n",
      "bert.encoder.layer.5.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.5.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.5.intermediate.dense.weight\n",
      "bert.encoder.layer.5.intermediate.dense.bias\n",
      "bert.encoder.layer.5.output.dense.weight\n",
      "bert.encoder.layer.5.output.dense.bias\n",
      "bert.encoder.layer.5.output.LayerNorm.gamma\n",
      "bert.encoder.layer.5.output.LayerNorm.beta\n",
      "bert.encoder.layer.6.attention.self.query.weight\n",
      "bert.encoder.layer.6.attention.self.query.bias\n",
      "bert.encoder.layer.6.attention.self.key.weight\n",
      "bert.encoder.layer.6.attention.self.key.bias\n",
      "bert.encoder.layer.6.attention.self.value.weight\n",
      "bert.encoder.layer.6.attention.self.value.bias\n",
      "bert.encoder.layer.6.attention.output.dense.weight\n",
      "bert.encoder.layer.6.attention.output.dense.bias\n",
      "bert.encoder.layer.6.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.6.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.6.intermediate.dense.weight\n",
      "bert.encoder.layer.6.intermediate.dense.bias\n",
      "bert.encoder.layer.6.output.dense.weight\n",
      "bert.encoder.layer.6.output.dense.bias\n",
      "bert.encoder.layer.6.output.LayerNorm.gamma\n",
      "bert.encoder.layer.6.output.LayerNorm.beta\n",
      "bert.encoder.layer.7.attention.self.query.weight\n",
      "bert.encoder.layer.7.attention.self.query.bias\n",
      "bert.encoder.layer.7.attention.self.key.weight\n",
      "bert.encoder.layer.7.attention.self.key.bias\n",
      "bert.encoder.layer.7.attention.self.value.weight\n",
      "bert.encoder.layer.7.attention.self.value.bias\n",
      "bert.encoder.layer.7.attention.output.dense.weight\n",
      "bert.encoder.layer.7.attention.output.dense.bias\n",
      "bert.encoder.layer.7.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.7.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.7.intermediate.dense.weight\n",
      "bert.encoder.layer.7.intermediate.dense.bias\n",
      "bert.encoder.layer.7.output.dense.weight\n",
      "bert.encoder.layer.7.output.dense.bias\n",
      "bert.encoder.layer.7.output.LayerNorm.gamma\n",
      "bert.encoder.layer.7.output.LayerNorm.beta\n",
      "bert.encoder.layer.8.attention.self.query.weight\n",
      "bert.encoder.layer.8.attention.self.query.bias\n",
      "bert.encoder.layer.8.attention.self.key.weight\n",
      "bert.encoder.layer.8.attention.self.key.bias\n",
      "bert.encoder.layer.8.attention.self.value.weight\n",
      "bert.encoder.layer.8.attention.self.value.bias\n",
      "bert.encoder.layer.8.attention.output.dense.weight\n",
      "bert.encoder.layer.8.attention.output.dense.bias\n",
      "bert.encoder.layer.8.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.8.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.8.intermediate.dense.weight\n",
      "bert.encoder.layer.8.intermediate.dense.bias\n",
      "bert.encoder.layer.8.output.dense.weight\n",
      "bert.encoder.layer.8.output.dense.bias\n",
      "bert.encoder.layer.8.output.LayerNorm.gamma\n",
      "bert.encoder.layer.8.output.LayerNorm.beta\n",
      "bert.encoder.layer.9.attention.self.query.weight\n",
      "bert.encoder.layer.9.attention.self.query.bias\n",
      "bert.encoder.layer.9.attention.self.key.weight\n",
      "bert.encoder.layer.9.attention.self.key.bias\n",
      "bert.encoder.layer.9.attention.self.value.weight\n",
      "bert.encoder.layer.9.attention.self.value.bias\n",
      "bert.encoder.layer.9.attention.output.dense.weight\n",
      "bert.encoder.layer.9.attention.output.dense.bias\n",
      "bert.encoder.layer.9.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.9.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.9.intermediate.dense.weight\n",
      "bert.encoder.layer.9.intermediate.dense.bias\n",
      "bert.encoder.layer.9.output.dense.weight\n",
      "bert.encoder.layer.9.output.dense.bias\n",
      "bert.encoder.layer.9.output.LayerNorm.gamma\n",
      "bert.encoder.layer.9.output.LayerNorm.beta\n",
      "bert.encoder.layer.10.attention.self.query.weight\n",
      "bert.encoder.layer.10.attention.self.query.bias\n",
      "bert.encoder.layer.10.attention.self.key.weight\n",
      "bert.encoder.layer.10.attention.self.key.bias\n",
      "bert.encoder.layer.10.attention.self.value.weight\n",
      "bert.encoder.layer.10.attention.self.value.bias\n",
      "bert.encoder.layer.10.attention.output.dense.weight\n",
      "bert.encoder.layer.10.attention.output.dense.bias\n",
      "bert.encoder.layer.10.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.10.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.10.intermediate.dense.weight\n",
      "bert.encoder.layer.10.intermediate.dense.bias\n",
      "bert.encoder.layer.10.output.dense.weight\n",
      "bert.encoder.layer.10.output.dense.bias\n",
      "bert.encoder.layer.10.output.LayerNorm.gamma\n",
      "bert.encoder.layer.10.output.LayerNorm.beta\n",
      "bert.encoder.layer.11.attention.self.query.weight\n",
      "bert.encoder.layer.11.attention.self.query.bias\n",
      "bert.encoder.layer.11.attention.self.key.weight\n",
      "bert.encoder.layer.11.attention.self.key.bias\n",
      "bert.encoder.layer.11.attention.self.value.weight\n",
      "bert.encoder.layer.11.attention.self.value.bias\n",
      "bert.encoder.layer.11.attention.output.dense.weight\n",
      "bert.encoder.layer.11.attention.output.dense.bias\n",
      "bert.encoder.layer.11.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.11.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.11.intermediate.dense.weight\n",
      "bert.encoder.layer.11.intermediate.dense.bias\n",
      "bert.encoder.layer.11.output.dense.weight\n",
      "bert.encoder.layer.11.output.dense.bias\n",
      "bert.encoder.layer.11.output.LayerNorm.gamma\n",
      "bert.encoder.layer.11.output.LayerNorm.beta\n",
      "bert.pooler.dense.weight\n",
      "bert.pooler.dense.bias\n",
      "cls.predictions.bias\n",
      "cls.predictions.transform.dense.weight\n",
      "cls.predictions.transform.dense.bias\n",
      "cls.predictions.transform.LayerNorm.gamma\n",
      "cls.predictions.transform.LayerNorm.beta\n",
      "cls.predictions.decoder.weight\n",
      "cls.seq_relationship.weight\n",
      "cls.seq_relationship.bias\n"
     ]
    }
   ],
   "source": [
    "# 学習済みモデルのロード\n",
    "weights_path = \"./weights/pytorch_model.bin\"\n",
    "loaded_state_dict = torch.load(weights_path)\n",
    "\n",
    "for s in loaded_state_dict.keys():\n",
    "    print(s)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "embeddings.word_embeddings.weight\n",
      "embeddings.position_embeddings.weight\n",
      "embeddings.token_type_embeddings.weight\n",
      "embeddings.LayerNorm.gamma\n",
      "embeddings.LayerNorm.beta\n",
      "encoder.layer.0.attention.selfattn.query.weight\n",
      "encoder.layer.0.attention.selfattn.query.bias\n",
      "encoder.layer.0.attention.selfattn.key.weight\n",
      "encoder.layer.0.attention.selfattn.key.bias\n",
      "encoder.layer.0.attention.selfattn.value.weight\n",
      "encoder.layer.0.attention.selfattn.value.bias\n",
      "encoder.layer.0.attention.output.dense.weight\n",
      "encoder.layer.0.attention.output.dense.bias\n",
      "encoder.layer.0.attention.output.LayerNorm.gamma\n",
      "encoder.layer.0.attention.output.LayerNorm.beta\n",
      "encoder.layer.0.intermediate.dense.weight\n",
      "encoder.layer.0.intermediate.dense.bias\n",
      "encoder.layer.0.output.dense.weight\n",
      "encoder.layer.0.output.dense.bias\n",
      "encoder.layer.0.output.LayerNorm.gamma\n",
      "encoder.layer.0.output.LayerNorm.beta\n",
      "encoder.layer.1.attention.selfattn.query.weight\n",
      "encoder.layer.1.attention.selfattn.query.bias\n",
      "encoder.layer.1.attention.selfattn.key.weight\n",
      "encoder.layer.1.attention.selfattn.key.bias\n",
      "encoder.layer.1.attention.selfattn.value.weight\n",
      "encoder.layer.1.attention.selfattn.value.bias\n",
      "encoder.layer.1.attention.output.dense.weight\n",
      "encoder.layer.1.attention.output.dense.bias\n",
      "encoder.layer.1.attention.output.LayerNorm.gamma\n",
      "encoder.layer.1.attention.output.LayerNorm.beta\n",
      "encoder.layer.1.intermediate.dense.weight\n",
      "encoder.layer.1.intermediate.dense.bias\n",
      "encoder.layer.1.output.dense.weight\n",
      "encoder.layer.1.output.dense.bias\n",
      "encoder.layer.1.output.LayerNorm.gamma\n",
      "encoder.layer.1.output.LayerNorm.beta\n",
      "encoder.layer.2.attention.selfattn.query.weight\n",
      "encoder.layer.2.attention.selfattn.query.bias\n",
      "encoder.layer.2.attention.selfattn.key.weight\n",
      "encoder.layer.2.attention.selfattn.key.bias\n",
      "encoder.layer.2.attention.selfattn.value.weight\n",
      "encoder.layer.2.attention.selfattn.value.bias\n",
      "encoder.layer.2.attention.output.dense.weight\n",
      "encoder.layer.2.attention.output.dense.bias\n",
      "encoder.layer.2.attention.output.LayerNorm.gamma\n",
      "encoder.layer.2.attention.output.LayerNorm.beta\n",
      "encoder.layer.2.intermediate.dense.weight\n",
      "encoder.layer.2.intermediate.dense.bias\n",
      "encoder.layer.2.output.dense.weight\n",
      "encoder.layer.2.output.dense.bias\n",
      "encoder.layer.2.output.LayerNorm.gamma\n",
      "encoder.layer.2.output.LayerNorm.beta\n",
      "encoder.layer.3.attention.selfattn.query.weight\n",
      "encoder.layer.3.attention.selfattn.query.bias\n",
      "encoder.layer.3.attention.selfattn.key.weight\n",
      "encoder.layer.3.attention.selfattn.key.bias\n",
      "encoder.layer.3.attention.selfattn.value.weight\n",
      "encoder.layer.3.attention.selfattn.value.bias\n",
      "encoder.layer.3.attention.output.dense.weight\n",
      "encoder.layer.3.attention.output.dense.bias\n",
      "encoder.layer.3.attention.output.LayerNorm.gamma\n",
      "encoder.layer.3.attention.output.LayerNorm.beta\n",
      "encoder.layer.3.intermediate.dense.weight\n",
      "encoder.layer.3.intermediate.dense.bias\n",
      "encoder.layer.3.output.dense.weight\n",
      "encoder.layer.3.output.dense.bias\n",
      "encoder.layer.3.output.LayerNorm.gamma\n",
      "encoder.layer.3.output.LayerNorm.beta\n",
      "encoder.layer.4.attention.selfattn.query.weight\n",
      "encoder.layer.4.attention.selfattn.query.bias\n",
      "encoder.layer.4.attention.selfattn.key.weight\n",
      "encoder.layer.4.attention.selfattn.key.bias\n",
      "encoder.layer.4.attention.selfattn.value.weight\n",
      "encoder.layer.4.attention.selfattn.value.bias\n",
      "encoder.layer.4.attention.output.dense.weight\n",
      "encoder.layer.4.attention.output.dense.bias\n",
      "encoder.layer.4.attention.output.LayerNorm.gamma\n",
      "encoder.layer.4.attention.output.LayerNorm.beta\n",
      "encoder.layer.4.intermediate.dense.weight\n",
      "encoder.layer.4.intermediate.dense.bias\n",
      "encoder.layer.4.output.dense.weight\n",
      "encoder.layer.4.output.dense.bias\n",
      "encoder.layer.4.output.LayerNorm.gamma\n",
      "encoder.layer.4.output.LayerNorm.beta\n",
      "encoder.layer.5.attention.selfattn.query.weight\n",
      "encoder.layer.5.attention.selfattn.query.bias\n",
      "encoder.layer.5.attention.selfattn.key.weight\n",
      "encoder.layer.5.attention.selfattn.key.bias\n",
      "encoder.layer.5.attention.selfattn.value.weight\n",
      "encoder.layer.5.attention.selfattn.value.bias\n",
      "encoder.layer.5.attention.output.dense.weight\n",
      "encoder.layer.5.attention.output.dense.bias\n",
      "encoder.layer.5.attention.output.LayerNorm.gamma\n",
      "encoder.layer.5.attention.output.LayerNorm.beta\n",
      "encoder.layer.5.intermediate.dense.weight\n",
      "encoder.layer.5.intermediate.dense.bias\n",
      "encoder.layer.5.output.dense.weight\n",
      "encoder.layer.5.output.dense.bias\n",
      "encoder.layer.5.output.LayerNorm.gamma\n",
      "encoder.layer.5.output.LayerNorm.beta\n",
      "encoder.layer.6.attention.selfattn.query.weight\n",
      "encoder.layer.6.attention.selfattn.query.bias\n",
      "encoder.layer.6.attention.selfattn.key.weight\n",
      "encoder.layer.6.attention.selfattn.key.bias\n",
      "encoder.layer.6.attention.selfattn.value.weight\n",
      "encoder.layer.6.attention.selfattn.value.bias\n",
      "encoder.layer.6.attention.output.dense.weight\n",
      "encoder.layer.6.attention.output.dense.bias\n",
      "encoder.layer.6.attention.output.LayerNorm.gamma\n",
      "encoder.layer.6.attention.output.LayerNorm.beta\n",
      "encoder.layer.6.intermediate.dense.weight\n",
      "encoder.layer.6.intermediate.dense.bias\n",
      "encoder.layer.6.output.dense.weight\n",
      "encoder.layer.6.output.dense.bias\n",
      "encoder.layer.6.output.LayerNorm.gamma\n",
      "encoder.layer.6.output.LayerNorm.beta\n",
      "encoder.layer.7.attention.selfattn.query.weight\n",
      "encoder.layer.7.attention.selfattn.query.bias\n",
      "encoder.layer.7.attention.selfattn.key.weight\n",
      "encoder.layer.7.attention.selfattn.key.bias\n",
      "encoder.layer.7.attention.selfattn.value.weight\n",
      "encoder.layer.7.attention.selfattn.value.bias\n",
      "encoder.layer.7.attention.output.dense.weight\n",
      "encoder.layer.7.attention.output.dense.bias\n",
      "encoder.layer.7.attention.output.LayerNorm.gamma\n",
      "encoder.layer.7.attention.output.LayerNorm.beta\n",
      "encoder.layer.7.intermediate.dense.weight\n",
      "encoder.layer.7.intermediate.dense.bias\n",
      "encoder.layer.7.output.dense.weight\n",
      "encoder.layer.7.output.dense.bias\n",
      "encoder.layer.7.output.LayerNorm.gamma\n",
      "encoder.layer.7.output.LayerNorm.beta\n",
      "encoder.layer.8.attention.selfattn.query.weight\n",
      "encoder.layer.8.attention.selfattn.query.bias\n",
      "encoder.layer.8.attention.selfattn.key.weight\n",
      "encoder.layer.8.attention.selfattn.key.bias\n",
      "encoder.layer.8.attention.selfattn.value.weight\n",
      "encoder.layer.8.attention.selfattn.value.bias\n",
      "encoder.layer.8.attention.output.dense.weight\n",
      "encoder.layer.8.attention.output.dense.bias\n",
      "encoder.layer.8.attention.output.LayerNorm.gamma\n",
      "encoder.layer.8.attention.output.LayerNorm.beta\n",
      "encoder.layer.8.intermediate.dense.weight\n",
      "encoder.layer.8.intermediate.dense.bias\n",
      "encoder.layer.8.output.dense.weight\n",
      "encoder.layer.8.output.dense.bias\n",
      "encoder.layer.8.output.LayerNorm.gamma\n",
      "encoder.layer.8.output.LayerNorm.beta\n",
      "encoder.layer.9.attention.selfattn.query.weight\n",
      "encoder.layer.9.attention.selfattn.query.bias\n",
      "encoder.layer.9.attention.selfattn.key.weight\n",
      "encoder.layer.9.attention.selfattn.key.bias\n",
      "encoder.layer.9.attention.selfattn.value.weight\n",
      "encoder.layer.9.attention.selfattn.value.bias\n",
      "encoder.layer.9.attention.output.dense.weight\n",
      "encoder.layer.9.attention.output.dense.bias\n",
      "encoder.layer.9.attention.output.LayerNorm.gamma\n",
      "encoder.layer.9.attention.output.LayerNorm.beta\n",
      "encoder.layer.9.intermediate.dense.weight\n",
      "encoder.layer.9.intermediate.dense.bias\n",
      "encoder.layer.9.output.dense.weight\n",
      "encoder.layer.9.output.dense.bias\n",
      "encoder.layer.9.output.LayerNorm.gamma\n",
      "encoder.layer.9.output.LayerNorm.beta\n",
      "encoder.layer.10.attention.selfattn.query.weight\n",
      "encoder.layer.10.attention.selfattn.query.bias\n",
      "encoder.layer.10.attention.selfattn.key.weight\n",
      "encoder.layer.10.attention.selfattn.key.bias\n",
      "encoder.layer.10.attention.selfattn.value.weight\n",
      "encoder.layer.10.attention.selfattn.value.bias\n",
      "encoder.layer.10.attention.output.dense.weight\n",
      "encoder.layer.10.attention.output.dense.bias\n",
      "encoder.layer.10.attention.output.LayerNorm.gamma\n",
      "encoder.layer.10.attention.output.LayerNorm.beta\n",
      "encoder.layer.10.intermediate.dense.weight\n",
      "encoder.layer.10.intermediate.dense.bias\n",
      "encoder.layer.10.output.dense.weight\n",
      "encoder.layer.10.output.dense.bias\n",
      "encoder.layer.10.output.LayerNorm.gamma\n",
      "encoder.layer.10.output.LayerNorm.beta\n",
      "encoder.layer.11.attention.selfattn.query.weight\n",
      "encoder.layer.11.attention.selfattn.query.bias\n",
      "encoder.layer.11.attention.selfattn.key.weight\n",
      "encoder.layer.11.attention.selfattn.key.bias\n",
      "encoder.layer.11.attention.selfattn.value.weight\n",
      "encoder.layer.11.attention.selfattn.value.bias\n",
      "encoder.layer.11.attention.output.dense.weight\n",
      "encoder.layer.11.attention.output.dense.bias\n",
      "encoder.layer.11.attention.output.LayerNorm.gamma\n",
      "encoder.layer.11.attention.output.LayerNorm.beta\n",
      "encoder.layer.11.intermediate.dense.weight\n",
      "encoder.layer.11.intermediate.dense.bias\n",
      "encoder.layer.11.output.dense.weight\n",
      "encoder.layer.11.output.dense.bias\n",
      "encoder.layer.11.output.LayerNorm.gamma\n",
      "encoder.layer.11.output.LayerNorm.beta\n",
      "pooler.dense.weight\n",
      "pooler.dense.bias\n"
     ]
    }
   ],
   "source": [
    "# モデルの用意\n",
    "net = BertModel(config)\n",
    "net.eval()\n",
    "\n",
    "# 現在のネットワークモデルのパラメータ名\n",
    "param_names = []  # パラメータの名前を格納していく\n",
    "\n",
    "for name, param in net.named_parameters():\n",
    "    print(name)\n",
    "    param_names.append(name)\n"
   ]
  },
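  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Editor's sketch (an assumption, not the book's actual copy loop):\n",
    "# comparing the two printed lists suggests each pretrained key maps to\n",
    "# our model's key by stripping the leading \"bert.\" prefix and renaming\n",
    "# the \"self\" submodule to \"selfattn\". Illustrated on a single key:\n",
    "name = \"bert.encoder.layer.0.attention.self.query.weight\"\n",
    "renamed = name.replace(\"bert.\", \"\", 1).replace(\".self.\", \".selfattn.\")\n",
    "print(renamed)  # encoder.layer.0.attention.selfattn.query.weight\n"
   ]
  },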
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "bert.embeddings.word_embeddings.weight→embeddings.word_embeddings.weight\n",
      "bert.embeddings.position_embeddings.weight→embeddings.position_embeddings.weight\n",
      "bert.embeddings.token_type_embeddings.weight→embeddings.token_type_embeddings.weight\n",
      "bert.embeddings.LayerNorm.gamma→embeddings.LayerNorm.gamma\n",
      "bert.embeddings.LayerNorm.beta→embeddings.LayerNorm.beta\n",
      "bert.encoder.layer.0.attention.self.query.weight→encoder.layer.0.attention.selfattn.query.weight\n",
      "bert.encoder.layer.0.attention.self.query.bias→encoder.layer.0.attention.selfattn.query.bias\n",
      "bert.encoder.layer.0.attention.self.key.weight→encoder.layer.0.attention.selfattn.key.weight\n",
      "bert.encoder.layer.0.attention.self.key.bias→encoder.layer.0.attention.selfattn.key.bias\n",
      "bert.encoder.layer.0.attention.self.value.weight→encoder.layer.0.attention.selfattn.value.weight\n",
      "bert.encoder.layer.0.attention.self.value.bias→encoder.layer.0.attention.selfattn.value.bias\n",
      "bert.encoder.layer.0.attention.output.dense.weight→encoder.layer.0.attention.output.dense.weight\n",
      "bert.encoder.layer.0.attention.output.dense.bias→encoder.layer.0.attention.output.dense.bias\n",
      "bert.encoder.layer.0.attention.output.LayerNorm.gamma→encoder.layer.0.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.0.attention.output.LayerNorm.beta→encoder.layer.0.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.0.intermediate.dense.weight→encoder.layer.0.intermediate.dense.weight\n",
      "bert.encoder.layer.0.intermediate.dense.bias→encoder.layer.0.intermediate.dense.bias\n",
      "bert.encoder.layer.0.output.dense.weight→encoder.layer.0.output.dense.weight\n",
      "bert.encoder.layer.0.output.dense.bias→encoder.layer.0.output.dense.bias\n",
      "bert.encoder.layer.0.output.LayerNorm.gamma→encoder.layer.0.output.LayerNorm.gamma\n",
      "bert.encoder.layer.0.output.LayerNorm.beta→encoder.layer.0.output.LayerNorm.beta\n",
      "bert.encoder.layer.1.attention.self.query.weight→encoder.layer.1.attention.selfattn.query.weight\n",
      "bert.encoder.layer.1.attention.self.query.bias→encoder.layer.1.attention.selfattn.query.bias\n",
      "bert.encoder.layer.1.attention.self.key.weight→encoder.layer.1.attention.selfattn.key.weight\n",
      "bert.encoder.layer.1.attention.self.key.bias→encoder.layer.1.attention.selfattn.key.bias\n",
      "bert.encoder.layer.1.attention.self.value.weight→encoder.layer.1.attention.selfattn.value.weight\n",
      "bert.encoder.layer.1.attention.self.value.bias→encoder.layer.1.attention.selfattn.value.bias\n",
      "bert.encoder.layer.1.attention.output.dense.weight→encoder.layer.1.attention.output.dense.weight\n",
      "bert.encoder.layer.1.attention.output.dense.bias→encoder.layer.1.attention.output.dense.bias\n",
      "bert.encoder.layer.1.attention.output.LayerNorm.gamma→encoder.layer.1.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.1.attention.output.LayerNorm.beta→encoder.layer.1.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.1.intermediate.dense.weight→encoder.layer.1.intermediate.dense.weight\n",
      "bert.encoder.layer.1.intermediate.dense.bias→encoder.layer.1.intermediate.dense.bias\n",
      "bert.encoder.layer.1.output.dense.weight→encoder.layer.1.output.dense.weight\n",
      "bert.encoder.layer.1.output.dense.bias→encoder.layer.1.output.dense.bias\n",
      "bert.encoder.layer.1.output.LayerNorm.gamma→encoder.layer.1.output.LayerNorm.gamma\n",
      "bert.encoder.layer.1.output.LayerNorm.beta→encoder.layer.1.output.LayerNorm.beta\n",
      "bert.encoder.layer.2.attention.self.query.weight→encoder.layer.2.attention.selfattn.query.weight\n",
      "bert.encoder.layer.2.attention.self.query.bias→encoder.layer.2.attention.selfattn.query.bias\n",
      "bert.encoder.layer.2.attention.self.key.weight→encoder.layer.2.attention.selfattn.key.weight\n",
      "bert.encoder.layer.2.attention.self.key.bias→encoder.layer.2.attention.selfattn.key.bias\n",
      "bert.encoder.layer.2.attention.self.value.weight→encoder.layer.2.attention.selfattn.value.weight\n",
      "bert.encoder.layer.2.attention.self.value.bias→encoder.layer.2.attention.selfattn.value.bias\n",
      "bert.encoder.layer.2.attention.output.dense.weight→encoder.layer.2.attention.output.dense.weight\n",
      "bert.encoder.layer.2.attention.output.dense.bias→encoder.layer.2.attention.output.dense.bias\n",
      "bert.encoder.layer.2.attention.output.LayerNorm.gamma→encoder.layer.2.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.2.attention.output.LayerNorm.beta→encoder.layer.2.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.2.intermediate.dense.weight→encoder.layer.2.intermediate.dense.weight\n",
      "bert.encoder.layer.2.intermediate.dense.bias→encoder.layer.2.intermediate.dense.bias\n",
      "bert.encoder.layer.2.output.dense.weight→encoder.layer.2.output.dense.weight\n",
      "bert.encoder.layer.2.output.dense.bias→encoder.layer.2.output.dense.bias\n",
      "bert.encoder.layer.2.output.LayerNorm.gamma→encoder.layer.2.output.LayerNorm.gamma\n",
      "bert.encoder.layer.2.output.LayerNorm.beta→encoder.layer.2.output.LayerNorm.beta\n",
      "bert.encoder.layer.3.attention.self.query.weight→encoder.layer.3.attention.selfattn.query.weight\n",
      "bert.encoder.layer.3.attention.self.query.bias→encoder.layer.3.attention.selfattn.query.bias\n",
      "bert.encoder.layer.3.attention.self.key.weight→encoder.layer.3.attention.selfattn.key.weight\n",
      "bert.encoder.layer.3.attention.self.key.bias→encoder.layer.3.attention.selfattn.key.bias\n",
      "bert.encoder.layer.3.attention.self.value.weight→encoder.layer.3.attention.selfattn.value.weight\n",
      "bert.encoder.layer.3.attention.self.value.bias→encoder.layer.3.attention.selfattn.value.bias\n",
      "bert.encoder.layer.3.attention.output.dense.weight→encoder.layer.3.attention.output.dense.weight\n",
      "bert.encoder.layer.3.attention.output.dense.bias→encoder.layer.3.attention.output.dense.bias\n",
      "bert.encoder.layer.3.attention.output.LayerNorm.gamma→encoder.layer.3.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.3.attention.output.LayerNorm.beta→encoder.layer.3.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.3.intermediate.dense.weight→encoder.layer.3.intermediate.dense.weight\n",
      "bert.encoder.layer.3.intermediate.dense.bias→encoder.layer.3.intermediate.dense.bias\n",
      "bert.encoder.layer.3.output.dense.weight→encoder.layer.3.output.dense.weight\n",
      "bert.encoder.layer.3.output.dense.bias→encoder.layer.3.output.dense.bias\n",
      "bert.encoder.layer.3.output.LayerNorm.gamma→encoder.layer.3.output.LayerNorm.gamma\n",
      "bert.encoder.layer.3.output.LayerNorm.beta→encoder.layer.3.output.LayerNorm.beta\n",
      "bert.encoder.layer.4.attention.self.query.weight→encoder.layer.4.attention.selfattn.query.weight\n",
      "bert.encoder.layer.4.attention.self.query.bias→encoder.layer.4.attention.selfattn.query.bias\n",
      "bert.encoder.layer.4.attention.self.key.weight→encoder.layer.4.attention.selfattn.key.weight\n",
      "bert.encoder.layer.4.attention.self.key.bias→encoder.layer.4.attention.selfattn.key.bias\n",
      "bert.encoder.layer.4.attention.self.value.weight→encoder.layer.4.attention.selfattn.value.weight\n",
      "bert.encoder.layer.4.attention.self.value.bias→encoder.layer.4.attention.selfattn.value.bias\n",
      "bert.encoder.layer.4.attention.output.dense.weight→encoder.layer.4.attention.output.dense.weight\n",
      "bert.encoder.layer.4.attention.output.dense.bias→encoder.layer.4.attention.output.dense.bias\n",
      "bert.encoder.layer.4.attention.output.LayerNorm.gamma→encoder.layer.4.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.4.attention.output.LayerNorm.beta→encoder.layer.4.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.4.intermediate.dense.weight→encoder.layer.4.intermediate.dense.weight\n",
      "bert.encoder.layer.4.intermediate.dense.bias→encoder.layer.4.intermediate.dense.bias\n",
      "bert.encoder.layer.4.output.dense.weight→encoder.layer.4.output.dense.weight\n",
      "bert.encoder.layer.4.output.dense.bias→encoder.layer.4.output.dense.bias\n",
      "bert.encoder.layer.4.output.LayerNorm.gamma→encoder.layer.4.output.LayerNorm.gamma\n",
      "bert.encoder.layer.4.output.LayerNorm.beta→encoder.layer.4.output.LayerNorm.beta\n",
      "bert.encoder.layer.5.attention.self.query.weight→encoder.layer.5.attention.selfattn.query.weight\n",
      "bert.encoder.layer.5.attention.self.query.bias→encoder.layer.5.attention.selfattn.query.bias\n",
      "bert.encoder.layer.5.attention.self.key.weight→encoder.layer.5.attention.selfattn.key.weight\n",
      "bert.encoder.layer.5.attention.self.key.bias→encoder.layer.5.attention.selfattn.key.bias\n",
      "bert.encoder.layer.5.attention.self.value.weight→encoder.layer.5.attention.selfattn.value.weight\n",
      "bert.encoder.layer.5.attention.self.value.bias→encoder.layer.5.attention.selfattn.value.bias\n",
      "bert.encoder.layer.5.attention.output.dense.weight→encoder.layer.5.attention.output.dense.weight\n",
      "bert.encoder.layer.5.attention.output.dense.bias→encoder.layer.5.attention.output.dense.bias\n",
      "bert.encoder.layer.5.attention.output.LayerNorm.gamma→encoder.layer.5.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.5.attention.output.LayerNorm.beta→encoder.layer.5.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.5.intermediate.dense.weight→encoder.layer.5.intermediate.dense.weight\n",
      "bert.encoder.layer.5.intermediate.dense.bias→encoder.layer.5.intermediate.dense.bias\n",
      "bert.encoder.layer.5.output.dense.weight→encoder.layer.5.output.dense.weight\n",
      "bert.encoder.layer.5.output.dense.bias→encoder.layer.5.output.dense.bias\n",
      "bert.encoder.layer.5.output.LayerNorm.gamma→encoder.layer.5.output.LayerNorm.gamma\n",
      "bert.encoder.layer.5.output.LayerNorm.beta→encoder.layer.5.output.LayerNorm.beta\n",
      "bert.encoder.layer.6.attention.self.query.weight→encoder.layer.6.attention.selfattn.query.weight\n",
      "bert.encoder.layer.6.attention.self.query.bias→encoder.layer.6.attention.selfattn.query.bias\n",
      "bert.encoder.layer.6.attention.self.key.weight→encoder.layer.6.attention.selfattn.key.weight\n",
      "bert.encoder.layer.6.attention.self.key.bias→encoder.layer.6.attention.selfattn.key.bias\n",
      "bert.encoder.layer.6.attention.self.value.weight→encoder.layer.6.attention.selfattn.value.weight\n",
      "bert.encoder.layer.6.attention.self.value.bias→encoder.layer.6.attention.selfattn.value.bias\n",
      "bert.encoder.layer.6.attention.output.dense.weight→encoder.layer.6.attention.output.dense.weight\n",
      "bert.encoder.layer.6.attention.output.dense.bias→encoder.layer.6.attention.output.dense.bias\n",
      "bert.encoder.layer.6.attention.output.LayerNorm.gamma→encoder.layer.6.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.6.attention.output.LayerNorm.beta→encoder.layer.6.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.6.intermediate.dense.weight→encoder.layer.6.intermediate.dense.weight\n",
      "bert.encoder.layer.6.intermediate.dense.bias→encoder.layer.6.intermediate.dense.bias\n",
      "bert.encoder.layer.6.output.dense.weight→encoder.layer.6.output.dense.weight\n",
      "bert.encoder.layer.6.output.dense.bias→encoder.layer.6.output.dense.bias\n",
      "bert.encoder.layer.6.output.LayerNorm.gamma→encoder.layer.6.output.LayerNorm.gamma\n",
      "bert.encoder.layer.6.output.LayerNorm.beta→encoder.layer.6.output.LayerNorm.beta\n",
      "bert.encoder.layer.7.attention.self.query.weight→encoder.layer.7.attention.selfattn.query.weight\n",
      "bert.encoder.layer.7.attention.self.query.bias→encoder.layer.7.attention.selfattn.query.bias\n",
      "bert.encoder.layer.7.attention.self.key.weight→encoder.layer.7.attention.selfattn.key.weight\n",
      "bert.encoder.layer.7.attention.self.key.bias→encoder.layer.7.attention.selfattn.key.bias\n",
      "bert.encoder.layer.7.attention.self.value.weight→encoder.layer.7.attention.selfattn.value.weight\n",
      "bert.encoder.layer.7.attention.self.value.bias→encoder.layer.7.attention.selfattn.value.bias\n",
      "bert.encoder.layer.7.attention.output.dense.weight→encoder.layer.7.attention.output.dense.weight\n",
      "bert.encoder.layer.7.attention.output.dense.bias→encoder.layer.7.attention.output.dense.bias\n",
      "bert.encoder.layer.7.attention.output.LayerNorm.gamma→encoder.layer.7.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.7.attention.output.LayerNorm.beta→encoder.layer.7.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.7.intermediate.dense.weight→encoder.layer.7.intermediate.dense.weight\n",
      "bert.encoder.layer.7.intermediate.dense.bias→encoder.layer.7.intermediate.dense.bias\n",
      "bert.encoder.layer.7.output.dense.weight→encoder.layer.7.output.dense.weight\n",
      "bert.encoder.layer.7.output.dense.bias→encoder.layer.7.output.dense.bias\n",
      "bert.encoder.layer.7.output.LayerNorm.gamma→encoder.layer.7.output.LayerNorm.gamma\n",
      "bert.encoder.layer.7.output.LayerNorm.beta→encoder.layer.7.output.LayerNorm.beta\n",
      "bert.encoder.layer.8.attention.self.query.weight→encoder.layer.8.attention.selfattn.query.weight\n",
      "bert.encoder.layer.8.attention.self.query.bias→encoder.layer.8.attention.selfattn.query.bias\n",
      "bert.encoder.layer.8.attention.self.key.weight→encoder.layer.8.attention.selfattn.key.weight\n",
      "bert.encoder.layer.8.attention.self.key.bias→encoder.layer.8.attention.selfattn.key.bias\n",
      "bert.encoder.layer.8.attention.self.value.weight→encoder.layer.8.attention.selfattn.value.weight\n",
      "bert.encoder.layer.8.attention.self.value.bias→encoder.layer.8.attention.selfattn.value.bias\n",
      "bert.encoder.layer.8.attention.output.dense.weight→encoder.layer.8.attention.output.dense.weight\n",
      "bert.encoder.layer.8.attention.output.dense.bias→encoder.layer.8.attention.output.dense.bias\n",
      "bert.encoder.layer.8.attention.output.LayerNorm.gamma→encoder.layer.8.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.8.attention.output.LayerNorm.beta→encoder.layer.8.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.8.intermediate.dense.weight→encoder.layer.8.intermediate.dense.weight\n",
      "bert.encoder.layer.8.intermediate.dense.bias→encoder.layer.8.intermediate.dense.bias\n",
      "bert.encoder.layer.8.output.dense.weight→encoder.layer.8.output.dense.weight\n",
      "bert.encoder.layer.8.output.dense.bias→encoder.layer.8.output.dense.bias\n",
      "bert.encoder.layer.8.output.LayerNorm.gamma→encoder.layer.8.output.LayerNorm.gamma\n",
      "bert.encoder.layer.8.output.LayerNorm.beta→encoder.layer.8.output.LayerNorm.beta\n",
      "bert.encoder.layer.9.attention.self.query.weight→encoder.layer.9.attention.selfattn.query.weight\n",
      "bert.encoder.layer.9.attention.self.query.bias→encoder.layer.9.attention.selfattn.query.bias\n",
      "bert.encoder.layer.9.attention.self.key.weight→encoder.layer.9.attention.selfattn.key.weight\n",
      "bert.encoder.layer.9.attention.self.key.bias→encoder.layer.9.attention.selfattn.key.bias\n",
      "bert.encoder.layer.9.attention.self.value.weight→encoder.layer.9.attention.selfattn.value.weight\n",
      "bert.encoder.layer.9.attention.self.value.bias→encoder.layer.9.attention.selfattn.value.bias\n",
      "bert.encoder.layer.9.attention.output.dense.weight→encoder.layer.9.attention.output.dense.weight\n",
      "bert.encoder.layer.9.attention.output.dense.bias→encoder.layer.9.attention.output.dense.bias\n",
      "bert.encoder.layer.9.attention.output.LayerNorm.gamma→encoder.layer.9.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.9.attention.output.LayerNorm.beta→encoder.layer.9.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.9.intermediate.dense.weight→encoder.layer.9.intermediate.dense.weight\n",
      "bert.encoder.layer.9.intermediate.dense.bias→encoder.layer.9.intermediate.dense.bias\n",
      "bert.encoder.layer.9.output.dense.weight→encoder.layer.9.output.dense.weight\n",
      "bert.encoder.layer.9.output.dense.bias→encoder.layer.9.output.dense.bias\n",
      "bert.encoder.layer.9.output.LayerNorm.gamma→encoder.layer.9.output.LayerNorm.gamma\n",
      "bert.encoder.layer.9.output.LayerNorm.beta→encoder.layer.9.output.LayerNorm.beta\n",
      "bert.encoder.layer.10.attention.self.query.weight→encoder.layer.10.attention.selfattn.query.weight\n",
      "bert.encoder.layer.10.attention.self.query.bias→encoder.layer.10.attention.selfattn.query.bias\n",
      "bert.encoder.layer.10.attention.self.key.weight→encoder.layer.10.attention.selfattn.key.weight\n",
      "bert.encoder.layer.10.attention.self.key.bias→encoder.layer.10.attention.selfattn.key.bias\n",
      "bert.encoder.layer.10.attention.self.value.weight→encoder.layer.10.attention.selfattn.value.weight\n",
      "bert.encoder.layer.10.attention.self.value.bias→encoder.layer.10.attention.selfattn.value.bias\n",
      "bert.encoder.layer.10.attention.output.dense.weight→encoder.layer.10.attention.output.dense.weight\n",
      "bert.encoder.layer.10.attention.output.dense.bias→encoder.layer.10.attention.output.dense.bias\n",
      "bert.encoder.layer.10.attention.output.LayerNorm.gamma→encoder.layer.10.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.10.attention.output.LayerNorm.beta→encoder.layer.10.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.10.intermediate.dense.weight→encoder.layer.10.intermediate.dense.weight\n",
      "bert.encoder.layer.10.intermediate.dense.bias→encoder.layer.10.intermediate.dense.bias\n",
      "bert.encoder.layer.10.output.dense.weight→encoder.layer.10.output.dense.weight\n",
      "bert.encoder.layer.10.output.dense.bias→encoder.layer.10.output.dense.bias\n",
      "bert.encoder.layer.10.output.LayerNorm.gamma→encoder.layer.10.output.LayerNorm.gamma\n",
      "bert.encoder.layer.10.output.LayerNorm.beta→encoder.layer.10.output.LayerNorm.beta\n",
      "bert.encoder.layer.11.attention.self.query.weight→encoder.layer.11.attention.selfattn.query.weight\n",
      "bert.encoder.layer.11.attention.self.query.bias→encoder.layer.11.attention.selfattn.query.bias\n",
      "bert.encoder.layer.11.attention.self.key.weight→encoder.layer.11.attention.selfattn.key.weight\n",
      "bert.encoder.layer.11.attention.self.key.bias→encoder.layer.11.attention.selfattn.key.bias\n",
      "bert.encoder.layer.11.attention.self.value.weight→encoder.layer.11.attention.selfattn.value.weight\n",
      "bert.encoder.layer.11.attention.self.value.bias→encoder.layer.11.attention.selfattn.value.bias\n",
      "bert.encoder.layer.11.attention.output.dense.weight→encoder.layer.11.attention.output.dense.weight\n",
      "bert.encoder.layer.11.attention.output.dense.bias→encoder.layer.11.attention.output.dense.bias\n",
      "bert.encoder.layer.11.attention.output.LayerNorm.gamma→encoder.layer.11.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.11.attention.output.LayerNorm.beta→encoder.layer.11.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.11.intermediate.dense.weight→encoder.layer.11.intermediate.dense.weight\n",
      "bert.encoder.layer.11.intermediate.dense.bias→encoder.layer.11.intermediate.dense.bias\n",
      "bert.encoder.layer.11.output.dense.weight→encoder.layer.11.output.dense.weight\n",
      "bert.encoder.layer.11.output.dense.bias→encoder.layer.11.output.dense.bias\n",
      "bert.encoder.layer.11.output.LayerNorm.gamma→encoder.layer.11.output.LayerNorm.gamma\n",
      "bert.encoder.layer.11.output.LayerNorm.beta→encoder.layer.11.output.LayerNorm.beta\n",
      "bert.pooler.dense.weight→pooler.dense.weight\n",
      "bert.pooler.dense.bias→pooler.dense.bias\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "IncompatibleKeys(missing_keys=[], unexpected_keys=[])"
      ]
     },
     "execution_count": 24,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# The parameter names in the two state_dicts differ, so copy the values over in order\n",
    "# Here, even though the names differ, corresponding parameters appear in the same order\n",
    "\n",
    "# Copy the current network's state to create a new state_dict\n",
    "new_state_dict = net.state_dict().copy()\n",
    "\n",
    "# Assign the pretrained values into the new state_dict\n",
    "for index, (key_name, value) in enumerate(loaded_state_dict.items()):\n",
    "    name = param_names[index]  # get the parameter name in the current network\n",
    "    new_state_dict[name] = value  # assign the value\n",
    "    print(str(key_name)+\"→\"+str(name))  # show which parameter was copied where\n",
    "\n",
    "    # Stop once all parameters of the current network have been loaded\n",
    "    if index+1 >= len(param_names):\n",
    "        break\n",
    "\n",
    "# Load the new state_dict into our BERT implementation\n",
    "net.load_state_dict(new_state_dict)\n"
   ]
  },
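  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The copy-by-position trick above relies only on both state_dicts enumerating corresponding parameters in the same order. Stripped of PyTorch, it reduces to the following toy sketch (the names and values here are purely illustrative, not the actual BERT parameters):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from collections import OrderedDict\n",
    "\n",
    "# Hypothetical pretrained weights: different names, but matching order\n",
    "demo_loaded = OrderedDict([(\"bert.encoder.w\", 1.0), (\"bert.encoder.b\", 2.0)])\n",
    "\n",
    "# Our model's state_dict: placeholder values to be overwritten\n",
    "demo_new = OrderedDict([(\"encoder.w\", 0.0), (\"encoder.b\", 0.0)])\n",
    "demo_names = list(demo_new.keys())\n",
    "\n",
    "# Copy the values over strictly by position, as in the cell above\n",
    "for i, (_, value) in enumerate(demo_loaded.items()):\n",
    "    demo_new[demo_names[i]] = value\n",
    "    if i + 1 >= len(demo_names):\n",
    "        break\n",
    "\n",
    "print(demo_new[\"encoder.w\"], demo_new[\"encoder.b\"])  # 1.0 2.0\n"
   ]
  },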
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Implementing the Tokenizer for BERT"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Load the vocab file\n",
    "import collections\n",
    "\n",
    "\n",
    "def load_vocab(vocab_file):\n",
    "    \"\"\"Loads the contents of a text-format vocab file into dictionaries.\"\"\"\n",
    "    vocab = collections.OrderedDict()  # dictionary mapping word -> id\n",
    "    ids_to_tokens = collections.OrderedDict()  # dictionary mapping id -> word\n",
    "    index = 0\n",
    "\n",
    "    with open(vocab_file, \"r\", encoding=\"utf-8\") as reader:\n",
    "        while True:\n",
    "            token = reader.readline()\n",
    "            if not token:\n",
    "                break\n",
    "            token = token.strip()\n",
    "\n",
    "            # Store the mappings\n",
    "            vocab[token] = index\n",
    "            ids_to_tokens[index] = token\n",
    "            index += 1\n",
    "\n",
    "    return vocab, ids_to_tokens\n",
    "\n",
    "\n",
    "# Run\n",
    "vocab_file = \"./vocab/bert-base-uncased-vocab.txt\"\n",
    "vocab, ids_to_tokens = load_vocab(vocab_file)\n"
   ]
  },
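  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The line-by-line indexing (id = line number, in file order) can be checked in isolation with a throwaway file. The cell below re-defines a minimal equivalent of load_vocab so it stands alone; the three tokens written here are hypothetical stand-ins for the real vocab file:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import collections\n",
    "import os\n",
    "import tempfile\n",
    "\n",
    "\n",
    "def load_vocab_demo(vocab_file):\n",
    "    \"\"\"Same logic as load_vocab above, written compactly.\"\"\"\n",
    "    vocab = collections.OrderedDict()\n",
    "    ids_to_tokens = collections.OrderedDict()\n",
    "    with open(vocab_file, \"r\", encoding=\"utf-8\") as reader:\n",
    "        for index, token in enumerate(reader):\n",
    "            token = token.strip()\n",
    "            vocab[token] = index\n",
    "            ids_to_tokens[index] = token\n",
    "    return vocab, ids_to_tokens\n",
    "\n",
    "\n",
    "# Write a tiny vocab file: one token per line\n",
    "with tempfile.NamedTemporaryFile(\"w\", suffix=\".txt\", delete=False, encoding=\"utf-8\") as f:\n",
    "    f.write(\"[PAD]\\n[UNK]\\nhello\\n\")\n",
    "    tmp_path = f.name\n",
    "\n",
    "tiny_vocab, tiny_ids = load_vocab_demo(tmp_path)\n",
    "os.remove(tmp_path)\n",
    "\n",
    "print(tiny_vocab[\"hello\"], tiny_ids[0])  # 2 [PAD]\n"
   ]
  },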
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "OrderedDict([('[PAD]', 0),\n",
       "             ('[unused0]', 1),\n",
       "             ('[unused1]', 2),\n",
       "             ('[unused2]', 3),\n",
       "             ('[unused3]', 4),\n",
       "             ('[unused4]', 5),\n",
       "             ('[unused5]', 6),\n",
       "             ('[unused6]', 7),\n",
       "             ('[unused7]', 8),\n",
       "             ('[unused8]', 9),\n",
       "             ('[unused9]', 10),\n",
       "             ('[unused10]', 11),\n",
       "             ('[unused11]', 12),\n",
       "             ('[unused12]', 13),\n",
       "             ('[unused13]', 14),\n",
       "             ('[unused14]', 15),\n",
       "             ('[unused15]', 16),\n",
       "             ('[unused16]', 17),\n",
       "             ('[unused17]', 18),\n",
       "             ('[unused18]', 19),\n",
       "             ('[unused19]', 20),\n",
       "             ('[unused20]', 21),\n",
       "             ('[unused21]', 22),\n",
       "             ('[unused22]', 23),\n",
       "             ('[unused23]', 24),\n",
       "             ('[unused24]', 25),\n",
       "             ('[unused25]', 26),\n",
       "             ('[unused26]', 27),\n",
       "             ('[unused27]', 28),\n",
       "             ('[unused28]', 29),\n",
       "             ('[unused29]', 30),\n",
       "             ('[unused30]', 31),\n",
       "             ('[unused31]', 32),\n",
       "             ('[unused32]', 33),\n",
       "             ('[unused33]', 34),\n",
       "             ('[unused34]', 35),\n",
       "             ('[unused35]', 36),\n",
       "             ('[unused36]', 37),\n",
       "             ('[unused37]', 38),\n",
       "             ('[unused38]', 39),\n",
       "             ('[unused39]', 40),\n",
       "             ('[unused40]', 41),\n",
       "             ('[unused41]', 42),\n",
       "             ('[unused42]', 43),\n",
       "             ('[unused43]', 44),\n",
       "             ('[unused44]', 45),\n",
       "             ('[unused45]', 46),\n",
       "             ('[unused46]', 47),\n",
       "             ('[unused47]', 48),\n",
       "             ('[unused48]', 49),\n",
       "             ('[unused49]', 50),\n",
       "             ('[unused50]', 51),\n",
       "             ('[unused51]', 52),\n",
       "             ('[unused52]', 53),\n",
       "             ('[unused53]', 54),\n",
       "             ('[unused54]', 55),\n",
       "             ('[unused55]', 56),\n",
       "             ('[unused56]', 57),\n",
       "             ('[unused57]', 58),\n",
       "             ('[unused58]', 59),\n",
       "             ('[unused59]', 60),\n",
       "             ('[unused60]', 61),\n",
       "             ('[unused61]', 62),\n",
       "             ('[unused62]', 63),\n",
       "             ('[unused63]', 64),\n",
       "             ('[unused64]', 65),\n",
       "             ('[unused65]', 66),\n",
       "             ('[unused66]', 67),\n",
       "             ('[unused67]', 68),\n",
       "             ('[unused68]', 69),\n",
       "             ('[unused69]', 70),\n",
       "             ('[unused70]', 71),\n",
       "             ('[unused71]', 72),\n",
       "             ('[unused72]', 73),\n",
       "             ('[unused73]', 74),\n",
       "             ('[unused74]', 75),\n",
       "             ('[unused75]', 76),\n",
       "             ('[unused76]', 77),\n",
       "             ('[unused77]', 78),\n",
       "             ('[unused78]', 79),\n",
       "             ('[unused79]', 80),\n",
       "             ('[unused80]', 81),\n",
       "             ('[unused81]', 82),\n",
       "             ('[unused82]', 83),\n",
       "             ('[unused83]', 84),\n",
       "             ('[unused84]', 85),\n",
       "             ('[unused85]', 86),\n",
       "             ('[unused86]', 87),\n",
       "             ('[unused87]', 88),\n",
       "             ('[unused88]', 89),\n",
       "             ('[unused89]', 90),\n",
       "             ('[unused90]', 91),\n",
       "             ('[unused91]', 92),\n",
       "             ('[unused92]', 93),\n",
       "             ('[unused93]', 94),\n",
       "             ('[unused94]', 95),\n",
       "             ('[unused95]', 96),\n",
       "             ('[unused96]', 97),\n",
       "             ('[unused97]', 98),\n",
       "             ('[unused98]', 99),\n",
       "             ('[UNK]', 100),\n",
       "             ('[CLS]', 101),\n",
       "             ('[SEP]', 102),\n",
       "             ('[MASK]', 103),\n",
       "             ('[unused99]', 104),\n",
       "             ('[unused100]', 105),\n",
       "             ('[unused101]', 106),\n",
       "             ('[unused102]', 107),\n",
       "             ('[unused103]', 108),\n",
       "             ('[unused104]', 109),\n",
       "             ('[unused105]', 110),\n",
       "             ('[unused106]', 111),\n",
       "             ('[unused107]', 112),\n",
       "             ('[unused108]', 113),\n",
       "             ('[unused109]', 114),\n",
       "             ('[unused110]', 115),\n",
       "             ('[unused111]', 116),\n",
       "             ('[unused112]', 117),\n",
       "             ('[unused113]', 118),\n",
       "             ('[unused114]', 119),\n",
       "             ('[unused115]', 120),\n",
       "             ('[unused116]', 121),\n",
       "             ('[unused117]', 122),\n",
       "             ('[unused118]', 123),\n",
       "             ('[unused119]', 124),\n",
       "             ('[unused120]', 125),\n",
       "             ('[unused121]', 126),\n",
       "             ('[unused122]', 127),\n",
       "             ('[unused123]', 128),\n",
       "             ('[unused124]', 129),\n",
       "             ('[unused125]', 130),\n",
       "             ('[unused126]', 131),\n",
       "             ('[unused127]', 132),\n",
       "             ('[unused128]', 133),\n",
       "             ('[unused129]', 134),\n",
       "             ('[unused130]', 135),\n",
       "             ('[unused131]', 136),\n",
       "             ('[unused132]', 137),\n",
       "             ('[unused133]', 138),\n",
       "             ('[unused134]', 139),\n",
       "             ('[unused135]', 140),\n",
       "             ('[unused136]', 141),\n",
       "             ('[unused137]', 142),\n",
       "             ('[unused138]', 143),\n",
       "             ('[unused139]', 144),\n",
       "             ('[unused140]', 145),\n",
       "             ('[unused141]', 146),\n",
       "             ('[unused142]', 147),\n",
       "             ('[unused143]', 148),\n",
       "             ('[unused144]', 149),\n",
       "             ('[unused145]', 150),\n",
       "             ('[unused146]', 151),\n",
       "             ('[unused147]', 152),\n",
       "             ('[unused148]', 153),\n",
       "             ('[unused149]', 154),\n",
       "             ('[unused150]', 155),\n",
       "             ('[unused151]', 156),\n",
       "             ('[unused152]', 157),\n",
       "             ('[unused153]', 158),\n",
       "             ('[unused154]', 159),\n",
       "             ('[unused155]', 160),\n",
       "             ('[unused156]', 161),\n",
       "             ('[unused157]', 162),\n",
       "             ('[unused158]', 163),\n",
       "             ('[unused159]', 164),\n",
       "             ('[unused160]', 165),\n",
       "             ('[unused161]', 166),\n",
       "             ('[unused162]', 167),\n",
       "             ('[unused163]', 168),\n",
       "             ('[unused164]', 169),\n",
       "             ('[unused165]', 170),\n",
       "             ('[unused166]', 171),\n",
       "             ('[unused167]', 172),\n",
       "             ('[unused168]', 173),\n",
       "             ('[unused169]', 174),\n",
       "             ('[unused170]', 175),\n",
       "             ('[unused171]', 176),\n",
       "             ('[unused172]', 177),\n",
       "             ('[unused173]', 178),\n",
       "             ('[unused174]', 179),\n",
       "             ('[unused175]', 180),\n",
       "             ('[unused176]', 181),\n",
       "             ('[unused177]', 182),\n",
       "             ('[unused178]', 183),\n",
       "             ('[unused179]', 184),\n",
       "             ('[unused180]', 185),\n",
       "             ('[unused181]', 186),\n",
       "             ('[unused182]', 187),\n",
       "             ('[unused183]', 188),\n",
       "             ('[unused184]', 189),\n",
       "             ('[unused185]', 190),\n",
       "             ('[unused186]', 191),\n",
       "             ('[unused187]', 192),\n",
       "             ('[unused188]', 193),\n",
       "             ('[unused189]', 194),\n",
       "             ('[unused190]', 195),\n",
       "             ('[unused191]', 196),\n",
       "             ('[unused192]', 197),\n",
       "             ('[unused193]', 198),\n",
       "             ('[unused194]', 199),\n",
       "             ('[unused195]', 200),\n",
       "             ('[unused196]', 201),\n",
       "             ('[unused197]', 202),\n",
       "             ('[unused198]', 203),\n",
       "             ('[unused199]', 204),\n",
       "             ('[unused200]', 205),\n",
       "             ('[unused201]', 206),\n",
       "             ('[unused202]', 207),\n",
       "             ('[unused203]', 208),\n",
       "             ('[unused204]', 209),\n",
       "             ('[unused205]', 210),\n",
       "             ('[unused206]', 211),\n",
       "             ('[unused207]', 212),\n",
       "             ('[unused208]', 213),\n",
       "             ('[unused209]', 214),\n",
       "             ('[unused210]', 215),\n",
       "             ('[unused211]', 216),\n",
       "             ('[unused212]', 217),\n",
       "             ('[unused213]', 218),\n",
       "             ('[unused214]', 219),\n",
       "             ('[unused215]', 220),\n",
       "             ('[unused216]', 221),\n",
       "             ('[unused217]', 222),\n",
       "             ('[unused218]', 223),\n",
       "             ('[unused219]', 224),\n",
       "             ('[unused220]', 225),\n",
       "             ('[unused221]', 226),\n",
       "             ('[unused222]', 227),\n",
       "             ('[unused223]', 228),\n",
       "             ('[unused224]', 229),\n",
       "             ('[unused225]', 230),\n",
       "             ('[unused226]', 231),\n",
       "             ('[unused227]', 232),\n",
       "             ('[unused228]', 233),\n",
       "             ('[unused229]', 234),\n",
       "             ('[unused230]', 235),\n",
       "             ('[unused231]', 236),\n",
       "             ('[unused232]', 237),\n",
       "             ('[unused233]', 238),\n",
       "             ('[unused234]', 239),\n",
       "             ('[unused235]', 240),\n",
       "             ('[unused236]', 241),\n",
       "             ('[unused237]', 242),\n",
       "             ('[unused238]', 243),\n",
       "             ('[unused239]', 244),\n",
       "             ('[unused240]', 245),\n",
       "             ('[unused241]', 246),\n",
       "             ('[unused242]', 247),\n",
       "             ('[unused243]', 248),\n",
       "             ('[unused244]', 249),\n",
       "             ('[unused245]', 250),\n",
       "             ('[unused246]', 251),\n",
       "             ('[unused247]', 252),\n",
       "             ('[unused248]', 253),\n",
       "             ('[unused249]', 254),\n",
       "             ('[unused250]', 255),\n",
       "             ('[unused251]', 256),\n",
       "             ('[unused252]', 257),\n",
       "             ('[unused253]', 258),\n",
       "             ('[unused254]', 259),\n",
       "             ('[unused255]', 260),\n",
       "             ('[unused256]', 261),\n",
       "             ('[unused257]', 262),\n",
       "             ('[unused258]', 263),\n",
       "             ('[unused259]', 264),\n",
       "             ('[unused260]', 265),\n",
       "             ('[unused261]', 266),\n",
       "             ('[unused262]', 267),\n",
       "             ('[unused263]', 268),\n",
       "             ('[unused264]', 269),\n",
       "             ('[unused265]', 270),\n",
       "             ('[unused266]', 271),\n",
       "             ('[unused267]', 272),\n",
       "             ('[unused268]', 273),\n",
       "             ('[unused269]', 274),\n",
       "             ('[unused270]', 275),\n",
       "             ('[unused271]', 276),\n",
       "             ('[unused272]', 277),\n",
       "             ('[unused273]', 278),\n",
       "             ('[unused274]', 279),\n",
       "             ('[unused275]', 280),\n",
       "             ('[unused276]', 281),\n",
       "             ('[unused277]', 282),\n",
       "             ('[unused278]', 283),\n",
       "             ('[unused279]', 284),\n",
       "             ('[unused280]', 285),\n",
       "             ('[unused281]', 286),\n",
       "             ('[unused282]', 287),\n",
       "             ('[unused283]', 288),\n",
       "             ('[unused284]', 289),\n",
       "             ('[unused285]', 290),\n",
       "             ('[unused286]', 291),\n",
       "             ('[unused287]', 292),\n",
       "             ('[unused288]', 293),\n",
       "             ('[unused289]', 294),\n",
       "             ('[unused290]', 295),\n",
       "             ('[unused291]', 296),\n",
       "             ('[unused292]', 297),\n",
       "             ('[unused293]', 298),\n",
       "             ('[unused294]', 299),\n",
       "             ('[unused295]', 300),\n",
       "             ('[unused296]', 301),\n",
       "             ('[unused297]', 302),\n",
       "             ('[unused298]', 303),\n",
       "             ('[unused299]', 304),\n",
       "             ('[unused300]', 305),\n",
       "             ('[unused301]', 306),\n",
       "             ('[unused302]', 307),\n",
       "             ('[unused303]', 308),\n",
       "             ('[unused304]', 309),\n",
       "             ('[unused305]', 310),\n",
       "             ('[unused306]', 311),\n",
       "             ('[unused307]', 312),\n",
       "             ('[unused308]', 313),\n",
       "             ('[unused309]', 314),\n",
       "             ('[unused310]', 315),\n",
       "             ('[unused311]', 316),\n",
       "             ('[unused312]', 317),\n",
       "             ('[unused313]', 318),\n",
       "             ('[unused314]', 319),\n",
       "             ('[unused315]', 320),\n",
       "             ('[unused316]', 321),\n",
       "             ('[unused317]', 322),\n",
       "             ('[unused318]', 323),\n",
       "             ('[unused319]', 324),\n",
       "             ('[unused320]', 325),\n",
       "             ('[unused321]', 326),\n",
       "             ('[unused322]', 327),\n",
       "             ('[unused323]', 328),\n",
       "             ('[unused324]', 329),\n",
       "             ('[unused325]', 330),\n",
       "             ('[unused326]', 331),\n",
       "             ('[unused327]', 332),\n",
       "             ('[unused328]', 333),\n",
       "             ('[unused329]', 334),\n",
       "             ('[unused330]', 335),\n",
       "             ('[unused331]', 336),\n",
       "             ('[unused332]', 337),\n",
       "             ('[unused333]', 338),\n",
       "             ('[unused334]', 339),\n",
       "             ('[unused335]', 340),\n",
       "             ('[unused336]', 341),\n",
       "             ('[unused337]', 342),\n",
       "             ('[unused338]', 343),\n",
       "             ('[unused339]', 344),\n",
       "             ('[unused340]', 345),\n",
       "             ('[unused341]', 346),\n",
       "             ('[unused342]', 347),\n",
       "             ('[unused343]', 348),\n",
       "             ('[unused344]', 349),\n",
       "             ('[unused345]', 350),\n",
       "             ('[unused346]', 351),\n",
       "             ('[unused347]', 352),\n",
       "             ('[unused348]', 353),\n",
       "             ('[unused349]', 354),\n",
       "             ('[unused350]', 355),\n",
       "             ('[unused351]', 356),\n",
       "             ('[unused352]', 357),\n",
       "             ('[unused353]', 358),\n",
       "             ('[unused354]', 359),\n",
       "             ('[unused355]', 360),\n",
       "             ('[unused356]', 361),\n",
       "             ('[unused357]', 362),\n",
       "             ('[unused358]', 363),\n",
       "             ('[unused359]', 364),\n",
       "             ('[unused360]', 365),\n",
       "             ('[unused361]', 366),\n",
       "             ('[unused362]', 367),\n",
       "             ('[unused363]', 368),\n",
       "             ('[unused364]', 369),\n",
       "             ('[unused365]', 370),\n",
       "             ('[unused366]', 371),\n",
       "             ('[unused367]', 372),\n",
       "             ('[unused368]', 373),\n",
       "             ('[unused369]', 374),\n",
       "             ('[unused370]', 375),\n",
       "             ('[unused371]', 376),\n",
       "             ('[unused372]', 377),\n",
       "             ('[unused373]', 378),\n",
       "             ('[unused374]', 379),\n",
       "             ('[unused375]', 380),\n",
       "             ('[unused376]', 381),\n",
       "             ('[unused377]', 382),\n",
       "             ('[unused378]', 383),\n",
       "             ('[unused379]', 384),\n",
       "             ('[unused380]', 385),\n",
       "             ('[unused381]', 386),\n",
       "             ('[unused382]', 387),\n",
       "             ('[unused383]', 388),\n",
       "             ('[unused384]', 389),\n",
       "             ('[unused385]', 390),\n",
       "             ('[unused386]', 391),\n",
       "             ('[unused387]', 392),\n",
       "             ('[unused388]', 393),\n",
       "             ('[unused389]', 394),\n",
       "             ('[unused390]', 395),\n",
       "             ('[unused391]', 396),\n",
       "             ('[unused392]', 397),\n",
       "             ('[unused393]', 398),\n",
       "             ('[unused394]', 399),\n",
       "             ('[unused395]', 400),\n",
       "             ('[unused396]', 401),\n",
       "             ('[unused397]', 402),\n",
       "             ('[unused398]', 403),\n",
       "             ('[unused399]', 404),\n",
       "             ...\n",
       "             ('!', 999),\n",
       "             ...])"
      ]
     },
     "execution_count": 26,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "vocab"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "OrderedDict([(0, '[PAD]'),\n",
       "             ...\n",
       "             (100, '[UNK]'),\n",
       "             (101, '[CLS]'),\n",
       "             (102, '[SEP]'),\n",
       "             (103, '[MASK]'),\n",
       "             ...\n",
       "             (171, '[unused166]'),\n",
       "             (172, '[unused167]'),\n",
       "             (173, '[unused168]'),\n",
       "             (174, '[unused169]'),\n",
       "             (175, '[unused170]'),\n",
       "             (176, '[unused171]'),\n",
       "             (177, '[unused172]'),\n",
       "             (178, '[unused173]'),\n",
       "             (179, '[unused174]'),\n",
       "             (180, '[unused175]'),\n",
       "             (181, '[unused176]'),\n",
       "             (182, '[unused177]'),\n",
       "             (183, '[unused178]'),\n",
       "             (184, '[unused179]'),\n",
       "             (185, '[unused180]'),\n",
       "             (186, '[unused181]'),\n",
       "             (187, '[unused182]'),\n",
       "             (188, '[unused183]'),\n",
       "             (189, '[unused184]'),\n",
       "             (190, '[unused185]'),\n",
       "             (191, '[unused186]'),\n",
       "             (192, '[unused187]'),\n",
       "             (193, '[unused188]'),\n",
       "             (194, '[unused189]'),\n",
       "             (195, '[unused190]'),\n",
       "             (196, '[unused191]'),\n",
       "             (197, '[unused192]'),\n",
       "             (198, '[unused193]'),\n",
       "             (199, '[unused194]'),\n",
       "             (200, '[unused195]'),\n",
       "             (201, '[unused196]'),\n",
       "             (202, '[unused197]'),\n",
       "             (203, '[unused198]'),\n",
       "             (204, '[unused199]'),\n",
       "             (205, '[unused200]'),\n",
       "             (206, '[unused201]'),\n",
       "             (207, '[unused202]'),\n",
       "             (208, '[unused203]'),\n",
       "             (209, '[unused204]'),\n",
       "             (210, '[unused205]'),\n",
       "             (211, '[unused206]'),\n",
       "             (212, '[unused207]'),\n",
       "             (213, '[unused208]'),\n",
       "             (214, '[unused209]'),\n",
       "             (215, '[unused210]'),\n",
       "             (216, '[unused211]'),\n",
       "             (217, '[unused212]'),\n",
       "             (218, '[unused213]'),\n",
       "             (219, '[unused214]'),\n",
       "             (220, '[unused215]'),\n",
       "             (221, '[unused216]'),\n",
       "             (222, '[unused217]'),\n",
       "             (223, '[unused218]'),\n",
       "             (224, '[unused219]'),\n",
       "             (225, '[unused220]'),\n",
       "             (226, '[unused221]'),\n",
       "             (227, '[unused222]'),\n",
       "             (228, '[unused223]'),\n",
       "             (229, '[unused224]'),\n",
       "             (230, '[unused225]'),\n",
       "             (231, '[unused226]'),\n",
       "             (232, '[unused227]'),\n",
       "             (233, '[unused228]'),\n",
       "             (234, '[unused229]'),\n",
       "             (235, '[unused230]'),\n",
       "             (236, '[unused231]'),\n",
       "             (237, '[unused232]'),\n",
       "             (238, '[unused233]'),\n",
       "             (239, '[unused234]'),\n",
       "             (240, '[unused235]'),\n",
       "             (241, '[unused236]'),\n",
       "             (242, '[unused237]'),\n",
       "             (243, '[unused238]'),\n",
       "             (244, '[unused239]'),\n",
       "             (245, '[unused240]'),\n",
       "             (246, '[unused241]'),\n",
       "             (247, '[unused242]'),\n",
       "             (248, '[unused243]'),\n",
       "             (249, '[unused244]'),\n",
       "             (250, '[unused245]'),\n",
       "             (251, '[unused246]'),\n",
       "             (252, '[unused247]'),\n",
       "             (253, '[unused248]'),\n",
       "             (254, '[unused249]'),\n",
       "             (255, '[unused250]'),\n",
       "             (256, '[unused251]'),\n",
       "             (257, '[unused252]'),\n",
       "             (258, '[unused253]'),\n",
       "             (259, '[unused254]'),\n",
       "             (260, '[unused255]'),\n",
       "             (261, '[unused256]'),\n",
       "             (262, '[unused257]'),\n",
       "             (263, '[unused258]'),\n",
       "             (264, '[unused259]'),\n",
       "             (265, '[unused260]'),\n",
       "             (266, '[unused261]'),\n",
       "             (267, '[unused262]'),\n",
       "             (268, '[unused263]'),\n",
       "             (269, '[unused264]'),\n",
       "             (270, '[unused265]'),\n",
       "             (271, '[unused266]'),\n",
       "             (272, '[unused267]'),\n",
       "             (273, '[unused268]'),\n",
       "             (274, '[unused269]'),\n",
       "             (275, '[unused270]'),\n",
       "             (276, '[unused271]'),\n",
       "             (277, '[unused272]'),\n",
       "             (278, '[unused273]'),\n",
       "             (279, '[unused274]'),\n",
       "             (280, '[unused275]'),\n",
       "             (281, '[unused276]'),\n",
       "             (282, '[unused277]'),\n",
       "             (283, '[unused278]'),\n",
       "             (284, '[unused279]'),\n",
       "             (285, '[unused280]'),\n",
       "             (286, '[unused281]'),\n",
       "             (287, '[unused282]'),\n",
       "             (288, '[unused283]'),\n",
       "             (289, '[unused284]'),\n",
       "             (290, '[unused285]'),\n",
       "             (291, '[unused286]'),\n",
       "             (292, '[unused287]'),\n",
       "             (293, '[unused288]'),\n",
       "             (294, '[unused289]'),\n",
       "             (295, '[unused290]'),\n",
       "             (296, '[unused291]'),\n",
       "             (297, '[unused292]'),\n",
       "             (298, '[unused293]'),\n",
       "             (299, '[unused294]'),\n",
       "             (300, '[unused295]'),\n",
       "             (301, '[unused296]'),\n",
       "             (302, '[unused297]'),\n",
       "             (303, '[unused298]'),\n",
       "             (304, '[unused299]'),\n",
       "             (305, '[unused300]'),\n",
       "             (306, '[unused301]'),\n",
       "             (307, '[unused302]'),\n",
       "             (308, '[unused303]'),\n",
       "             (309, '[unused304]'),\n",
       "             (310, '[unused305]'),\n",
       "             (311, '[unused306]'),\n",
       "             (312, '[unused307]'),\n",
       "             (313, '[unused308]'),\n",
       "             (314, '[unused309]'),\n",
       "             (315, '[unused310]'),\n",
       "             (316, '[unused311]'),\n",
       "             (317, '[unused312]'),\n",
       "             (318, '[unused313]'),\n",
       "             (319, '[unused314]'),\n",
       "             (320, '[unused315]'),\n",
       "             (321, '[unused316]'),\n",
       "             (322, '[unused317]'),\n",
       "             (323, '[unused318]'),\n",
       "             (324, '[unused319]'),\n",
       "             (325, '[unused320]'),\n",
       "             (326, '[unused321]'),\n",
       "             (327, '[unused322]'),\n",
       "             (328, '[unused323]'),\n",
       "             (329, '[unused324]'),\n",
       "             (330, '[unused325]'),\n",
       "             (331, '[unused326]'),\n",
       "             (332, '[unused327]'),\n",
       "             (333, '[unused328]'),\n",
       "             (334, '[unused329]'),\n",
       "             (335, '[unused330]'),\n",
       "             (336, '[unused331]'),\n",
       "             (337, '[unused332]'),\n",
       "             (338, '[unused333]'),\n",
       "             (339, '[unused334]'),\n",
       "             (340, '[unused335]'),\n",
       "             (341, '[unused336]'),\n",
       "             (342, '[unused337]'),\n",
       "             (343, '[unused338]'),\n",
       "             (344, '[unused339]'),\n",
       "             (345, '[unused340]'),\n",
       "             (346, '[unused341]'),\n",
       "             (347, '[unused342]'),\n",
       "             (348, '[unused343]'),\n",
       "             (349, '[unused344]'),\n",
       "             (350, '[unused345]'),\n",
       "             (351, '[unused346]'),\n",
       "             (352, '[unused347]'),\n",
       "             (353, '[unused348]'),\n",
       "             (354, '[unused349]'),\n",
       "             (355, '[unused350]'),\n",
       "             (356, '[unused351]'),\n",
       "             (357, '[unused352]'),\n",
       "             (358, '[unused353]'),\n",
       "             (359, '[unused354]'),\n",
       "             (360, '[unused355]'),\n",
       "             (361, '[unused356]'),\n",
       "             (362, '[unused357]'),\n",
       "             (363, '[unused358]'),\n",
       "             (364, '[unused359]'),\n",
       "             (365, '[unused360]'),\n",
       "             (366, '[unused361]'),\n",
       "             (367, '[unused362]'),\n",
       "             (368, '[unused363]'),\n",
       "             (369, '[unused364]'),\n",
       "             (370, '[unused365]'),\n",
       "             (371, '[unused366]'),\n",
       "             (372, '[unused367]'),\n",
       "             (373, '[unused368]'),\n",
       "             (374, '[unused369]'),\n",
       "             (375, '[unused370]'),\n",
       "             (376, '[unused371]'),\n",
       "             (377, '[unused372]'),\n",
       "             (378, '[unused373]'),\n",
       "             (379, '[unused374]'),\n",
       "             (380, '[unused375]'),\n",
       "             (381, '[unused376]'),\n",
       "             (382, '[unused377]'),\n",
       "             (383, '[unused378]'),\n",
       "             (384, '[unused379]'),\n",
       "             (385, '[unused380]'),\n",
       "             (386, '[unused381]'),\n",
       "             (387, '[unused382]'),\n",
       "             (388, '[unused383]'),\n",
       "             (389, '[unused384]'),\n",
       "             (390, '[unused385]'),\n",
       "             (391, '[unused386]'),\n",
       "             (392, '[unused387]'),\n",
       "             (393, '[unused388]'),\n",
       "             (394, '[unused389]'),\n",
       "             (395, '[unused390]'),\n",
       "             (396, '[unused391]'),\n",
       "             (397, '[unused392]'),\n",
       "             (398, '[unused393]'),\n",
       "             (399, '[unused394]'),\n",
       "             (400, '[unused395]'),\n",
       "             (401, '[unused396]'),\n",
       "             (402, '[unused397]'),\n",
       "             (403, '[unused398]'),\n",
       "             (404, '[unused399]'),\n",
       "             (405, '[unused400]'),\n",
       "             (406, '[unused401]'),\n",
       "             (407, '[unused402]'),\n",
       "             (408, '[unused403]'),\n",
       "             (409, '[unused404]'),\n",
       "             (410, '[unused405]'),\n",
       "             (411, '[unused406]'),\n",
       "             (412, '[unused407]'),\n",
       "             (413, '[unused408]'),\n",
       "             (414, '[unused409]'),\n",
       "             (415, '[unused410]'),\n",
       "             (416, '[unused411]'),\n",
       "             (417, '[unused412]'),\n",
       "             (418, '[unused413]'),\n",
       "             (419, '[unused414]'),\n",
       "             (420, '[unused415]'),\n",
       "             (421, '[unused416]'),\n",
       "             (422, '[unused417]'),\n",
       "             (423, '[unused418]'),\n",
       "             (424, '[unused419]'),\n",
       "             (425, '[unused420]'),\n",
       "             (426, '[unused421]'),\n",
       "             (427, '[unused422]'),\n",
       "             (428, '[unused423]'),\n",
       "             (429, '[unused424]'),\n",
       "             (430, '[unused425]'),\n",
       "             (431, '[unused426]'),\n",
       "             (432, '[unused427]'),\n",
       "             (433, '[unused428]'),\n",
       "             (434, '[unused429]'),\n",
       "             (435, '[unused430]'),\n",
       "             (436, '[unused431]'),\n",
       "             (437, '[unused432]'),\n",
       "             (438, '[unused433]'),\n",
       "             (439, '[unused434]'),\n",
       "             (440, '[unused435]'),\n",
       "             (441, '[unused436]'),\n",
       "             (442, '[unused437]'),\n",
       "             (443, '[unused438]'),\n",
       "             (444, '[unused439]'),\n",
       "             (445, '[unused440]'),\n",
       "             (446, '[unused441]'),\n",
       "             (447, '[unused442]'),\n",
       "             (448, '[unused443]'),\n",
       "             (449, '[unused444]'),\n",
       "             (450, '[unused445]'),\n",
       "             (451, '[unused446]'),\n",
       "             (452, '[unused447]'),\n",
       "             (453, '[unused448]'),\n",
       "             (454, '[unused449]'),\n",
       "             (455, '[unused450]'),\n",
       "             (456, '[unused451]'),\n",
       "             (457, '[unused452]'),\n",
       "             (458, '[unused453]'),\n",
       "             (459, '[unused454]'),\n",
       "             (460, '[unused455]'),\n",
       "             (461, '[unused456]'),\n",
       "             (462, '[unused457]'),\n",
       "             (463, '[unused458]'),\n",
       "             (464, '[unused459]'),\n",
       "             (465, '[unused460]'),\n",
       "             (466, '[unused461]'),\n",
       "             (467, '[unused462]'),\n",
       "             (468, '[unused463]'),\n",
       "             (469, '[unused464]'),\n",
       "             (470, '[unused465]'),\n",
       "             (471, '[unused466]'),\n",
       "             (472, '[unused467]'),\n",
       "             (473, '[unused468]'),\n",
       "             (474, '[unused469]'),\n",
       "             (475, '[unused470]'),\n",
       "             (476, '[unused471]'),\n",
       "             (477, '[unused472]'),\n",
       "             (478, '[unused473]'),\n",
       "             (479, '[unused474]'),\n",
       "             (480, '[unused475]'),\n",
       "             (481, '[unused476]'),\n",
       "             (482, '[unused477]'),\n",
       "             (483, '[unused478]'),\n",
       "             (484, '[unused479]'),\n",
       "             (485, '[unused480]'),\n",
       "             (486, '[unused481]'),\n",
       "             (487, '[unused482]'),\n",
       "             (488, '[unused483]'),\n",
       "             (489, '[unused484]'),\n",
       "             (490, '[unused485]'),\n",
       "             (491, '[unused486]'),\n",
       "             (492, '[unused487]'),\n",
       "             (493, '[unused488]'),\n",
       "             (494, '[unused489]'),\n",
       "             (495, '[unused490]'),\n",
       "             (496, '[unused491]'),\n",
       "             (497, '[unused492]'),\n",
       "             (498, '[unused493]'),\n",
       "             (499, '[unused494]'),\n",
       "             (500, '[unused495]'),\n",
       "             (501, '[unused496]'),\n",
       "             (502, '[unused497]'),\n",
       "             (503, '[unused498]'),\n",
       "             (504, '[unused499]'),\n",
       "             (505, '[unused500]'),\n",
       "             (506, '[unused501]'),\n",
       "             (507, '[unused502]'),\n",
       "             (508, '[unused503]'),\n",
       "             (509, '[unused504]'),\n",
       "             (510, '[unused505]'),\n",
       "             (511, '[unused506]'),\n",
       "             (512, '[unused507]'),\n",
       "             (513, '[unused508]'),\n",
       "             (514, '[unused509]'),\n",
       "             (515, '[unused510]'),\n",
       "             (516, '[unused511]'),\n",
       "             (517, '[unused512]'),\n",
       "             (518, '[unused513]'),\n",
       "             (519, '[unused514]'),\n",
       "             (520, '[unused515]'),\n",
       "             (521, '[unused516]'),\n",
       "             (522, '[unused517]'),\n",
       "             (523, '[unused518]'),\n",
       "             (524, '[unused519]'),\n",
       "             (525, '[unused520]'),\n",
       "             (526, '[unused521]'),\n",
       "             (527, '[unused522]'),\n",
       "             (528, '[unused523]'),\n",
       "             (529, '[unused524]'),\n",
       "             (530, '[unused525]'),\n",
       "             (531, '[unused526]'),\n",
       "             (532, '[unused527]'),\n",
       "             (533, '[unused528]'),\n",
       "             (534, '[unused529]'),\n",
       "             (535, '[unused530]'),\n",
       "             (536, '[unused531]'),\n",
       "             (537, '[unused532]'),\n",
       "             (538, '[unused533]'),\n",
       "             (539, '[unused534]'),\n",
       "             (540, '[unused535]'),\n",
       "             (541, '[unused536]'),\n",
       "             (542, '[unused537]'),\n",
       "             (543, '[unused538]'),\n",
       "             (544, '[unused539]'),\n",
       "             (545, '[unused540]'),\n",
       "             (546, '[unused541]'),\n",
       "             (547, '[unused542]'),\n",
       "             (548, '[unused543]'),\n",
       "             (549, '[unused544]'),\n",
       "             (550, '[unused545]'),\n",
       "             (551, '[unused546]'),\n",
       "             (552, '[unused547]'),\n",
       "             (553, '[unused548]'),\n",
       "             (554, '[unused549]'),\n",
       "             (555, '[unused550]'),\n",
       "             (556, '[unused551]'),\n",
       "             (557, '[unused552]'),\n",
       "             (558, '[unused553]'),\n",
       "             (559, '[unused554]'),\n",
       "             (560, '[unused555]'),\n",
       "             (561, '[unused556]'),\n",
       "             (562, '[unused557]'),\n",
       "             (563, '[unused558]'),\n",
       "             (564, '[unused559]'),\n",
       "             (565, '[unused560]'),\n",
       "             (566, '[unused561]'),\n",
       "             (567, '[unused562]'),\n",
       "             (568, '[unused563]'),\n",
       "             (569, '[unused564]'),\n",
       "             (570, '[unused565]'),\n",
       "             (571, '[unused566]'),\n",
       "             (572, '[unused567]'),\n",
       "             (573, '[unused568]'),\n",
       "             (574, '[unused569]'),\n",
       "             (575, '[unused570]'),\n",
       "             (576, '[unused571]'),\n",
       "             (577, '[unused572]'),\n",
       "             (578, '[unused573]'),\n",
       "             (579, '[unused574]'),\n",
       "             (580, '[unused575]'),\n",
       "             (581, '[unused576]'),\n",
       "             (582, '[unused577]'),\n",
       "             (583, '[unused578]'),\n",
       "             (584, '[unused579]'),\n",
       "             (585, '[unused580]'),\n",
       "             (586, '[unused581]'),\n",
       "             (587, '[unused582]'),\n",
       "             (588, '[unused583]'),\n",
       "             (589, '[unused584]'),\n",
       "             (590, '[unused585]'),\n",
       "             (591, '[unused586]'),\n",
       "             (592, '[unused587]'),\n",
       "             (593, '[unused588]'),\n",
       "             (594, '[unused589]'),\n",
       "             (595, '[unused590]'),\n",
       "             (596, '[unused591]'),\n",
       "             (597, '[unused592]'),\n",
       "             (598, '[unused593]'),\n",
       "             (599, '[unused594]'),\n",
       "             (600, '[unused595]'),\n",
       "             (601, '[unused596]'),\n",
       "             (602, '[unused597]'),\n",
       "             (603, '[unused598]'),\n",
       "             (604, '[unused599]'),\n",
       "             (605, '[unused600]'),\n",
       "             (606, '[unused601]'),\n",
       "             (607, '[unused602]'),\n",
       "             (608, '[unused603]'),\n",
       "             (609, '[unused604]'),\n",
       "             (610, '[unused605]'),\n",
       "             (611, '[unused606]'),\n",
       "             (612, '[unused607]'),\n",
       "             (613, '[unused608]'),\n",
       "             (614, '[unused609]'),\n",
       "             (615, '[unused610]'),\n",
       "             (616, '[unused611]'),\n",
       "             (617, '[unused612]'),\n",
       "             (618, '[unused613]'),\n",
       "             (619, '[unused614]'),\n",
       "             (620, '[unused615]'),\n",
       "             (621, '[unused616]'),\n",
       "             (622, '[unused617]'),\n",
       "             (623, '[unused618]'),\n",
       "             (624, '[unused619]'),\n",
       "             (625, '[unused620]'),\n",
       "             (626, '[unused621]'),\n",
       "             (627, '[unused622]'),\n",
       "             (628, '[unused623]'),\n",
       "             (629, '[unused624]'),\n",
       "             (630, '[unused625]'),\n",
       "             (631, '[unused626]'),\n",
       "             (632, '[unused627]'),\n",
       "             (633, '[unused628]'),\n",
       "             (634, '[unused629]'),\n",
       "             (635, '[unused630]'),\n",
       "             (636, '[unused631]'),\n",
       "             (637, '[unused632]'),\n",
       "             (638, '[unused633]'),\n",
       "             (639, '[unused634]'),\n",
       "             (640, '[unused635]'),\n",
       "             (641, '[unused636]'),\n",
       "             (642, '[unused637]'),\n",
       "             (643, '[unused638]'),\n",
       "             (644, '[unused639]'),\n",
       "             (645, '[unused640]'),\n",
       "             (646, '[unused641]'),\n",
       "             (647, '[unused642]'),\n",
       "             (648, '[unused643]'),\n",
       "             (649, '[unused644]'),\n",
       "             (650, '[unused645]'),\n",
       "             (651, '[unused646]'),\n",
       "             (652, '[unused647]'),\n",
       "             (653, '[unused648]'),\n",
       "             (654, '[unused649]'),\n",
       "             (655, '[unused650]'),\n",
       "             (656, '[unused651]'),\n",
       "             (657, '[unused652]'),\n",
       "             (658, '[unused653]'),\n",
       "             (659, '[unused654]'),\n",
       "             (660, '[unused655]'),\n",
       "             (661, '[unused656]'),\n",
       "             (662, '[unused657]'),\n",
       "             (663, '[unused658]'),\n",
       "             (664, '[unused659]'),\n",
       "             (665, '[unused660]'),\n",
       "             (666, '[unused661]'),\n",
       "             (667, '[unused662]'),\n",
       "             (668, '[unused663]'),\n",
       "             (669, '[unused664]'),\n",
       "             (670, '[unused665]'),\n",
       "             (671, '[unused666]'),\n",
       "             (672, '[unused667]'),\n",
       "             (673, '[unused668]'),\n",
       "             (674, '[unused669]'),\n",
       "             (675, '[unused670]'),\n",
       "             (676, '[unused671]'),\n",
       "             (677, '[unused672]'),\n",
       "             (678, '[unused673]'),\n",
       "             (679, '[unused674]'),\n",
       "             (680, '[unused675]'),\n",
       "             (681, '[unused676]'),\n",
       "             (682, '[unused677]'),\n",
       "             (683, '[unused678]'),\n",
       "             (684, '[unused679]'),\n",
       "             (685, '[unused680]'),\n",
       "             (686, '[unused681]'),\n",
       "             (687, '[unused682]'),\n",
       "             (688, '[unused683]'),\n",
       "             (689, '[unused684]'),\n",
       "             (690, '[unused685]'),\n",
       "             (691, '[unused686]'),\n",
       "             (692, '[unused687]'),\n",
       "             (693, '[unused688]'),\n",
       "             (694, '[unused689]'),\n",
       "             (695, '[unused690]'),\n",
       "             (696, '[unused691]'),\n",
       "             (697, '[unused692]'),\n",
       "             (698, '[unused693]'),\n",
       "             (699, '[unused694]'),\n",
       "             (700, '[unused695]'),\n",
       "             (701, '[unused696]'),\n",
       "             (702, '[unused697]'),\n",
       "             (703, '[unused698]'),\n",
       "             (704, '[unused699]'),\n",
       "             (705, '[unused700]'),\n",
       "             (706, '[unused701]'),\n",
       "             (707, '[unused702]'),\n",
       "             (708, '[unused703]'),\n",
       "             (709, '[unused704]'),\n",
       "             (710, '[unused705]'),\n",
       "             (711, '[unused706]'),\n",
       "             (712, '[unused707]'),\n",
       "             (713, '[unused708]'),\n",
       "             (714, '[unused709]'),\n",
       "             (715, '[unused710]'),\n",
       "             (716, '[unused711]'),\n",
       "             (717, '[unused712]'),\n",
       "             (718, '[unused713]'),\n",
       "             (719, '[unused714]'),\n",
       "             (720, '[unused715]'),\n",
       "             (721, '[unused716]'),\n",
       "             (722, '[unused717]'),\n",
       "             (723, '[unused718]'),\n",
       "             (724, '[unused719]'),\n",
       "             (725, '[unused720]'),\n",
       "             (726, '[unused721]'),\n",
       "             (727, '[unused722]'),\n",
       "             (728, '[unused723]'),\n",
       "             (729, '[unused724]'),\n",
       "             (730, '[unused725]'),\n",
       "             (731, '[unused726]'),\n",
       "             (732, '[unused727]'),\n",
       "             (733, '[unused728]'),\n",
       "             (734, '[unused729]'),\n",
       "             (735, '[unused730]'),\n",
       "             (736, '[unused731]'),\n",
       "             (737, '[unused732]'),\n",
       "             (738, '[unused733]'),\n",
       "             (739, '[unused734]'),\n",
       "             (740, '[unused735]'),\n",
       "             (741, '[unused736]'),\n",
       "             (742, '[unused737]'),\n",
       "             (743, '[unused738]'),\n",
       "             (744, '[unused739]'),\n",
       "             (745, '[unused740]'),\n",
       "             (746, '[unused741]'),\n",
       "             (747, '[unused742]'),\n",
       "             (748, '[unused743]'),\n",
       "             (749, '[unused744]'),\n",
       "             (750, '[unused745]'),\n",
       "             (751, '[unused746]'),\n",
       "             (752, '[unused747]'),\n",
       "             (753, '[unused748]'),\n",
       "             (754, '[unused749]'),\n",
       "             (755, '[unused750]'),\n",
       "             (756, '[unused751]'),\n",
       "             (757, '[unused752]'),\n",
       "             (758, '[unused753]'),\n",
       "             (759, '[unused754]'),\n",
       "             (760, '[unused755]'),\n",
       "             (761, '[unused756]'),\n",
       "             (762, '[unused757]'),\n",
       "             (763, '[unused758]'),\n",
       "             (764, '[unused759]'),\n",
       "             (765, '[unused760]'),\n",
       "             (766, '[unused761]'),\n",
       "             (767, '[unused762]'),\n",
       "             (768, '[unused763]'),\n",
       "             (769, '[unused764]'),\n",
       "             (770, '[unused765]'),\n",
       "             (771, '[unused766]'),\n",
       "             (772, '[unused767]'),\n",
       "             (773, '[unused768]'),\n",
       "             (774, '[unused769]'),\n",
       "             (775, '[unused770]'),\n",
       "             (776, '[unused771]'),\n",
       "             (777, '[unused772]'),\n",
       "             (778, '[unused773]'),\n",
       "             (779, '[unused774]'),\n",
       "             (780, '[unused775]'),\n",
       "             (781, '[unused776]'),\n",
       "             (782, '[unused777]'),\n",
       "             (783, '[unused778]'),\n",
       "             (784, '[unused779]'),\n",
       "             (785, '[unused780]'),\n",
       "             (786, '[unused781]'),\n",
       "             (787, '[unused782]'),\n",
       "             (788, '[unused783]'),\n",
       "             (789, '[unused784]'),\n",
       "             (790, '[unused785]'),\n",
       "             (791, '[unused786]'),\n",
       "             (792, '[unused787]'),\n",
       "             (793, '[unused788]'),\n",
       "             (794, '[unused789]'),\n",
       "             (795, '[unused790]'),\n",
       "             (796, '[unused791]'),\n",
       "             (797, '[unused792]'),\n",
       "             (798, '[unused793]'),\n",
       "             (799, '[unused794]'),\n",
       "             (800, '[unused795]'),\n",
       "             (801, '[unused796]'),\n",
       "             (802, '[unused797]'),\n",
       "             (803, '[unused798]'),\n",
       "             (804, '[unused799]'),\n",
       "             (805, '[unused800]'),\n",
       "             (806, '[unused801]'),\n",
       "             (807, '[unused802]'),\n",
       "             (808, '[unused803]'),\n",
       "             (809, '[unused804]'),\n",
       "             (810, '[unused805]'),\n",
       "             (811, '[unused806]'),\n",
       "             (812, '[unused807]'),\n",
       "             (813, '[unused808]'),\n",
       "             (814, '[unused809]'),\n",
       "             (815, '[unused810]'),\n",
       "             (816, '[unused811]'),\n",
       "             (817, '[unused812]'),\n",
       "             (818, '[unused813]'),\n",
       "             (819, '[unused814]'),\n",
       "             (820, '[unused815]'),\n",
       "             (821, '[unused816]'),\n",
       "             (822, '[unused817]'),\n",
       "             (823, '[unused818]'),\n",
       "             (824, '[unused819]'),\n",
       "             (825, '[unused820]'),\n",
       "             (826, '[unused821]'),\n",
       "             (827, '[unused822]'),\n",
       "             (828, '[unused823]'),\n",
       "             (829, '[unused824]'),\n",
       "             (830, '[unused825]'),\n",
       "             (831, '[unused826]'),\n",
       "             (832, '[unused827]'),\n",
       "             (833, '[unused828]'),\n",
       "             (834, '[unused829]'),\n",
       "             (835, '[unused830]'),\n",
       "             (836, '[unused831]'),\n",
       "             (837, '[unused832]'),\n",
       "             (838, '[unused833]'),\n",
       "             (839, '[unused834]'),\n",
       "             (840, '[unused835]'),\n",
       "             (841, '[unused836]'),\n",
       "             (842, '[unused837]'),\n",
       "             (843, '[unused838]'),\n",
       "             (844, '[unused839]'),\n",
       "             (845, '[unused840]'),\n",
       "             (846, '[unused841]'),\n",
       "             (847, '[unused842]'),\n",
       "             (848, '[unused843]'),\n",
       "             (849, '[unused844]'),\n",
       "             (850, '[unused845]'),\n",
       "             (851, '[unused846]'),\n",
       "             (852, '[unused847]'),\n",
       "             (853, '[unused848]'),\n",
       "             (854, '[unused849]'),\n",
       "             (855, '[unused850]'),\n",
       "             (856, '[unused851]'),\n",
       "             (857, '[unused852]'),\n",
       "             (858, '[unused853]'),\n",
       "             (859, '[unused854]'),\n",
       "             (860, '[unused855]'),\n",
       "             (861, '[unused856]'),\n",
       "             (862, '[unused857]'),\n",
       "             (863, '[unused858]'),\n",
       "             (864, '[unused859]'),\n",
       "             (865, '[unused860]'),\n",
       "             (866, '[unused861]'),\n",
       "             (867, '[unused862]'),\n",
       "             (868, '[unused863]'),\n",
       "             (869, '[unused864]'),\n",
       "             (870, '[unused865]'),\n",
       "             (871, '[unused866]'),\n",
       "             (872, '[unused867]'),\n",
       "             (873, '[unused868]'),\n",
       "             (874, '[unused869]'),\n",
       "             (875, '[unused870]'),\n",
       "             (876, '[unused871]'),\n",
       "             (877, '[unused872]'),\n",
       "             (878, '[unused873]'),\n",
       "             (879, '[unused874]'),\n",
       "             (880, '[unused875]'),\n",
       "             (881, '[unused876]'),\n",
       "             (882, '[unused877]'),\n",
       "             (883, '[unused878]'),\n",
       "             (884, '[unused879]'),\n",
       "             (885, '[unused880]'),\n",
       "             (886, '[unused881]'),\n",
       "             (887, '[unused882]'),\n",
       "             (888, '[unused883]'),\n",
       "             (889, '[unused884]'),\n",
       "             (890, '[unused885]'),\n",
       "             (891, '[unused886]'),\n",
       "             (892, '[unused887]'),\n",
       "             (893, '[unused888]'),\n",
       "             (894, '[unused889]'),\n",
       "             (895, '[unused890]'),\n",
       "             (896, '[unused891]'),\n",
       "             (897, '[unused892]'),\n",
       "             (898, '[unused893]'),\n",
       "             (899, '[unused894]'),\n",
       "             (900, '[unused895]'),\n",
       "             (901, '[unused896]'),\n",
       "             (902, '[unused897]'),\n",
       "             (903, '[unused898]'),\n",
       "             (904, '[unused899]'),\n",
       "             (905, '[unused900]'),\n",
       "             (906, '[unused901]'),\n",
       "             (907, '[unused902]'),\n",
       "             (908, '[unused903]'),\n",
       "             (909, '[unused904]'),\n",
       "             (910, '[unused905]'),\n",
       "             (911, '[unused906]'),\n",
       "             (912, '[unused907]'),\n",
       "             (913, '[unused908]'),\n",
       "             (914, '[unused909]'),\n",
       "             (915, '[unused910]'),\n",
       "             (916, '[unused911]'),\n",
       "             (917, '[unused912]'),\n",
       "             (918, '[unused913]'),\n",
       "             (919, '[unused914]'),\n",
       "             (920, '[unused915]'),\n",
       "             (921, '[unused916]'),\n",
       "             (922, '[unused917]'),\n",
       "             (923, '[unused918]'),\n",
       "             (924, '[unused919]'),\n",
       "             (925, '[unused920]'),\n",
       "             (926, '[unused921]'),\n",
       "             (927, '[unused922]'),\n",
       "             (928, '[unused923]'),\n",
       "             (929, '[unused924]'),\n",
       "             (930, '[unused925]'),\n",
       "             (931, '[unused926]'),\n",
       "             (932, '[unused927]'),\n",
       "             (933, '[unused928]'),\n",
       "             (934, '[unused929]'),\n",
       "             (935, '[unused930]'),\n",
       "             (936, '[unused931]'),\n",
       "             (937, '[unused932]'),\n",
       "             (938, '[unused933]'),\n",
       "             (939, '[unused934]'),\n",
       "             (940, '[unused935]'),\n",
       "             (941, '[unused936]'),\n",
       "             (942, '[unused937]'),\n",
       "             (943, '[unused938]'),\n",
       "             (944, '[unused939]'),\n",
       "             (945, '[unused940]'),\n",
       "             (946, '[unused941]'),\n",
       "             (947, '[unused942]'),\n",
       "             (948, '[unused943]'),\n",
       "             (949, '[unused944]'),\n",
       "             (950, '[unused945]'),\n",
       "             (951, '[unused946]'),\n",
       "             (952, '[unused947]'),\n",
       "             (953, '[unused948]'),\n",
       "             (954, '[unused949]'),\n",
       "             (955, '[unused950]'),\n",
       "             (956, '[unused951]'),\n",
       "             (957, '[unused952]'),\n",
       "             (958, '[unused953]'),\n",
       "             (959, '[unused954]'),\n",
       "             (960, '[unused955]'),\n",
       "             (961, '[unused956]'),\n",
       "             (962, '[unused957]'),\n",
       "             (963, '[unused958]'),\n",
       "             (964, '[unused959]'),\n",
       "             (965, '[unused960]'),\n",
       "             (966, '[unused961]'),\n",
       "             (967, '[unused962]'),\n",
       "             (968, '[unused963]'),\n",
       "             (969, '[unused964]'),\n",
       "             (970, '[unused965]'),\n",
       "             (971, '[unused966]'),\n",
       "             (972, '[unused967]'),\n",
       "             (973, '[unused968]'),\n",
       "             (974, '[unused969]'),\n",
       "             (975, '[unused970]'),\n",
       "             (976, '[unused971]'),\n",
       "             (977, '[unused972]'),\n",
       "             (978, '[unused973]'),\n",
       "             (979, '[unused974]'),\n",
       "             (980, '[unused975]'),\n",
       "             (981, '[unused976]'),\n",
       "             (982, '[unused977]'),\n",
       "             (983, '[unused978]'),\n",
       "             (984, '[unused979]'),\n",
       "             (985, '[unused980]'),\n",
       "             (986, '[unused981]'),\n",
       "             (987, '[unused982]'),\n",
       "             (988, '[unused983]'),\n",
       "             (989, '[unused984]'),\n",
       "             (990, '[unused985]'),\n",
       "             (991, '[unused986]'),\n",
       "             (992, '[unused987]'),\n",
       "             (993, '[unused988]'),\n",
       "             (994, '[unused989]'),\n",
       "             (995, '[unused990]'),\n",
       "             (996, '[unused991]'),\n",
       "             (997, '[unused992]'),\n",
       "             (998, '[unused993]'),\n",
       "             (999, '!'),\n",
       "             ...])"
      ]
     },
     "execution_count": 27,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "ids_to_tokens"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {},
   "outputs": [],
   "source": [
    "from utils.tokenizer import BasicTokenizer, WordpieceTokenizer\n",
    "\n",
    "# BasicTokenizer, WordpieceTokenizerは、引用文献[2]そのままです\n",
    "# https://github.com/huggingface/pytorch-pretrained-BERT/blob/master/pytorch_pretrained_bert/tokenization.py\n",
    "# これらはsub-wordで単語分割を行うクラスになります。\n",
    "\n",
    "\n",
    "class BertTokenizer(object):\n",
    "    '''BERT用の文章の単語分割クラスを実装'''\n",
    "\n",
    "    def __init__(self, vocab_file, do_lower_case=True):\n",
    "        '''\n",
    "        vocab_file：ボキャブラリーへのパス\n",
    "        do_lower_case：前処理で単語を小文字化するかどうか\n",
    "        '''\n",
    "\n",
    "        # ボキャブラリーのロード\n",
    "        self.vocab, self.ids_to_tokens = load_vocab(vocab_file)\n",
    "\n",
     "        # 分割処理の関数をフォルダ「utils」からimport、sub-wordで単語分割を行う\n",
    "        never_split = (\"[UNK]\", \"[SEP]\", \"[PAD]\", \"[CLS]\", \"[MASK]\")\n",
    "        # (注釈)上記の単語は途中で分割させない。これで一つの単語とみなす\n",
    "\n",
    "        self.basic_tokenizer = BasicTokenizer(do_lower_case=do_lower_case,\n",
    "                                              never_split=never_split)\n",
    "        self.wordpiece_tokenizer = WordpieceTokenizer(vocab=self.vocab)\n",
    "\n",
    "    def tokenize(self, text):\n",
    "        '''文章を単語に分割する関数'''\n",
    "        split_tokens = []  # 分割後の単語たち\n",
    "        for token in self.basic_tokenizer.tokenize(text):\n",
    "            for sub_token in self.wordpiece_tokenizer.tokenize(token):\n",
    "                split_tokens.append(sub_token)\n",
    "        return split_tokens\n",
    "\n",
    "    def convert_tokens_to_ids(self, tokens):\n",
    "        \"\"\"分割された単語リストをIDに変換する関数\"\"\"\n",
    "        ids = []\n",
    "        for token in tokens:\n",
    "            ids.append(self.vocab[token])\n",
    "\n",
    "        return ids\n",
    "\n",
    "    def convert_ids_to_tokens(self, ids):\n",
    "        \"\"\"IDを単語に変換する関数\"\"\"\n",
    "        tokens = []\n",
    "        for i in ids:\n",
    "            tokens.append(self.ids_to_tokens[i])\n",
    "        return tokens\n"
   ]
  },
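WordpieceTokenizerによるsub-word分割の動作イメージを、最長一致の貪欲法で書いた最小スケッチで確認しておきます（引用文献[2]の実装そのものではなく、関数名とvocabは説明用の仮のものです）。

```python
def simple_wordpiece(token, vocab, unk="[UNK]"):
    """tokenをvocabに存在する最長のsub-wordへ貪欲に分割する（説明用の最小スケッチ）"""
    sub_tokens = []
    start = 0
    while start < len(token):
        end = len(token)
        cur = None
        # 末尾から縮めながら、vocabに存在する最長の部分文字列を探す
        while start < end:
            piece = token[start:end]
            if start > 0:
                piece = "##" + piece  # 単語の途中から始まる断片には##を付ける
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return [unk]  # どのsub-wordにも一致しなければ未知語とする
        sub_tokens.append(cur)
        start = end
    return sub_tokens


# 説明用の仮のボキャブラリーでの使用例
vocab = {"access", "##ed", "bank", "the"}
print(simple_wordpiece("accessed", vocab))  # ['access', '##ed']
```

ボキャブラリーにない長い単語も、既知のsub-wordの組み合わせに分解されるため、未知語を減らせるのがsub-word分割の利点です。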
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Bankの文脈による意味変化を単語ベクトルとして求める"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "['[CLS]', 'i', 'accessed', 'the', 'bank', 'account', '.', '[SEP]']\n"
     ]
    }
   ],
   "source": [
    "# 文章1：銀行口座にアクセスしました。\n",
    "text_1 = \"[CLS] I accessed the bank account. [SEP]\"\n",
    "\n",
    "# 文章2：彼は敷金を銀行口座に振り込みました。\n",
    "text_2 = \"[CLS] He transferred the deposit money into the bank account. [SEP]\"\n",
    "\n",
    "# 文章3：川岸でサッカーをします。\n",
    "text_3 = \"[CLS] We play soccer at the bank of the river. [SEP]\"\n",
    "\n",
    "# 単語分割Tokenizerを用意\n",
    "tokenizer = BertTokenizer(\n",
    "    vocab_file=\"./vocab/bert-base-uncased-vocab.txt\", do_lower_case=True)\n",
    "\n",
    "# 文章を単語分割\n",
    "tokenized_text_1 = tokenizer.tokenize(text_1)\n",
    "tokenized_text_2 = tokenizer.tokenize(text_2)\n",
    "tokenized_text_3 = tokenizer.tokenize(text_3)\n",
    "\n",
    "# 確認\n",
    "print(tokenized_text_1)\n"
   ]
  },
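上の出力のとおり、BasicTokenizerの段階では特殊トークンを保持したまま小文字化と句読点の分離が行われます。動作イメージを示す最小スケッチです（実際のBasicTokenizerの実装そのものではありません）。

```python
import re


def simple_basic_tokenize(text, never_split=("[CLS]", "[SEP]")):
    """空白で区切り、特殊トークン以外は小文字化して句読点を分離する
    （BasicTokenizerの動作イメージを示す説明用スケッチ）"""
    tokens = []
    for word in text.split():
        if word in never_split:
            tokens.append(word)  # 特殊トークンは分割せず1トークンとして扱う
        else:
            # 小文字化したうえで、英数字の並びと句読点とに分割する
            tokens.extend(re.findall(r"[a-z0-9]+|[^\sa-z0-9]", word.lower()))
    return tokens


print(simple_basic_tokenize("[CLS] I accessed the bank account. [SEP]"))
# ['[CLS]', 'i', 'accessed', 'the', 'bank', 'account', '.', '[SEP]']
```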
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([[  101,  1045, 11570,  1996,  2924,  4070,  1012,   102]])\n"
     ]
    }
   ],
   "source": [
    "# 単語をIDに変換する\n",
    "indexed_tokens_1 = tokenizer.convert_tokens_to_ids(tokenized_text_1)\n",
    "indexed_tokens_2 = tokenizer.convert_tokens_to_ids(tokenized_text_2)\n",
    "indexed_tokens_3 = tokenizer.convert_tokens_to_ids(tokenized_text_3)\n",
    "\n",
    "# 各文章のbankの位置\n",
    "bank_posi_1 = np.where(np.array(tokenized_text_1) == \"bank\")[0][0]  # 4\n",
    "bank_posi_2 = np.where(np.array(tokenized_text_2) == \"bank\")[0][0]  # 8\n",
    "bank_posi_3 = np.where(np.array(tokenized_text_3) == \"bank\")[0][0]  # 6\n",
    "\n",
     "# token_type_ids（各単語が1文目か2文目かを示すID）は今回は使用しないので用意しない\n",
    "\n",
    "# リストをPyTorchのテンソルに\n",
    "tokens_tensor_1 = torch.tensor([indexed_tokens_1])\n",
    "tokens_tensor_2 = torch.tensor([indexed_tokens_2])\n",
    "tokens_tensor_3 = torch.tensor([indexed_tokens_3])\n",
    "\n",
    "# bankの単語id\n",
    "bank_word_id = tokenizer.convert_tokens_to_ids([\"bank\"])[0]\n",
    "\n",
    "# 確認\n",
    "print(tokens_tensor_1)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 文章をBERTで処理\n",
    "with torch.no_grad():\n",
    "    encoded_layers_1, _ = net(tokens_tensor_1, output_all_encoded_layers=True)\n",
    "    encoded_layers_2, _ = net(tokens_tensor_2, output_all_encoded_layers=True)\n",
    "    encoded_layers_3, _ = net(tokens_tensor_3, output_all_encoded_layers=True)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {},
   "outputs": [],
   "source": [
    "# bankの初期の単語ベクトル表現\n",
     "# Embeddingsモジュールの重みから取り出した、単語bankのidに対応する単語ベクトルなので、3文で共通している\n",
    "bank_vector_0 = net.embeddings.word_embeddings.weight[bank_word_id]\n",
    "\n",
    "# 文章1のBertLayerモジュール1段目から出力されるbankの特徴量ベクトル\n",
    "bank_vector_1_1 = encoded_layers_1[0][0, bank_posi_1]\n",
    "\n",
     "# 文章1のBertLayerモジュール最終12段目から出力されるbankの特徴量ベクトル\n",
    "bank_vector_1_12 = encoded_layers_1[11][0, bank_posi_1]\n",
    "\n",
    "# 文章2、3も同様に\n",
    "bank_vector_2_1 = encoded_layers_2[0][0, bank_posi_2]\n",
    "bank_vector_2_12 = encoded_layers_2[11][0, bank_posi_2]\n",
    "bank_vector_3_1 = encoded_layers_3[0][0, bank_posi_3]\n",
    "bank_vector_3_12 = encoded_layers_3[11][0, bank_posi_3]\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "bankの初期ベクトル と 文章1の1段目のbankの類似度： tensor(0.6814, grad_fn=<DivBackward0>)\n",
      "bankの初期ベクトル と 文章1の12段目のbankの類似度： tensor(0.2276, grad_fn=<DivBackward0>)\n",
       "文章1の1段目のbank と 文章2の1段目のbankの類似度： tensor(0.8968)\n",
       "文章1の1段目のbank と 文章3の1段目のbankの類似度： tensor(0.7584)\n",
       "文章1の12段目のbank と 文章2の12段目のbankの類似度： tensor(0.8796)\n",
       "文章1の12段目のbank と 文章3の12段目のbankの類似度： tensor(0.4814)\n"
     ]
    }
   ],
   "source": [
    "# コサイン類似度を計算\n",
    "import torch.nn.functional as F\n",
    "\n",
    "print(\"bankの初期ベクトル と 文章1の1段目のbankの類似度：\",\n",
    "      F.cosine_similarity(bank_vector_0, bank_vector_1_1, dim=0))\n",
    "print(\"bankの初期ベクトル と 文章1の12段目のbankの類似度：\",\n",
    "      F.cosine_similarity(bank_vector_0, bank_vector_1_12, dim=0))\n",
    "\n",
     "print(\"文章1の1段目のbank と 文章2の1段目のbankの類似度：\",\n",
     "      F.cosine_similarity(bank_vector_1_1, bank_vector_2_1, dim=0))\n",
     "print(\"文章1の1段目のbank と 文章3の1段目のbankの類似度：\",\n",
     "      F.cosine_similarity(bank_vector_1_1, bank_vector_3_1, dim=0))\n",
     "\n",
     "print(\"文章1の12段目のbank と 文章2の12段目のbankの類似度：\",\n",
     "      F.cosine_similarity(bank_vector_1_12, bank_vector_2_12, dim=0))\n",
     "print(\"文章1の12段目のbank と 文章3の12段目のbankの類似度：\",\n",
     "      F.cosine_similarity(bank_vector_1_12, bank_vector_3_12, dim=0))\n"
   ]
  },
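ここで使っているコサイン類似度は、内積をノルムの積で割った cos = a・b / (|a||b|) です。定義どおりに書くと次のようになります（入力は説明用の仮のベクトルです）。

```python
import math


def cosine_similarity(a, b):
    """コサイン類似度 cos = a・b / (|a||b|) を定義どおりに計算する"""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# 同じ向きのベクトルは1に、直交するベクトルは0になる
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # ほぼ1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

値が1に近いほど2つのベクトルの向き（＝ここでは単語の意味）が近いことを表します。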
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "ここまでの内容をフォルダ「utils」のbert.pyに別途保存しておき、次節からはこちらから読み込むようにします"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 付録"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 事前学習課題用のモジュールを実装"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertPreTrainingHeads(nn.Module):\n",
    "    def __init__(self, config):\n",
    "        '''BERTの事前学習課題を行うアダプターモジュール'''\n",
    "        super(BertPreTrainingHeads, self).__init__()\n",
    "\n",
    "        # 事前学習課題：Masked Language Model用のモジュール\n",
    "        self.predictions = MaskedWordPredictions(config)\n",
    "\n",
    "        # 事前学習課題：Next Sentence Prediction用のモジュール\n",
    "        self.seq_relationship = SeqRelationship(config, out_features=2)\n",
    "\n",
    "    def forward(self, sequence_output, pooled_output):\n",
    "        '''入力情報\n",
    "        sequence_output:[batch_size, seq_len, hidden_size]\n",
    "        pooled_output:[batch_size, hidden_size]\n",
    "        '''\n",
    "        # 入力のマスクされた各単語がどの単語かを判定\n",
     "        # 出力 [batch_size, seq_len, vocab_size]\n",
    "        prediction_scores = self.predictions(sequence_output)\n",
    "\n",
    "        # 先頭単語の特徴量から1文目と2文目がつながっているかを判定\n",
    "        seq_relationship_score = self.seq_relationship(\n",
    "            pooled_output)  # 出力 [batch_size, 2]\n",
    "\n",
    "        return prediction_scores, seq_relationship_score\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 事前学習課題：Masked Language Model用のモジュール\n",
    "\n",
    "\n",
    "class MaskedWordPredictions(nn.Module):\n",
    "    def __init__(self, config):\n",
    "        '''事前学習課題：Masked Language Model用のモジュール\n",
    "        元の[2]の実装では、BertLMPredictionHeadという名前です。\n",
    "        '''\n",
    "        super(MaskedWordPredictions, self).__init__()\n",
    "\n",
    "        # BERTから出力された特徴量を変換するモジュール（入出力のサイズは同じ）\n",
    "        self.transform = BertPredictionHeadTransform(config)\n",
    "\n",
    "        # self.transformの出力から、各位置の単語がどれかを当てる全結合層\n",
    "        self.decoder = nn.Linear(in_features=config.hidden_size,  # 'hidden_size': 768\n",
    "                                 out_features=config.vocab_size,  # 'vocab_size': 30522\n",
    "                                 bias=False)\n",
    "        # バイアス項\n",
    "        self.bias = nn.Parameter(torch.zeros(\n",
    "            config.vocab_size))  # 'vocab_size': 30522\n",
    "\n",
    "    def forward(self, hidden_states):\n",
    "        '''\n",
    "        hidden_states：BERTからの出力[batch_size, seq_len, hidden_size]\n",
    "        '''\n",
    "        # BERTから出力された特徴量を変換\n",
    "        # 出力サイズ：[batch_size, seq_len, hidden_size]\n",
    "        hidden_states = self.transform(hidden_states)\n",
    "\n",
    "        # 各位置の単語がボキャブラリーのどの単語なのかをクラス分類で予測\n",
    "        # 出力サイズ：[batch_size, seq_len, vocab_size]\n",
    "        hidden_states = self.decoder(hidden_states) + self.bias\n",
    "\n",
    "        return hidden_states\n",
    "\n",
    "\n",
    "class BertPredictionHeadTransform(nn.Module):\n",
    "    '''MaskedWordPredictionsにて、BERTからの特徴量を変換するモジュール（入出力のサイズは同じ）'''\n",
    "\n",
    "    def __init__(self, config):\n",
    "        super(BertPredictionHeadTransform, self).__init__()\n",
    "\n",
    "        # 全結合層 'hidden_size': 768\n",
    "        self.dense = nn.Linear(config.hidden_size, config.hidden_size)\n",
    "\n",
    "        # 活性化関数gelu\n",
    "        self.transform_act_fn = gelu\n",
    "\n",
    "        # LayerNormalization\n",
    "        self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12)\n",
    "\n",
    "    def forward(self, hidden_states):\n",
    "        '''hidden_statesはsequence_output:[minibatch, seq_len, hidden_size]'''\n",
    "        # 全結合層で特徴量変換し、活性化関数geluを計算したあと、LayerNormalizationする\n",
    "        hidden_states = self.dense(hidden_states)\n",
    "        hidden_states = self.transform_act_fn(hidden_states)\n",
    "        hidden_states = self.LayerNorm(hidden_states)\n",
    "        return hidden_states\n"
   ]
  },
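BertPredictionHeadTransformで使っている活性化関数geluは、x * 0.5 * (1 + erf(x/√2)) で定義されます。式を確認するためのスカラー版の最小スケッチです（本文のコードが参照するテンソル版のgeluそのものではありません）。

```python
import math


def gelu(x):
    """GELU活性化関数: x * 0.5 * (1 + erf(x / sqrt(2)))
    （スカラー入力で式を確認するための説明用スケッチ）"""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


# gelu(0)=0、大きな正の入力ではほぼ恒等写像、大きな負の入力ではほぼ0になる
print(gelu(0.0))   # 0.0
print(gelu(3.0))   # ほぼ3.0
print(gelu(-3.0))  # ほぼ0.0
```

ReLUと違い0付近で滑らかに変化するのが特徴で、BERTやGPTなどのTransformer系モデルで広く使われています。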
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 事前学習課題：Next Sentence Prediction用のモジュール\n",
    "class SeqRelationship(nn.Module):\n",
    "    def __init__(self, config, out_features):\n",
    "        '''事前学習課題：Next Sentence Prediction用のモジュール\n",
    "        元の引用[2]の実装では、とくにクラスとして用意はしていない。\n",
     "        ただの全結合層だが、役割をわかりやすくするためにクラスとして名前をつけている。\n",
    "        '''\n",
    "        super(SeqRelationship, self).__init__()\n",
    "\n",
    "        # 先頭単語の特徴量から1文目と2文目がつながっているかを判定するクラス分類の全結合層\n",
    "        self.seq_relationship = nn.Linear(config.hidden_size, out_features)\n",
    "\n",
    "    def forward(self, pooled_output):\n",
    "        return self.seq_relationship(pooled_output)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [],
   "source": [
    "class BertForMaskedLM(nn.Module):\n",
    "    '''BERTモデルに、事前学習課題用のアダプターモジュール\n",
    "    BertPreTrainingHeadsをつなげたモデル'''\n",
    "\n",
    "    def __init__(self, config, net_bert):\n",
    "        super(BertForMaskedLM, self).__init__()\n",
    "\n",
    "        # BERTモジュール\n",
    "        self.bert = net_bert  # BERTモデル\n",
    "\n",
    "        # 事前学習課題用のアダプターモジュール\n",
    "        self.cls = BertPreTrainingHeads(config)\n",
    "\n",
    "    def forward(self, input_ids, token_type_ids=None, attention_mask=None):\n",
    "        '''\n",
    "        input_ids： [batch_size, sequence_length]の文章の単語IDの羅列\n",
    "        token_type_ids： [batch_size, sequence_length]の、各単語が1文目なのか、2文目なのかを示すid\n",
    "        attention_mask：Transformerのマスクと同じ働きのマスキングです\n",
    "        '''\n",
    "\n",
    "        # BERTの基本モデル部分の順伝搬\n",
    "        encoded_layers, pooled_output = self.bert(\n",
    "            input_ids, token_type_ids, attention_mask, output_all_encoded_layers=False, attention_show_flg=False)\n",
    "\n",
    "        # 事前学習課題の推論を実施\n",
    "        prediction_scores, seq_relationship_score = self.cls(\n",
    "            encoded_layers, pooled_output)\n",
    "\n",
    "        return prediction_scores, seq_relationship_score\n"
   ]
  },
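BertForMaskedLMの出力prediction_scoresは[batch_size, seq_len, vocab_size]なので、マスクされた位置の単語を復元するには、その位置のスコアに対してボキャブラリー方向のargmaxをとります。スコアを仮の小さなリストに置き換えた最小スケッチです（関数名・数値は説明用の仮のものです）。

```python
def predict_masked_id(scores_at_mask_position):
    """マスク位置のvocab_size次元スコアから、最大スコアの単語IDを返す"""
    return max(range(len(scores_at_mask_position)),
               key=lambda i: scores_at_mask_position[i])


# vocab_size=4とした仮のスコア。ID=1のスコアが最大なので1が予測される
scores = [0.1, 2.5, -1.0, 0.7]
print(predict_masked_id(scores))  # 1
```

得られた単語IDをtokenizer.convert_ids_to_tokensに渡せば、予測された単語文字列が得られます。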
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 学習済みモデルのロード部分を実装します"
   ]
  },
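次のセルの出力にあるとおり、学習済みモデルのパラメータ名「attention.self」は、本実装のモジュール名「attention.selfattn」に読み替えてロードします。キー名の付け替え部分だけを抜き出した最小スケッチです（関数名・変数名は説明用の仮のものです）。

```python
def rename_state_dict_keys(state_dict):
    """学習済みモデルのパラメータ名を本実装のモジュール名に読み替える
    （例：attention.self → attention.selfattn）説明用スケッチ"""
    new_state_dict = {}
    for old_name, value in state_dict.items():
        new_name = old_name.replace("attention.self.", "attention.selfattn.")
        new_state_dict[new_name] = value
    return new_state_dict


# 仮のパラメータ名での使用例
loaded = {"bert.encoder.layer.0.attention.self.query.weight": "dummy"}
print(list(rename_state_dict_keys(loaded).keys())[0])
# bert.encoder.layer.0.attention.selfattn.query.weight
```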
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "bert.embeddings.word_embeddings.weight→bert.embeddings.word_embeddings.weight\n",
      "bert.embeddings.position_embeddings.weight→bert.embeddings.position_embeddings.weight\n",
      "bert.embeddings.token_type_embeddings.weight→bert.embeddings.token_type_embeddings.weight\n",
      "bert.embeddings.LayerNorm.gamma→bert.embeddings.LayerNorm.gamma\n",
      "bert.embeddings.LayerNorm.beta→bert.embeddings.LayerNorm.beta\n",
      "bert.encoder.layer.0.attention.self.query.weight→bert.encoder.layer.0.attention.selfattn.query.weight\n",
      "bert.encoder.layer.0.attention.self.query.bias→bert.encoder.layer.0.attention.selfattn.query.bias\n",
      "bert.encoder.layer.0.attention.self.key.weight→bert.encoder.layer.0.attention.selfattn.key.weight\n",
      "bert.encoder.layer.0.attention.self.key.bias→bert.encoder.layer.0.attention.selfattn.key.bias\n",
      "bert.encoder.layer.0.attention.self.value.weight→bert.encoder.layer.0.attention.selfattn.value.weight\n",
      "bert.encoder.layer.0.attention.self.value.bias→bert.encoder.layer.0.attention.selfattn.value.bias\n",
      "bert.encoder.layer.0.attention.output.dense.weight→bert.encoder.layer.0.attention.output.dense.weight\n",
      "bert.encoder.layer.0.attention.output.dense.bias→bert.encoder.layer.0.attention.output.dense.bias\n",
      "bert.encoder.layer.0.attention.output.LayerNorm.gamma→bert.encoder.layer.0.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.0.attention.output.LayerNorm.beta→bert.encoder.layer.0.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.0.intermediate.dense.weight→bert.encoder.layer.0.intermediate.dense.weight\n",
      "bert.encoder.layer.0.intermediate.dense.bias→bert.encoder.layer.0.intermediate.dense.bias\n",
      "bert.encoder.layer.0.output.dense.weight→bert.encoder.layer.0.output.dense.weight\n",
      "bert.encoder.layer.0.output.dense.bias→bert.encoder.layer.0.output.dense.bias\n",
      "bert.encoder.layer.0.output.LayerNorm.gamma→bert.encoder.layer.0.output.LayerNorm.gamma\n",
      "bert.encoder.layer.0.output.LayerNorm.beta→bert.encoder.layer.0.output.LayerNorm.beta\n",
      "bert.encoder.layer.1.attention.self.query.weight→bert.encoder.layer.1.attention.selfattn.query.weight\n",
      "bert.encoder.layer.1.attention.self.query.bias→bert.encoder.layer.1.attention.selfattn.query.bias\n",
      "bert.encoder.layer.1.attention.self.key.weight→bert.encoder.layer.1.attention.selfattn.key.weight\n",
      "bert.encoder.layer.1.attention.self.key.bias→bert.encoder.layer.1.attention.selfattn.key.bias\n",
      "bert.encoder.layer.1.attention.self.value.weight→bert.encoder.layer.1.attention.selfattn.value.weight\n",
      "bert.encoder.layer.1.attention.self.value.bias→bert.encoder.layer.1.attention.selfattn.value.bias\n",
      "bert.encoder.layer.1.attention.output.dense.weight→bert.encoder.layer.1.attention.output.dense.weight\n",
      "bert.encoder.layer.1.attention.output.dense.bias→bert.encoder.layer.1.attention.output.dense.bias\n",
      "bert.encoder.layer.1.attention.output.LayerNorm.gamma→bert.encoder.layer.1.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.1.attention.output.LayerNorm.beta→bert.encoder.layer.1.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.1.intermediate.dense.weight→bert.encoder.layer.1.intermediate.dense.weight\n",
      "bert.encoder.layer.1.intermediate.dense.bias→bert.encoder.layer.1.intermediate.dense.bias\n",
      "bert.encoder.layer.1.output.dense.weight→bert.encoder.layer.1.output.dense.weight\n",
      "bert.encoder.layer.1.output.dense.bias→bert.encoder.layer.1.output.dense.bias\n",
      "bert.encoder.layer.1.output.LayerNorm.gamma→bert.encoder.layer.1.output.LayerNorm.gamma\n",
      "bert.encoder.layer.1.output.LayerNorm.beta→bert.encoder.layer.1.output.LayerNorm.beta\n",
      "bert.encoder.layer.2.attention.self.query.weight→bert.encoder.layer.2.attention.selfattn.query.weight\n",
      "bert.encoder.layer.2.attention.self.query.bias→bert.encoder.layer.2.attention.selfattn.query.bias\n",
      "bert.encoder.layer.2.attention.self.key.weight→bert.encoder.layer.2.attention.selfattn.key.weight\n",
      "bert.encoder.layer.2.attention.self.key.bias→bert.encoder.layer.2.attention.selfattn.key.bias\n",
      "bert.encoder.layer.2.attention.self.value.weight→bert.encoder.layer.2.attention.selfattn.value.weight\n",
      "bert.encoder.layer.2.attention.self.value.bias→bert.encoder.layer.2.attention.selfattn.value.bias\n",
      "bert.encoder.layer.2.attention.output.dense.weight→bert.encoder.layer.2.attention.output.dense.weight\n",
      "bert.encoder.layer.2.attention.output.dense.bias→bert.encoder.layer.2.attention.output.dense.bias\n",
      "bert.encoder.layer.2.attention.output.LayerNorm.gamma→bert.encoder.layer.2.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.2.attention.output.LayerNorm.beta→bert.encoder.layer.2.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.2.intermediate.dense.weight→bert.encoder.layer.2.intermediate.dense.weight\n",
      "bert.encoder.layer.2.intermediate.dense.bias→bert.encoder.layer.2.intermediate.dense.bias\n",
      "bert.encoder.layer.2.output.dense.weight→bert.encoder.layer.2.output.dense.weight\n",
      "bert.encoder.layer.2.output.dense.bias→bert.encoder.layer.2.output.dense.bias\n",
      "bert.encoder.layer.2.output.LayerNorm.gamma→bert.encoder.layer.2.output.LayerNorm.gamma\n",
      "bert.encoder.layer.2.output.LayerNorm.beta→bert.encoder.layer.2.output.LayerNorm.beta\n",
      "bert.encoder.layer.3.attention.self.query.weight→bert.encoder.layer.3.attention.selfattn.query.weight\n",
      "bert.encoder.layer.3.attention.self.query.bias→bert.encoder.layer.3.attention.selfattn.query.bias\n",
      "bert.encoder.layer.3.attention.self.key.weight→bert.encoder.layer.3.attention.selfattn.key.weight\n",
      "bert.encoder.layer.3.attention.self.key.bias→bert.encoder.layer.3.attention.selfattn.key.bias\n",
      "bert.encoder.layer.3.attention.self.value.weight→bert.encoder.layer.3.attention.selfattn.value.weight\n",
      "bert.encoder.layer.3.attention.self.value.bias→bert.encoder.layer.3.attention.selfattn.value.bias\n",
      "bert.encoder.layer.3.attention.output.dense.weight→bert.encoder.layer.3.attention.output.dense.weight\n",
      "bert.encoder.layer.3.attention.output.dense.bias→bert.encoder.layer.3.attention.output.dense.bias\n",
      "bert.encoder.layer.3.attention.output.LayerNorm.gamma→bert.encoder.layer.3.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.3.attention.output.LayerNorm.beta→bert.encoder.layer.3.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.3.intermediate.dense.weight→bert.encoder.layer.3.intermediate.dense.weight\n",
      "bert.encoder.layer.3.intermediate.dense.bias→bert.encoder.layer.3.intermediate.dense.bias\n",
      "bert.encoder.layer.3.output.dense.weight→bert.encoder.layer.3.output.dense.weight\n",
      "bert.encoder.layer.3.output.dense.bias→bert.encoder.layer.3.output.dense.bias\n",
      "bert.encoder.layer.3.output.LayerNorm.gamma→bert.encoder.layer.3.output.LayerNorm.gamma\n",
      "bert.encoder.layer.3.output.LayerNorm.beta→bert.encoder.layer.3.output.LayerNorm.beta\n",
      "bert.encoder.layer.4.attention.self.query.weight→bert.encoder.layer.4.attention.selfattn.query.weight\n",
      "bert.encoder.layer.4.attention.self.query.bias→bert.encoder.layer.4.attention.selfattn.query.bias\n",
      "bert.encoder.layer.4.attention.self.key.weight→bert.encoder.layer.4.attention.selfattn.key.weight\n",
      "bert.encoder.layer.4.attention.self.key.bias→bert.encoder.layer.4.attention.selfattn.key.bias\n",
      "bert.encoder.layer.4.attention.self.value.weight→bert.encoder.layer.4.attention.selfattn.value.weight\n",
      "bert.encoder.layer.4.attention.self.value.bias→bert.encoder.layer.4.attention.selfattn.value.bias\n",
      "bert.encoder.layer.4.attention.output.dense.weight→bert.encoder.layer.4.attention.output.dense.weight\n",
      "bert.encoder.layer.4.attention.output.dense.bias→bert.encoder.layer.4.attention.output.dense.bias\n",
      "bert.encoder.layer.4.attention.output.LayerNorm.gamma→bert.encoder.layer.4.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.4.attention.output.LayerNorm.beta→bert.encoder.layer.4.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.4.intermediate.dense.weight→bert.encoder.layer.4.intermediate.dense.weight\n",
      "bert.encoder.layer.4.intermediate.dense.bias→bert.encoder.layer.4.intermediate.dense.bias\n",
      "bert.encoder.layer.4.output.dense.weight→bert.encoder.layer.4.output.dense.weight\n",
      "bert.encoder.layer.4.output.dense.bias→bert.encoder.layer.4.output.dense.bias\n",
      "bert.encoder.layer.4.output.LayerNorm.gamma→bert.encoder.layer.4.output.LayerNorm.gamma\n",
      "bert.encoder.layer.4.output.LayerNorm.beta→bert.encoder.layer.4.output.LayerNorm.beta\n",
      "bert.encoder.layer.5.attention.self.query.weight→bert.encoder.layer.5.attention.selfattn.query.weight\n",
      "bert.encoder.layer.5.attention.self.query.bias→bert.encoder.layer.5.attention.selfattn.query.bias\n",
      "bert.encoder.layer.5.attention.self.key.weight→bert.encoder.layer.5.attention.selfattn.key.weight\n",
      "bert.encoder.layer.5.attention.self.key.bias→bert.encoder.layer.5.attention.selfattn.key.bias\n",
      "bert.encoder.layer.5.attention.self.value.weight→bert.encoder.layer.5.attention.selfattn.value.weight\n",
      "bert.encoder.layer.5.attention.self.value.bias→bert.encoder.layer.5.attention.selfattn.value.bias\n",
      "bert.encoder.layer.5.attention.output.dense.weight→bert.encoder.layer.5.attention.output.dense.weight\n",
      "bert.encoder.layer.5.attention.output.dense.bias→bert.encoder.layer.5.attention.output.dense.bias\n",
      "bert.encoder.layer.5.attention.output.LayerNorm.gamma→bert.encoder.layer.5.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.5.attention.output.LayerNorm.beta→bert.encoder.layer.5.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.5.intermediate.dense.weight→bert.encoder.layer.5.intermediate.dense.weight\n",
      "bert.encoder.layer.5.intermediate.dense.bias→bert.encoder.layer.5.intermediate.dense.bias\n",
      "bert.encoder.layer.5.output.dense.weight→bert.encoder.layer.5.output.dense.weight\n",
      "bert.encoder.layer.5.output.dense.bias→bert.encoder.layer.5.output.dense.bias\n",
      "bert.encoder.layer.5.output.LayerNorm.gamma→bert.encoder.layer.5.output.LayerNorm.gamma\n",
      "bert.encoder.layer.5.output.LayerNorm.beta→bert.encoder.layer.5.output.LayerNorm.beta\n",
      "bert.encoder.layer.6.attention.self.query.weight→bert.encoder.layer.6.attention.selfattn.query.weight\n",
      "bert.encoder.layer.6.attention.self.query.bias→bert.encoder.layer.6.attention.selfattn.query.bias\n",
      "bert.encoder.layer.6.attention.self.key.weight→bert.encoder.layer.6.attention.selfattn.key.weight\n",
      "bert.encoder.layer.6.attention.self.key.bias→bert.encoder.layer.6.attention.selfattn.key.bias\n",
      "bert.encoder.layer.6.attention.self.value.weight→bert.encoder.layer.6.attention.selfattn.value.weight\n",
      "bert.encoder.layer.6.attention.self.value.bias→bert.encoder.layer.6.attention.selfattn.value.bias\n",
      "bert.encoder.layer.6.attention.output.dense.weight→bert.encoder.layer.6.attention.output.dense.weight\n",
      "bert.encoder.layer.6.attention.output.dense.bias→bert.encoder.layer.6.attention.output.dense.bias\n",
      "bert.encoder.layer.6.attention.output.LayerNorm.gamma→bert.encoder.layer.6.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.6.attention.output.LayerNorm.beta→bert.encoder.layer.6.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.6.intermediate.dense.weight→bert.encoder.layer.6.intermediate.dense.weight\n",
      "bert.encoder.layer.6.intermediate.dense.bias→bert.encoder.layer.6.intermediate.dense.bias\n",
      "bert.encoder.layer.6.output.dense.weight→bert.encoder.layer.6.output.dense.weight\n",
      "bert.encoder.layer.6.output.dense.bias→bert.encoder.layer.6.output.dense.bias\n",
      "bert.encoder.layer.6.output.LayerNorm.gamma→bert.encoder.layer.6.output.LayerNorm.gamma\n",
      "bert.encoder.layer.6.output.LayerNorm.beta→bert.encoder.layer.6.output.LayerNorm.beta\n",
      "bert.encoder.layer.7.attention.self.query.weight→bert.encoder.layer.7.attention.selfattn.query.weight\n",
      "bert.encoder.layer.7.attention.self.query.bias→bert.encoder.layer.7.attention.selfattn.query.bias\n",
      "bert.encoder.layer.7.attention.self.key.weight→bert.encoder.layer.7.attention.selfattn.key.weight\n",
      "bert.encoder.layer.7.attention.self.key.bias→bert.encoder.layer.7.attention.selfattn.key.bias\n",
      "bert.encoder.layer.7.attention.self.value.weight→bert.encoder.layer.7.attention.selfattn.value.weight\n",
      "bert.encoder.layer.7.attention.self.value.bias→bert.encoder.layer.7.attention.selfattn.value.bias\n",
      "bert.encoder.layer.7.attention.output.dense.weight→bert.encoder.layer.7.attention.output.dense.weight\n",
      "bert.encoder.layer.7.attention.output.dense.bias→bert.encoder.layer.7.attention.output.dense.bias\n",
      "bert.encoder.layer.7.attention.output.LayerNorm.gamma→bert.encoder.layer.7.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.7.attention.output.LayerNorm.beta→bert.encoder.layer.7.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.7.intermediate.dense.weight→bert.encoder.layer.7.intermediate.dense.weight\n",
      "bert.encoder.layer.7.intermediate.dense.bias→bert.encoder.layer.7.intermediate.dense.bias\n",
      "bert.encoder.layer.7.output.dense.weight→bert.encoder.layer.7.output.dense.weight\n",
      "bert.encoder.layer.7.output.dense.bias→bert.encoder.layer.7.output.dense.bias\n",
      "bert.encoder.layer.7.output.LayerNorm.gamma→bert.encoder.layer.7.output.LayerNorm.gamma\n",
      "bert.encoder.layer.7.output.LayerNorm.beta→bert.encoder.layer.7.output.LayerNorm.beta\n",
      "bert.encoder.layer.8.attention.self.query.weight→bert.encoder.layer.8.attention.selfattn.query.weight\n",
      "bert.encoder.layer.8.attention.self.query.bias→bert.encoder.layer.8.attention.selfattn.query.bias\n",
      "bert.encoder.layer.8.attention.self.key.weight→bert.encoder.layer.8.attention.selfattn.key.weight\n",
      "bert.encoder.layer.8.attention.self.key.bias→bert.encoder.layer.8.attention.selfattn.key.bias\n",
      "bert.encoder.layer.8.attention.self.value.weight→bert.encoder.layer.8.attention.selfattn.value.weight\n",
      "bert.encoder.layer.8.attention.self.value.bias→bert.encoder.layer.8.attention.selfattn.value.bias\n",
      "bert.encoder.layer.8.attention.output.dense.weight→bert.encoder.layer.8.attention.output.dense.weight\n",
      "bert.encoder.layer.8.attention.output.dense.bias→bert.encoder.layer.8.attention.output.dense.bias\n",
      "bert.encoder.layer.8.attention.output.LayerNorm.gamma→bert.encoder.layer.8.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.8.attention.output.LayerNorm.beta→bert.encoder.layer.8.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.8.intermediate.dense.weight→bert.encoder.layer.8.intermediate.dense.weight\n",
      "bert.encoder.layer.8.intermediate.dense.bias→bert.encoder.layer.8.intermediate.dense.bias\n",
      "bert.encoder.layer.8.output.dense.weight→bert.encoder.layer.8.output.dense.weight\n",
      "bert.encoder.layer.8.output.dense.bias→bert.encoder.layer.8.output.dense.bias\n",
      "bert.encoder.layer.8.output.LayerNorm.gamma→bert.encoder.layer.8.output.LayerNorm.gamma\n",
      "bert.encoder.layer.8.output.LayerNorm.beta→bert.encoder.layer.8.output.LayerNorm.beta\n",
      "bert.encoder.layer.9.attention.self.query.weight→bert.encoder.layer.9.attention.selfattn.query.weight\n",
      "bert.encoder.layer.9.attention.self.query.bias→bert.encoder.layer.9.attention.selfattn.query.bias\n",
      "bert.encoder.layer.9.attention.self.key.weight→bert.encoder.layer.9.attention.selfattn.key.weight\n",
      "bert.encoder.layer.9.attention.self.key.bias→bert.encoder.layer.9.attention.selfattn.key.bias\n",
      "bert.encoder.layer.9.attention.self.value.weight→bert.encoder.layer.9.attention.selfattn.value.weight\n",
      "bert.encoder.layer.9.attention.self.value.bias→bert.encoder.layer.9.attention.selfattn.value.bias\n",
      "bert.encoder.layer.9.attention.output.dense.weight→bert.encoder.layer.9.attention.output.dense.weight\n",
      "bert.encoder.layer.9.attention.output.dense.bias→bert.encoder.layer.9.attention.output.dense.bias\n",
      "bert.encoder.layer.9.attention.output.LayerNorm.gamma→bert.encoder.layer.9.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.9.attention.output.LayerNorm.beta→bert.encoder.layer.9.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.9.intermediate.dense.weight→bert.encoder.layer.9.intermediate.dense.weight\n",
      "bert.encoder.layer.9.intermediate.dense.bias→bert.encoder.layer.9.intermediate.dense.bias\n",
      "bert.encoder.layer.9.output.dense.weight→bert.encoder.layer.9.output.dense.weight\n",
      "bert.encoder.layer.9.output.dense.bias→bert.encoder.layer.9.output.dense.bias\n",
      "bert.encoder.layer.9.output.LayerNorm.gamma→bert.encoder.layer.9.output.LayerNorm.gamma\n",
      "bert.encoder.layer.9.output.LayerNorm.beta→bert.encoder.layer.9.output.LayerNorm.beta\n",
      "bert.encoder.layer.10.attention.self.query.weight→bert.encoder.layer.10.attention.selfattn.query.weight\n",
      "bert.encoder.layer.10.attention.self.query.bias→bert.encoder.layer.10.attention.selfattn.query.bias\n",
      "bert.encoder.layer.10.attention.self.key.weight→bert.encoder.layer.10.attention.selfattn.key.weight\n",
      "bert.encoder.layer.10.attention.self.key.bias→bert.encoder.layer.10.attention.selfattn.key.bias\n",
      "bert.encoder.layer.10.attention.self.value.weight→bert.encoder.layer.10.attention.selfattn.value.weight\n",
      "bert.encoder.layer.10.attention.self.value.bias→bert.encoder.layer.10.attention.selfattn.value.bias\n",
      "bert.encoder.layer.10.attention.output.dense.weight→bert.encoder.layer.10.attention.output.dense.weight\n",
      "bert.encoder.layer.10.attention.output.dense.bias→bert.encoder.layer.10.attention.output.dense.bias\n",
      "bert.encoder.layer.10.attention.output.LayerNorm.gamma→bert.encoder.layer.10.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.10.attention.output.LayerNorm.beta→bert.encoder.layer.10.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.10.intermediate.dense.weight→bert.encoder.layer.10.intermediate.dense.weight\n",
      "bert.encoder.layer.10.intermediate.dense.bias→bert.encoder.layer.10.intermediate.dense.bias\n",
      "bert.encoder.layer.10.output.dense.weight→bert.encoder.layer.10.output.dense.weight\n",
      "bert.encoder.layer.10.output.dense.bias→bert.encoder.layer.10.output.dense.bias\n",
      "bert.encoder.layer.10.output.LayerNorm.gamma→bert.encoder.layer.10.output.LayerNorm.gamma\n",
      "bert.encoder.layer.10.output.LayerNorm.beta→bert.encoder.layer.10.output.LayerNorm.beta\n",
      "bert.encoder.layer.11.attention.self.query.weight→bert.encoder.layer.11.attention.selfattn.query.weight\n",
      "bert.encoder.layer.11.attention.self.query.bias→bert.encoder.layer.11.attention.selfattn.query.bias\n",
      "bert.encoder.layer.11.attention.self.key.weight→bert.encoder.layer.11.attention.selfattn.key.weight\n",
      "bert.encoder.layer.11.attention.self.key.bias→bert.encoder.layer.11.attention.selfattn.key.bias\n",
      "bert.encoder.layer.11.attention.self.value.weight→bert.encoder.layer.11.attention.selfattn.value.weight\n",
      "bert.encoder.layer.11.attention.self.value.bias→bert.encoder.layer.11.attention.selfattn.value.bias\n",
      "bert.encoder.layer.11.attention.output.dense.weight→bert.encoder.layer.11.attention.output.dense.weight\n",
      "bert.encoder.layer.11.attention.output.dense.bias→bert.encoder.layer.11.attention.output.dense.bias\n",
      "bert.encoder.layer.11.attention.output.LayerNorm.gamma→bert.encoder.layer.11.attention.output.LayerNorm.gamma\n",
      "bert.encoder.layer.11.attention.output.LayerNorm.beta→bert.encoder.layer.11.attention.output.LayerNorm.beta\n",
      "bert.encoder.layer.11.intermediate.dense.weight→bert.encoder.layer.11.intermediate.dense.weight\n",
      "bert.encoder.layer.11.intermediate.dense.bias→bert.encoder.layer.11.intermediate.dense.bias\n",
      "bert.encoder.layer.11.output.dense.weight→bert.encoder.layer.11.output.dense.weight\n",
      "bert.encoder.layer.11.output.dense.bias→bert.encoder.layer.11.output.dense.bias\n",
      "bert.encoder.layer.11.output.LayerNorm.gamma→bert.encoder.layer.11.output.LayerNorm.gamma\n",
      "bert.encoder.layer.11.output.LayerNorm.beta→bert.encoder.layer.11.output.LayerNorm.beta\n",
      "bert.pooler.dense.weight→bert.pooler.dense.weight\n",
      "bert.pooler.dense.bias→bert.pooler.dense.bias\n",
      "cls.predictions.bias→cls.predictions.bias\n",
      "cls.predictions.transform.dense.weight→cls.predictions.transform.dense.weight\n",
      "cls.predictions.transform.dense.bias→cls.predictions.transform.dense.bias\n",
      "cls.predictions.transform.LayerNorm.gamma→cls.predictions.transform.LayerNorm.gamma\n",
      "cls.predictions.transform.LayerNorm.beta→cls.predictions.transform.LayerNorm.beta\n",
      "cls.predictions.decoder.weight→cls.predictions.decoder.weight\n",
      "cls.seq_relationship.weight→cls.seq_relationship.seq_relationship.weight\n",
      "cls.seq_relationship.bias→cls.seq_relationship.seq_relationship.bias\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "IncompatibleKeys(missing_keys=[], unexpected_keys=[])"
      ]
     },
     "execution_count": 39,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# BERTの基本モデル\n",
    "net_bert = BertModel(config)\n",
    "net_bert.eval()\n",
    "\n",
    "# 事前学習課題のアダプターモジュールを搭載したBERT\n",
    "net = BertForMaskedLM(config, net_bert)\n",
    "net.eval()\n",
    "\n",
    "# 学習済みの重みをロード\n",
    "weights_path = \"./weights/pytorch_model.bin\"\n",
    "loaded_state_dict = torch.load(weights_path)\n",
    "\n",
    "\n",
    "# 現在のネットワークモデルのパラメータ名\n",
    "param_names = []  # パラメータの名前を格納していく\n",
    "\n",
    "for name, param in net.named_parameters():\n",
    "    param_names.append(name)\n",
    "\n",
    "\n",
    "# 現在のネットワークの情報をコピーして新たなstate_dictを作成\n",
    "new_state_dict = net.state_dict().copy()\n",
    "\n",
    "# 新たなstate_dictに学習済みの値を代入\n",
    "for index, (key_name, value) in enumerate(loaded_state_dict.items()):\n",
    "    name = param_names[index]  # 現在のネットワークでのパラメータ名を取得\n",
    "    new_state_dict[name] = value  # 値を入れる\n",
    "    print(str(key_name)+\"→\"+str(name))  # 何から何に入ったかを表示\n",
    "\n",
    "    # 現在のネットワークのパラメータを全部ロードしたら終える\n",
    "    if index+1 >= len(param_names):\n",
    "        break\n",
    "\n",
    "# 新たなstate_dictを構築したBERTモデルに与える\n",
    "net.load_state_dict(new_state_dict)\n"
   ]
  },
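  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "※ 参考：上記のセルでは学習済みパラメータを「出現順」で自モデルのパラメータ名に対応付けてコピーしています。もし名前の違いが`self`→`selfattn`の改名だけであれば、以下のようにキー名の文字列置換で対応付ける書き方も考えられます（あくまで一例のスケッチです。実際には`cls.seq_relationship`などほかの改名もあるため、そのままでは全パラメータには対応しません）。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 参考：キー名の文字列置換で対応付けるスケッチ（改名がself→selfattnだけという仮定）\n",
    "renamed_state_dict = {}\n",
    "for key_name, value in loaded_state_dict.items():\n",
    "    name = key_name.replace(\".self.\", \".selfattn.\")  # 改名部分のみ置換\n",
    "    renamed_state_dict[name] = value\n",
    "\n",
    "# strict=Falseで、対応付かなかったキーがあってもエラーにせずロードする\n",
    "# net.load_state_dict(renamed_state_dict, strict=False)\n"
   ]
  },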
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 事前学習課題Masked Language Modelを試す"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 40,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "['[CLS]', 'i', 'accessed', 'the', 'bank', 'account', '.', '[SEP]', 'we', 'play', 'soccer', 'at', 'the', 'bank', 'of', 'the', 'river', '.', '[SEP]']\n"
     ]
    }
   ],
   "source": [
    "# 入力する文章を用意\n",
    "text = \"[CLS] I accessed the bank account. [SEP] We play soccer at the bank of the river. [SEP]\"\n",
    "\n",
    "# 単語分割Tokenizerを用意\n",
    "tokenizer = BertTokenizer(\n",
    "    vocab_file=\"./vocab/bert-base-uncased-vocab.txt\", do_lower_case=True)\n",
    "\n",
    "# 文章を単語分割\n",
    "tokenized_text = tokenizer.tokenize(text)\n",
    "\n",
    "print(tokenized_text)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "['[CLS]', 'i', 'accessed', 'the', 'bank', 'account', '.', '[SEP]', 'we', 'play', 'soccer', 'at', 'the', '[MASK]', 'of', 'the', 'river', '.', '[SEP]']\n"
     ]
    }
   ],
   "source": [
    "# 単語をマスクする。今回は13単語目のbankをマスクして当てさせる\n",
    "masked_index = 13\n",
    "tokenized_text[masked_index] = '[MASK]'\n",
    "\n",
    "print(tokenized_text)  # 13単語目が[MASK]になっている\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 42,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[101, 1045, 11570, 1996, 2924, 4070, 1012, 102, 2057, 2377, 4715, 2012, 1996, 103, 1997, 1996, 2314, 1012, 102]\n"
     ]
    }
   ],
   "source": [
    "# 単語をIDに変換する\n",
    "indexed_tokens = tokenizer.convert_tokens_to_ids(tokenized_text)\n",
    "print(indexed_tokens)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 43,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]\n"
     ]
    }
   ],
   "source": [
    "# 1文目に0を2文目に1を入れた文章IDを用意\n",
    "\n",
    "\n",
    "def seq2id(indexed_tokens):\n",
    "    '''分かち書きされた単語ID列を文章IDに。[SEP]で分ける'''\n",
    "\n",
    "    segments_ids = []\n",
    "    seq_id = 0\n",
    "\n",
    "    for word_id in indexed_tokens:\n",
    "        segments_ids.append(seq_id)  # seq_id=o or 1を追加\n",
    "\n",
    "        # [SEP]を発見したら2文目になるので以降idを1に\n",
    "        if word_id == 102:  # IDの102が[SEP]である\n",
    "            seq_id = 1\n",
    "\n",
    "    return segments_ids\n",
    "\n",
    "\n",
    "# 実行\n",
    "segments_ids = seq2id(indexed_tokens)\n",
    "print(segments_ids)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 44,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "bank\n"
     ]
    }
   ],
   "source": [
    "# モデルで推論\n",
    "\n",
    "# リストをPyTorchのテンソルにしてモデルに入力\n",
    "tokens_tensor = torch.tensor([indexed_tokens])\n",
    "segments_tensors = torch.tensor([segments_ids])\n",
    "\n",
    "# 推論\n",
    "with torch.no_grad():\n",
    "    prediction_scores, seq_relationship_score = net(\n",
    "        tokens_tensor, segments_tensors)\n",
    "\n",
    "# 推論したIDを単語に戻す\n",
    "predicted_index = torch.argmax(prediction_scores[0, masked_index]).item()\n",
    "predicted_token = tokenizer.convert_ids_to_tokens([predicted_index])[0]\n",
    "print(predicted_token)\n"
   ]
  },
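  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "※ 参考：argmaxで最上位の1単語だけでなく、`torch.topk`を使えば予測スコア上位の候補単語をまとめて確認することもできます（以下は一例のスケッチです）。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 参考：マスク位置の予測スコア上位5件の単語を確認する例\n",
    "topk_scores, topk_indices = torch.topk(prediction_scores[0, masked_index], k=5)\n",
    "topk_tokens = tokenizer.convert_ids_to_tokens(topk_indices.tolist())\n",
    "print(topk_tokens)  # 先頭は上記で確認したbankになる\n"
   ]
  },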
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 事前学習課題Next Sentence Predictionを試す"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 47,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([[-1.5349,  3.1654]])\n",
      "tensor([[0.1773, 0.9595]])\n"
     ]
    }
   ],
   "source": [
    "print(seq_relationship_score)\n",
    "print(torch.sigmoid(seq_relationship_score))\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 上記の出力を見ると、クラス0：2つの文章が意味のあるれんぞくしたもの、クラス1：2つの文章は無関係\n",
    "# のうち、クラス1の無関係と判定。\n",
    "\n",
    "# 入力文章は\n",
    "# text = \"[CLS] I accessed the bank account. [SEP] We play soccer at the bank of the river. [SEP]\"\n",
    "# と無関係だったので、きちんとNext Sentence Predictionができていることが分かる。"
   ]
  },
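  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "※ 参考：2クラス分類のスコアは、sigmoidで各次元を独立に変換するよりも、softmaxで正規化すると「2クラスの合計が1」の確率として解釈できます（以下は一例のスケッチです）。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 参考：softmaxで2クラスの確率に正規化する例\n",
    "probs = torch.softmax(seq_relationship_score, dim=1)\n",
    "print(probs)  # クラス0とクラス1の確率（行の合計が1になる）\n"
   ]
  },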
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "以上"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.7"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
