GPT2-SCP-ContainmentProcedures / special_tokens_map.json
{
  "bos_token": {
    "content": "[BOS]",
    "single_word": false,
    "lstrip": false,
    "rstrip": false,
    "normalized": true
  },
  "eos_token": {
    "content": "[EOS]",
    "single_word": false,
    "lstrip": false,
    "rstrip": false,
    "normalized": true
  },
  "unk_token": {
    "content": "<|endoftext|>",
    "single_word": false,
    "lstrip": false,
    "rstrip": false,
    "normalized": true
  },
  "pad_token": "[PAD]"
}
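Note that the map mixes two value shapes: `bos_token`, `eos_token`, and `unk_token` are AddedToken-style objects with a `"content"` key, while `pad_token` is a plain string. A minimal sketch of parsing such a map with the standard `json` module (the helper name `token_content` is hypothetical, not part of any library):

```python
import json

# The special_tokens_map.json from this repo, inlined verbatim.
SPECIAL_TOKENS_MAP = (
    '{"bos_token": {"content": "[BOS]", "single_word": false, "lstrip": false,'
    ' "rstrip": false, "normalized": true}, "eos_token": {"content": "[EOS]",'
    ' "single_word": false, "lstrip": false, "rstrip": false, "normalized": true},'
    ' "unk_token": {"content": "<|endoftext|>", "single_word": false,'
    ' "lstrip": false, "rstrip": false, "normalized": true}, "pad_token": "[PAD]"}'
)

def token_content(entry):
    # Entries may be AddedToken-style dicts with a "content" key,
    # or bare strings (as pad_token is here).
    return entry["content"] if isinstance(entry, dict) else entry

tokens = {name: token_content(value)
          for name, value in json.loads(SPECIAL_TOKENS_MAP).items()}
print(tokens)
# → {'bos_token': '[BOS]', 'eos_token': '[EOS]',
#    'unk_token': '<|endoftext|>', 'pad_token': '[PAD]'}
```

In practice the `transformers` library reads this file for you when a tokenizer is loaded from the repo; the sketch only shows what the file itself encodes.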