nlx-gpt / VQAX_p /nle_gpt2_tokenizer_0 /special_tokens_map.json
{
  "bos_token": {"content": "<|endoftext|>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true},
  "eos_token": {"content": "<|endoftext|>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true},
  "unk_token": {"content": "<|endoftext|>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true},
  "pad_token": "<pad>",
  "additional_special_tokens": ["<question>", "<answer>", "<explanation>"]
}
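To illustrate the structure of this map, here is a minimal stdlib-only sketch that parses the file content above and reads out the tokens. It does not depend on the `transformers` library; it simply treats the file as JSON.

```python
import json

# Verbatim content of special_tokens_map.json (from the file above).
raw = (
    '{"bos_token": {"content": "<|endoftext|>", "single_word": false, '
    '"lstrip": false, "rstrip": false, "normalized": true}, '
    '"eos_token": {"content": "<|endoftext|>", "single_word": false, '
    '"lstrip": false, "rstrip": false, "normalized": true}, '
    '"unk_token": {"content": "<|endoftext|>", "single_word": false, '
    '"lstrip": false, "rstrip": false, "normalized": true}, '
    '"pad_token": "<pad>", '
    '"additional_special_tokens": ["<question>", "<answer>", "<explanation>"]}'
)

spec = json.loads(raw)

# bos/eos/unk all reuse GPT-2's single <|endoftext|> token; the dict form
# carries per-token options (lstrip/rstrip/normalized) used by the tokenizer.
print(spec["bos_token"]["content"])         # <|endoftext|>
print(spec["pad_token"])                    # <pad>
print(spec["additional_special_tokens"])    # ['<question>', '<answer>', '<explanation>']
```

Note that `<pad>`, `<question>`, `<answer>`, and `<explanation>` are not in the base GPT-2 vocabulary, so a tokenizer loading this map must have had these tokens added (growing the vocabulary accordingly) before use.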