gpt2-wikitext2 / special_tokens_map.json
add tokenizer
28f2a2e
90 Bytes
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>"}
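This file maps the three special-token roles GPT-2 recognizes (beginning-of-sequence, end-of-sequence, unknown) to token strings; notably, GPT-2 reuses the single `<|endoftext|>` token for all three. A minimal sketch of how such a map can be inspected, with the 90-byte file inlined as a string for illustration (no network or local files assumed):

```python
import json

# The special_tokens_map.json content shown above, inlined for illustration
special_tokens_map = json.loads(
    '{"bos_token": "<|endoftext|>", '
    '"eos_token": "<|endoftext|>", '
    '"unk_token": "<|endoftext|>"}'
)

# All three roles are present...
roles = sorted(special_tokens_map)
print(roles)  # ['bos_token', 'eos_token', 'unk_token']

# ...but GPT-2 maps them all to one and the same token string
unique_tokens = set(special_tokens_map.values())
print(unique_tokens)  # {'<|endoftext|>'}
```

When this file sits in a model directory, `AutoTokenizer.from_pretrained(...)` from the `transformers` library reads it automatically and exposes the values as `tokenizer.bos_token`, `tokenizer.eos_token`, and `tokenizer.unk_token`.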