compatible-gpt2/special_tokens_map.json
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>"}