gpt-m-large / special_tokens_map.json
{
  "bos_token": "<|startoftext|>",
  "eos_token": "<|endoftext|>",
  "pad_token": "<|pad|>",
  "unk_token": "<|endoftext|>"
}
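
This file maps each special-token role (beginning-of-sequence, end-of-sequence, padding, unknown) to a literal token string; note that the unknown token reuses the end-of-text token, as is common in GPT-2-style tokenizers. A minimal sketch of loading the tokenizer and inspecting these values follows; the repo id "augustocsc/gpt-m-large" is assumed from the page header, so substitute the actual id if it differs.

# Minimal sketch: load the tokenizer and read the special tokens
# defined in special_tokens_map.json. The repo id below is an
# assumption based on the page header.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("augustocsc/gpt-m-large")

# These attributes are populated from special_tokens_map.json.
print(tokenizer.bos_token)  # <|startoftext|>
print(tokenizer.eos_token)  # <|endoftext|>
print(tokenizer.pad_token)  # <|pad|>
print(tokenizer.unk_token)  # <|endoftext|> (shares the EOS string)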