levanter-gpt / special_tokens_map.json
commit ede4e43: "apparently one needs the tokenizer"
{
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
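
All three special-token roles point at the single GPT-2-style <|endoftext|> token, so the loaded tokenizer reports identical bos/eos/unk tokens. A minimal sketch of how this file is consumed, assuming the repo id "dlwh/levanter-gpt" on the Hugging Face Hub (inferred from the page header, not stated in the file itself):

# Hypothetical usage sketch; the repo id "dlwh/levanter-gpt" is an assumption.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dlwh/levanter-gpt")

# special_tokens_map.json maps all three roles to the same token.
print(tokenizer.bos_token)  # <|endoftext|>
print(tokenizer.eos_token)  # <|endoftext|>
print(tokenizer.unk_token)  # <|endoftext|>

Reusing <|endoftext|> for all three roles is the standard GPT-2 convention: the base tokenizer defines only that one special token, so it doubles as the sequence boundary and the unknown-token fallback.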