claude-tokenizer / tokenizer_config.json
{
  "add_prefix_space": false,
  "bos_token": "<EOT>",
  "clean_up_tokenization_spaces": true,
  "eos_token": "<EOT>",
  "model_max_length": 200000,
  "tokenizer_class": "GPT2TokenizerFast",
  "unk_token": "<EOT>"
}
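This config declares a byte-level BPE tokenizer (`GPT2TokenizerFast`) whose BOS, EOS, and unknown tokens all collapse to a single `<EOT>` marker, with a 200,000-token model length. A minimal sketch of reading the fields, using only the Python standard library (the inlined string simply mirrors the file above; in practice `transformers.AutoTokenizer.from_pretrained` would consume this file alongside the vocabulary and merges files):

```python
import json

# tokenizer_config.json body, inlined here for illustration.
CONFIG = """
{
  "add_prefix_space": false,
  "bos_token": "<EOT>",
  "clean_up_tokenization_spaces": true,
  "eos_token": "<EOT>",
  "model_max_length": 200000,
  "tokenizer_class": "GPT2TokenizerFast",
  "unk_token": "<EOT>"
}
"""

cfg = json.loads(CONFIG)

# BOS, EOS, and UNK all map to the same <EOT> token.
special_tokens = {cfg["bos_token"], cfg["eos_token"], cfg["unk_token"]}
print(special_tokens)            # {'<EOT>'}
print(cfg["model_max_length"])   # 200000
print(cfg["tokenizer_class"])    # GPT2TokenizerFast
```

Note that `add_prefix_space: false` means a leading space is not prepended to the first word, so the first token of a sequence is encoded differently from the same word appearing mid-sentence.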