vicuna-7b / tokenizer_config.json
{
"bos_token": "",
"eos_token": "",
"model_max_length": 2048,
"padding_side": "right",
"special_tokens_map_file": "models/decapoda-research_llama-7b-hf/special_tokens_map.json",
"tokenizer_class": "LlamaTokenizer",
"unk_token": ""
}
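
A minimal usage sketch, not part of the original file: it shows how the fields above surface when the tokenizer is loaded with Hugging Face transformers. The repo id "AlekseyKorshuk/vicuna-7b" is an assumption based on the page title; substitute your own model id or local path.

# Assumes the `transformers` and `sentencepiece` packages are installed.
from transformers import LlamaTokenizer

# "tokenizer_class": "LlamaTokenizer" tells AutoTokenizer which class to use;
# here the class is loaded directly. The repo id below is a placeholder.
tokenizer = LlamaTokenizer.from_pretrained("AlekseyKorshuk/vicuna-7b")

print(tokenizer.model_max_length)  # 2048, from "model_max_length"
print(tokenizer.padding_side)      # "right", from "padding_side"
print(repr(tokenizer.bos_token))   # value of "bos_token" (an empty string in this config)
print(repr(tokenizer.eos_token))   # value of "eos_token" (an empty string in this config)
print(repr(tokenizer.unk_token))   # value of "unk_token" (an empty string in this config)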