geb-1.3b / tokenizer_config.json
{
  "name_or_path": "",
  "remove_space": false,
  "do_lower_case": false,
  "tokenizer_class": "GEBTokenizer",
  "auto_map": {
    "AutoTokenizer": [
      "tokenization_geb.GEBTokenizer",
      null
    ]
  }
}
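
The "auto_map" entry points AutoTokenizer at the custom GEBTokenizer class defined in the tokenization_geb.py file shipped alongside this config, so loading it with the transformers library requires trust_remote_code=True. A minimal sketch of how this config is typically consumed follows; the repo id used here is illustrative, and a local path to the directory containing this file works the same way.

# Minimal sketch: load the tokenizer described by this config.
# The repo id is an assumption; substitute the actual hub repo or a local path.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "GEB-AGI/geb-1.3b",      # illustrative repo id or local directory with this config
    trust_remote_code=True,  # needed so transformers imports tokenization_geb.GEBTokenizer
)

# Tokenize a sample string and inspect the resulting token ids.
ids = tokenizer("Hello, world!")["input_ids"]
print(ids)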