banglabert_large / tokenizer_config.json
{"do_lower_case": false, "tokenize_chinese_chars": false, "special_tokens_map_file": null, "full_tokenizer_file": null}