pretrained-model-2 / tokenizer_config.json
{"special_tokens_map_file": null, "full_tokenizer_file": null}