framing_narrative / tokenizer_config.json
{
  "model_max_length": 512,
  "do_lower_case": false,
  "special_tokens_map_file": "/shared/2/projects/framing/models/finetune/roberta_cased_09-01-20/special_tokens_map.json",
  "full_tokenizer_file": null
}
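As a minimal sketch of how these fields are interpreted when a tokenizer loader (such as Hugging Face `transformers`) reads this file, the snippet below simply parses the JSON and checks each setting: inputs are capped at 512 tokens, text is not lowercased (RoBERTa uses a cased byte-level BPE vocabulary), the special-tokens map lives at an absolute path from the original training machine, and no merged single-file tokenizer is present. The inlined string is the file's exact content; nothing else here is from the source.

```python
import json

# The tokenizer_config.json content from above, inlined verbatim for illustration.
raw = (
    '{"model_max_length": 512, "do_lower_case": false, '
    '"special_tokens_map_file": "/shared/2/projects/framing/models/finetune/'
    'roberta_cased_09-01-20/special_tokens_map.json", '
    '"full_tokenizer_file": null}'
)

config = json.loads(raw)

# model_max_length: maximum sequence length (in tokens) the tokenizer will
# allow before truncation is required.
assert config["model_max_length"] == 512

# do_lower_case is false: text is tokenized case-sensitively.
assert config["do_lower_case"] is False

# special_tokens_map_file points at an absolute path on the machine where the
# model was fine-tuned; it will not resolve elsewhere, so loaders fall back to
# the special_tokens_map.json shipped alongside this config in the repo.
assert config["special_tokens_map_file"].endswith("special_tokens_map.json")

# full_tokenizer_file is null (None after parsing): no single merged
# "fast" tokenizer file is referenced by this config.
assert config["full_tokenizer_file"] is None
```

Note that because the `special_tokens_map_file` path is machine-specific, anyone reusing this checkpoint should rely on the tokenizer files distributed with the repository rather than this absolute path.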