framing_immigration_specific / tokenizer_config.json
commit 9caec0f from juliame
{
  "model_max_length": 512,
  "do_lower_case": false,
  "special_tokens_map_file": "/shared/2/projects/framing/models/finetune/roberta_cased_09-01-20/special_tokens_map.json",
  "full_tokenizer_file": null
}
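For reference, a minimal sketch of how these fields can be inspected with Python's standard `json` module. The interpretation comments reflect the usual meaning of these keys in Hugging Face tokenizer configs; the underlying tokenizer class is not named in this file.

```python
import json

# The tokenizer_config.json contents shown above.
config_text = (
    '{"model_max_length": 512, "do_lower_case": false, '
    '"special_tokens_map_file": "/shared/2/projects/framing/models/'
    'finetune/roberta_cased_09-01-20/special_tokens_map.json", '
    '"full_tokenizer_file": null}'
)

config = json.loads(config_text)

# model_max_length caps inputs at 512 tokens.
print(config["model_max_length"])          # 512

# do_lower_case is false, so casing is preserved (a cased model).
print(config["do_lower_case"])             # False

# full_tokenizer_file is null: no single serialized tokenizer file
# is bundled; the special-tokens map is referenced by path instead.
print(config["full_tokenizer_file"])       # None
print(config["special_tokens_map_file"])
```

When this repository is loaded with `transformers` (e.g. via `AutoTokenizer.from_pretrained`), the library reads this file to configure the tokenizer; that loading path is an assumption about typical usage, not something stated in the file itself.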