gpt2-igc-is / tokenizer_config.json
{"model_max_length": 512, "special_tokens_map_file": null, "tokenizer_class": "GPT2Tokenizer"}