german-gpt2-faust / tokenizer_config.json
vocab: add vocab and tokenizer configuration for fine-tuned model (Faust I and II German GPT-2)
5611a7d
62 Bytes
{"special_tokens_map_file": null, "full_tokenizer_file": null}
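Both fields in this config are null, which means the tokenizer loader falls back to the defaults of the tokenizer class (here, a GPT-2-style BPE tokenizer): no separate special-tokens map file and no bundled fast-tokenizer file. A minimal sketch of how such a config parses (the `config_text` literal below is copied from the file contents above; everything else is illustrative):

```python
import json

# Exact contents of tokenizer_config.json as committed (62 bytes):
config_text = '{"special_tokens_map_file": null, "full_tokenizer_file": null}'

config = json.loads(config_text)

# Both entries deserialize to None, so the tokenizer relies on
# class-level defaults rather than extra files from the repo.
print(config["special_tokens_map_file"])  # None
print(config["full_tokenizer_file"])      # None
print(len(config_text.encode("utf-8")))   # 62
```

In practice this file would be consumed indirectly, e.g. via `AutoTokenizer.from_pretrained(...)` pointed at the model repository, rather than parsed by hand.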