german_gpt_small / tokenizer_config.json
model uploaded by nikhilnagaraj
138f32a
{
  "pad_token": "<|endoftext|>",
  "special_tokens_map_file": null,
  "full_tokenizer_file": null
}
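This config sets the padding token to the GPT-style end-of-text token and leaves both auxiliary file pointers unset. A minimal sketch of parsing the config and inspecting those fields (the JSON literal below is copied from the file; the variable names are illustrative):

```python
import json

# The contents of tokenizer_config.json, inlined for a self-contained example.
raw = '{"pad_token": "<|endoftext|>", "special_tokens_map_file": null, "full_tokenizer_file": null}'
config = json.loads(raw)

# GPT-style tokenizers commonly reuse the end-of-text token for padding,
# since the pretraining vocabulary defines no dedicated pad token.
pad_token = config["pad_token"]
print(pad_token)  # <|endoftext|>

# Both file references are null: the special-tokens map and the combined
# tokenizer file are not provided separately in this repository.
print(config["special_tokens_map_file"] is None)  # True
print(config["full_tokenizer_file"] is None)      # True
```

When loading this tokenizer through a library such as `transformers`, the `pad_token` value here is what determines which token id is used to pad batches.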