gpt2-small-dutch-finetune-oscar / tokenizer_config.json
Thomas Dehaene · Initial commit · 937ba9e
{"pad_token": "<|endoftext|>", "special_tokens_map_file": null, "full_tokenizer_file": null}