gpt2-finnish / tokenizer_config.json
Commit 12280fc: Saving weights and logs of step 10000
{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "add_prefix_space": false, "special_tokens_map_file": null, "name_or_path": "./", "tokenizer_class": "GPT2Tokenizer"}