free-small-1epoch / tokenizer_config.json
Upload tokenizer (commit a073fc1) · 218 Bytes
{
  "name_or_path": "free-smallgpt-1epoch",
  "special_tokens": [
    "<unk>",
    "|endoftext|"
  ],
  "special_tokens_map_file": "gpt2_small/special_tokens_map.json",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
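
For context, a minimal sketch of how a config like this is normally consumed: loading the tokenizer through the Hugging Face transformers library. The repo id "Heatz/free-small-1epoch" below is an assumption inferred from the page path and uploader, not something stated in the file itself.

from transformers import AutoTokenizer

# Hypothetical repo id, inferred from the page path "free-small-1epoch";
# adjust to the actual Hub location of this model.
tokenizer = AutoTokenizer.from_pretrained("Heatz/free-small-1epoch")

# "tokenizer_class" is "PreTrainedTokenizerFast", so a tokenizer.json file is
# expected alongside this config; the declared special tokens should be
# registered after loading.
print(tokenizer.all_special_tokens)        # e.g. ["<unk>", "|endoftext|"]
print(tokenizer("Hello world").input_ids)  # token ids for a sample string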