hebrew-gpt_neo-small / tokenizer_config.json
Hebrew gpt_neo small model ckpt-80500 commit from $USER
d6e6407
{
  "do_lower_case": false,
  "max_len": 1024,
  "bos_token": "<|startoftext|>",
  "eos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>",
  "special_tokens_map_file": "special_tokens_map.json",
  "full_tokenizer_file": null
}
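As a quick sanity check, the config above can be parsed with Python's standard json module. This is a minimal sketch using only the field values present in the file; loading the actual tokenizer (e.g. via the transformers library) is assumed to happen separately and is not shown here:

```python
import json

# The tokenizer_config.json contents from this file, inlined verbatim.
raw = ('{"do_lower_case": false, "max_len": 1024, '
       '"bos_token": "<|startoftext|>", "eos_token": "<|endoftext|>", '
       '"unk_token": "<|endoftext|>", '
       '"special_tokens_map_file": "special_tokens_map.json", '
       '"full_tokenizer_file": null}')

config = json.loads(raw)

# The tokenizer is case-sensitive and caps sequences at 1024 tokens;
# note that the unknown-token marker is the same string as the EOS marker.
assert config["do_lower_case"] is False
assert config["max_len"] == 1024
assert config["unk_token"] == config["eos_token"] == "<|endoftext|>"
print(config["bos_token"])  # → <|startoftext|>
```

Pointing `unk_token` at `<|endoftext|>` is a common choice for GPT-style byte-level tokenizers, which rarely produce true unknowns.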