summary_loop10 / special_tokens_map.json
{"bos_token": "ĠST", "eos_token": "END", "unk_token": "<|endoftext|>", "pad_token": "!"}