japanese-roberta-question-answering / special_tokens_map.json
Younes Belkada
add tokenizer
1d7b146
153 Bytes
{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
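For context, a `special_tokens_map.json` file pairs Transformers tokenizer attribute names (keys) with their literal token strings (values). A minimal sketch that parses the content shown above, with the JSON inlined verbatim for illustration:

```python
import json

# The special_tokens_map.json content from above, inlined so the
# example runs without downloading the repository.
special_tokens_map = json.loads(
    '{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", '
    '"sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", '
    '"mask_token": "[MASK]"}'
)

# Each key names a tokenizer attribute (e.g. tokenizer.cls_token);
# each value is the literal string the tokenizer uses for that role.
assert special_tokens_map["cls_token"] == "[CLS]"
assert special_tokens_map["mask_token"] == "[MASK]"
print(sorted(special_tokens_map))
```

Note the mixed conventions here: RoBERTa-style `<s>`/`</s>` for BOS/EOS alongside BERT-style `[CLS]`/`[SEP]`/`[MASK]`/`[PAD]`, which is common for Japanese RoBERTa variants that reuse a BERT-trained vocabulary.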