rand_model / special_tokens_map.json
{
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": "[SEP]",
  "unk_token": "[UNK]"
}
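This file maps the tokenizer's special-token roles (classification, masking, padding, separation, unknown) to their string forms; the expanded pad_token entry uses the AddedToken fields (lstrip, rstrip, normalized, single_word) to control how the token is matched during tokenization. As a minimal sketch of how the map is consumed, the snippet below loads the tokenizer with Hugging Face transformers and reads the special tokens back; the path "rand_model" is assumed to be a local directory or Hub repo id containing this file alongside the rest of the tokenizer files.

from transformers import AutoTokenizer

# Assumed path: a directory or Hub repo named "rand_model" that
# contains special_tokens_map.json and the other tokenizer files.
tokenizer = AutoTokenizer.from_pretrained("rand_model")

# Entries from special_tokens_map.json surface as tokenizer attributes.
print(tokenizer.cls_token)   # "[CLS]"
print(tokenizer.mask_token)  # "[MASK]"
print(tokenizer.pad_token)   # "[PAD]"
print(tokenizer.sep_token)   # "[SEP]"
print(tokenizer.unk_token)   # "[UNK]"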