Nicola De Cao
fixing tokenizer
b11fda7
{
  "model_max_length": 512,
  "unk_token": "[UNK]",
  "cls_token": "[CLS]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "mask_token": "[MASK]",
  "model_input_names": ["input_ids", "attention_mask"],
  "tokenizer_class": "PreTrainedTokenizerFast"
}
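This is a `tokenizer_config.json` in the format used by Hugging Face `transformers`: it declares the special tokens, the maximum sequence length, the model input fields, and the tokenizer class to instantiate. A minimal sketch of inspecting these fields with Python's standard library (the JSON is embedded inline here so the snippet is self-contained; in practice the file would be loaded from a model directory):

```python
import json

# The tokenizer configuration above, embedded for a self-contained example.
config_text = (
    '{"model_max_length": 512, "unk_token": "[UNK]", "cls_token": "[CLS]", '
    '"sep_token": "[SEP]", "pad_token": "[PAD]", "mask_token": "[MASK]", '
    '"model_input_names": ["input_ids", "attention_mask"], '
    '"tokenizer_class": "PreTrainedTokenizerFast"}'
)
config = json.loads(config_text)

# model_max_length caps the sequence length the tokenizer will produce.
print(config["model_max_length"])

# Special tokens follow the BERT/WordPiece convention ([CLS], [SEP], ...).
print(config["cls_token"], config["sep_token"], config["pad_token"])

# model_input_names lists the tensors the downstream model expects.
print(config["model_input_names"])
```

When this file sits in a model repository alongside the tokenizer vocabulary files, `transformers` reads it to configure the `PreTrainedTokenizerFast` instance returned by `AutoTokenizer.from_pretrained`.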