Transformers
English
Inference Endpoints
babylm_tokenizer_32k / tokenizer_config.json
add tokenizer (eda0844)
{"errors": "replace", "bos_token": "<s>", "eos_token": "</s>", "sep_token": "</s>", "cls_token": "<s>", "unk_token": "<unk>", "pad_token": "<pad>", "mask_token": {"content": "<mask>", "single_word": false, "lstrip": true, "rstrip": false, "normalized": false, "__type": "AddedToken"}, "add_prefix_space": false, "trim_offsets": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "babylm_tokenizer_32k", "tokenizer_class": "RobertaTokenizer"}