wav2vec2-large-xls-r-300m-turkish-colab / special_tokens_map.json
masapasa · add tokenizer (commit 7709397) · 309 Bytes
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "[UNK]",
  "pad_token": "[PAD]",
  "additional_special_tokens": [
    {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true},
    {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}
  ]
}
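As a minimal sketch of how this file is consumed: a tokenizer reads the map to learn which strings mark sequence boundaries, unknown characters, and padding. The example below parses the JSON shown above with the standard library only (no model download); the variable names are illustrative, not part of the Transformers API.

```python
import json

# Verbatim content of special_tokens_map.json from this repo.
SPECIAL_TOKENS_MAP = (
    '{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "[UNK]", '
    '"pad_token": "[PAD]", "additional_special_tokens": ['
    '{"content": "<s>", "single_word": false, "lstrip": false, '
    '"rstrip": false, "normalized": true}, '
    '{"content": "</s>", "single_word": false, "lstrip": false, '
    '"rstrip": false, "normalized": true}]}'
)

tokens = json.loads(SPECIAL_TOKENS_MAP)

# Plain string entries: one token each for BOS, EOS, UNK, and PAD.
print(tokens["bos_token"])   # <s>
print(tokens["pad_token"])   # [PAD]

# Entries under "additional_special_tokens" are serialized AddedToken
# objects: the string lives in "content", and the remaining keys control
# whitespace stripping and normalization around the token.
extras = [t["content"] for t in tokens["additional_special_tokens"]]
print(extras)  # ['<s>', '</s>']
```

In practice, `Wav2Vec2CTCTokenizer.from_pretrained("masapasa/wav2vec2-large-xls-r-300m-turkish-colab")` would load this file automatically alongside `vocab.json`; the `[PAD]` token doubles as the CTC blank symbol in Wav2Vec2 fine-tuning setups.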