gpt-2-fast-serbian-tokenizer / special_tokens_map.json

Commit History

Upload tokenizer
4957253

datatab committed on