Add special_tokens_map.json

#1
by seanmor5 - opened
No description provided.

Hey, we're planning to use this model in our Elixir library, but we need a special_tokens_map.json in order for the loaded fast tokenizer to be valid.
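
For context, special_tokens_map.json maps the tokenizer's named special tokens (BOS, EOS, UNK, PAD, etc.) to their string values. Below is a minimal sketch of the file's typical shape; the token strings are placeholders, not the actual values used by this model.

```json
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "pad_token": "<pad>"
}
```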

deepseek-admin changed pull request status to merged
