from transformers import AutoTokenizer

# Load the DeepSeek LLM 7B base tokenizer from the Hugging Face Hub.
# trust_remote_code=True allows any custom tokenizer code shipped with the repo to run.
tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-llm-7b-base", trust_remote_code=True)
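
# A minimal usage sketch (not part of the original file): encode and decode a
# sample string to inspect the tokenizer's output. The sample text below is an
# assumption chosen only for illustration.
sample_text = "Hello, world!"
token_ids = tokenizer.encode(sample_text)
print(token_ids)                                    # list of token IDs
print(tokenizer.convert_ids_to_tokens(token_ids))   # the individual token strings
print(tokenizer.decode(token_ids))                  # round-trips back to readable text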