"""
这里有个 emoji.json 是干嘛的?
"""
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("abeja/gpt-neox-japanese-2.7b")
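
# A minimal sketch exploring the question above, assuming the usual
# GPTNeoXJapaneseTokenizer behavior: emoji.json is an auxiliary vocab file
# that lets emoji round-trip through encode/decode. The sample text is
# purely illustrative.
text = "こんにちは 😀"
ids = tokenizer.encode(text)
print(ids)                    # token ids, including the emoji token(s)
print(tokenizer.decode(ids))  # should reproduce the emoji in the output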