The tokenizer is missing merges.txt

#13
by spespusliar - opened

When loading the tokenizer, the following error is raised:

envs/base2/lib/python3.9/site-packages/transformers/models/qwen2/tokenization_qwen2.py", line 179, in __init__
    with open(merges_file, encoding="utf-8") as merges_handle:

TypeError: expected str, bytes or os.PathLike object, not NoneType
tokenization_qwen2.py defines the dict

VOCAB_FILES_NAMES = {
    "vocab_file": "vocab.json",
    "merges_file": "merges.txt",
    "tokenizer_file": "tokenizer.json",
}

but there is no merges.txt in this repository.
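
For reference, a minimal sketch of the failure and of a workaround that was usable while merges.txt was absent. The checkpoint path below is a placeholder, not the actual repo id; use_fast=True is an assumption-based workaround that relies on tokenizer.json instead of the slow tokenizer's vocab.json/merges.txt pair.

from transformers import AutoTokenizer

# The slow Qwen2 tokenizer needs both vocab.json and merges.txt; with merges.txt
# missing, merges_file resolves to None and open(None, ...) raises the TypeError above.
# tokenizer = AutoTokenizer.from_pretrained("./qwen2-checkpoint", use_fast=False)

# The fast tokenizer is built from tokenizer.json alone, so it loads without merges.txt.
tokenizer = AutoTokenizer.from_pretrained("./qwen2-checkpoint", use_fast=True)
print(tokenizer("hello world")["input_ids"])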

Alibaba-NLP org

Updated now.

zyznull changed discussion status to closed
