Why is my model outputting gibberish?
#7 opened about 9 hours ago by Handshow
When loading with llama.cpp, it reports `llm_load_vocab: missing pre-tokenizer type, using: 'default'` — does this have any effect?
#6 opened 14 days ago by maqy1995
Is the context length only 512?
3
#3 opened 17 days ago by YUCYU
Why does this model, after importing it into Ollama and running it with Ollama, just produce rambling nonsense?
9
#2 opened 27 days ago by Kollcn