Deploying a locally downloaded model file with Ollama

#6 · opened by hubblebubblepig

I followed the Ollama documentation to deploy a model file that I had downloaded locally.
Running `ollama run` fails with: error loading model...
The following error messages appear in Ollama's server.log:
"llama_model_load: error loading model: error loading model vocabulary: cannot find tokenizer merges in model file"
"Failed to load dynamic library"
