oobabooga-windows error! Using the llama.cpp loader

#11
by aldennX - opened

ERROR: Failed to load the model.
Traceback (most recent call last):
  File "C:\Users\Administrator\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 232, in llamacpp_loader
    from modules.llamacpp_model import LlamaCppModel
  File "C:\Users\Administrator\text-generation-webui\modules\llamacpp_model.py", line 11, in <module>
    import llama_cpp
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 1292, in <module>
    llama_backend_init(c_bool(False))
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 403, in llama_backend_init
    return _lib.llama_backend_init(numa)
OSError: [WinError -1073741795] Windows Error 0xc000001d
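For context on the error code: `WinError -1073741795` is the signed 32-bit form of the NTSTATUS code `0xC000001D` (STATUS_ILLEGAL_INSTRUCTION), which typically means the prebuilt `llama_cpp` native library was compiled with CPU instructions (e.g. AVX/AVX2) that this machine's processor does not support. A minimal sketch of the code conversion, assuming only the values shown in the traceback:

```python
# The traceback reports WinError -1073741795. Reinterpreting that signed
# 32-bit value as unsigned recovers the NTSTATUS code 0xC000001D, which
# Windows documents as STATUS_ILLEGAL_INSTRUCTION.
winerror = -1073741795
ntstatus = winerror & 0xFFFFFFFF  # two's-complement -> unsigned 32-bit

print(hex(ntstatus))  # 0xc000001d
```

If that diagnosis is right, a common workaround (not confirmed in this thread) is rebuilding llama-cpp-python from source on the affected machine so it targets only the instructions the local CPU actually has.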
