After selecting the model, the following error is raised:
Traceback (most recent call last):
  File "C:\Users\Administrator\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 232, in llamacpp_loader
    from modules.llamacpp_model import LlamaCppModel
  File "C:\Users\Administrator\text-generation-webui\modules\llamacpp_model.py", line 11, in <module>
    import llama_cpp
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 1292, in <module>
    llama_backend_init(c_bool(False))
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 403, in llama_backend_init
    return _lib.llama_backend_init(numa)
OSError: [WinError -1073741795] Windows Error 0xc000001d
The GGML format is no longer supported by llama_cpp; you need a GGUF model instead.
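As a quick sanity check after switching to a GGUF model, here is a minimal sketch (assuming llama-cpp-python is installed in the textgen environment; the model path below is a placeholder, not a file from this report):

from llama_cpp import Llama

# Load a GGUF-format model; recent llama_cpp builds reject the old GGML files.
llm = Llama(model_path="models/example.Q4_K_M.gguf")  # placeholder path, adjust to your model

# Run a short completion to confirm the backend initializes and the model loads.
out = llm("Hello", max_tokens=16)
print(out["choices"][0]["text"])

Older GGML files can usually be converted with the convert-llama-ggml-to-gguf.py script that ships with llama.cpp, or you can download a ready-made GGUF quantization of the same model.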