How can I solve this problem?

#16 opened by aldennX

Traceback (most recent call last):
  File "C:\Users\Administrator\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 232, in llamacpp_loader
    from modules.llamacpp_model import LlamaCppModel
  File "C:\Users\Administrator\text-generation-webui\modules\llamacpp_model.py", line 11, in <module>
    import llama_cpp
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 1292, in <module>
    llama_backend_init(c_bool(False))
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 403, in llama_backend_init
    return _lib.llama_backend_init(numa)
OSError: [WinError -1073741795] Windows Error 0xc000001d

Windows 10 - CPU
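
For context: WinError 0xc000001d is STATUS_ILLEGAL_INSTRUCTION. With prebuilt llama-cpp-python wheels this crash typically means the wheel was compiled with CPU instructions (usually AVX/AVX2) that the machine's processor does not support. A minimal diagnostic sketch, assuming the third-party py-cpuinfo package is installed (pip install py-cpuinfo):

# Diagnostic sketch: report whether the CPU advertises the AVX/AVX2
# instruction sets that prebuilt llama-cpp-python wheels commonly require.
import cpuinfo

flags = set(cpuinfo.get_cpu_info().get("flags", []))
for feature in ("avx", "avx2"):
    print(f"{feature}: {'supported' if feature in flags else 'MISSING'}")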

This is a GPTQ model and it looks like you're trying to load it with Loader = llama.cpp.

Please use the AutoGPTQ loader.
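
For reference, a rough sketch of loading a GPTQ checkpoint with the auto-gptq library directly, outside the web UI. It assumes a CUDA GPU (GPTQ inference is not practical on CPU) and uses a placeholder local folder name:

# Minimal sketch; assumes the auto-gptq and transformers packages and a CUDA GPU.
# "models/falcon-7b-instruct-GPTQ" is a placeholder for the local model folder.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_dir = "models/falcon-7b-instruct-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    device="cuda:0",
    use_safetensors=True,
    trust_remote_code=True,  # Falcon ships custom modelling code
)
inputs = tokenizer("What is the capital of France?", return_tensors="pt").to("cuda:0")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))

In the web UI itself, the equivalent is selecting AutoGPTQ in the Loader dropdown on the Model tab.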

Error using AutoGPTQ

ERROR:The model could not be loaded because its checkpoint file in .bin/.pt/.safetensors format could not be located.

ERROR:No model is loaded! Select one in the Model tab.
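
That error normally means the model folder holds only the config/tokenizer files and none of the quantized weights. A quick check, sketched below with a placeholder folder name:

# Sketch: list any checkpoint files the loader could pick up in a model folder.
# "models/falcon-7b-instruct-GPTQ" is a placeholder path.
from pathlib import Path

model_dir = Path("models/falcon-7b-instruct-GPTQ")
if not model_dir.is_dir():
    print(f"Model folder not found: {model_dir}")
else:
    checkpoints = [p.name for p in model_dir.iterdir()
                   if p.suffix in {".bin", ".pt", ".safetensors"}]
    print(checkpoints or "No .bin/.pt/.safetensors checkpoint found; re-download the model files.")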

models:
chinese-alpaca-2-7b.ggmlv3.q4_0.bin
Chinese-Llama-2-7b.ggmlv3.q4_0.bin

What do you mean by listing those chinese-alpaca GGML files? This is the Falcon 7B Instruct GPTQ model.

Please follow the instructions in the README for using this Falcon 7B Instruct GPTQ model.
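
If the folder under models/ is missing the .safetensors checkpoint, one way to fetch the complete repository is with huggingface_hub; a sketch is below (the repo_id and destination folder are assumptions based on the model discussed in this thread):

# Sketch; assumes the huggingface_hub package. The repo_id and local_dir are
# assumptions, not taken from this thread verbatim.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="TheBloke/falcon-7b-instruct-GPTQ",
    local_dir="text-generation-webui/models/falcon-7b-instruct-GPTQ",
)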

aldennX changed discussion status to closed
