I am using a GPU and the following error message appears. How can I resolve it?

#15
by aldennX - opened

bin C:\Users\Administrator\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so
C:\Users\Administrator\Desktop\oobabooga_windows\installer_files\env\lib\site-packages\bitsandbytes\cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
warn("The installed version of bitsandbytes was compiled without GPU support. "
function 'cadam32bit_grad_fp32' not found
2023-08-05 20:46:42 INFO:Loading the extension "gallery"...
Running on local URL: http://127.0.0.1:7860

To create a public link, set share=True in launch().

There are no errors there; that's all fine. You can ignore the bitsandbytes messages since you're not using it.

Now there is a new problem, which I have since solved.

windows10-CPU

Traceback (most recent call last):
  File "C:\Users\Administrator\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 232, in llamacpp_loader
    from modules.llamacpp_model import LlamaCppModel
  File "C:\Users\Administrator\text-generation-webui\modules\llamacpp_model.py", line 11, in <module>
    import llama_cpp
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 1292, in <module>
    llama_backend_init(c_bool(False))
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 403, in llama_backend_init
    return _lib.llama_backend_init(numa)
OSError: [WinError -1073741795] Windows Error 0xc000001d
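For anyone hitting this later: WinError -1073741795 is the signed form of the NTSTATUS code 0xC000001D (STATUS_ILLEGAL_INSTRUCTION), which usually means the prebuilt llama-cpp-python wheel was compiled with CPU instructions (e.g. AVX2) that this processor does not support. A minimal sketch of decoding the error value, using only the standard library:

```python
# Decode the signed WinError value from the traceback into its NTSTATUS code.
err = -1073741795
ntstatus = err & 0xFFFFFFFF  # reinterpret as an unsigned 32-bit value
print(hex(ntstatus))  # 0xc000001d -> STATUS_ILLEGAL_INSTRUCTION
```

A commonly suggested remedy is to reinstall llama-cpp-python built against the local CPU, e.g. setting `CMAKE_ARGS="-DLLAMA_AVX2=off"` before `pip install llama-cpp-python --force-reinstall --no-cache-dir`; the exact build-flag name varies between llama.cpp versions, so treat it as an assumption to verify against the package's install docs.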

aldennX changed discussion status to closed
