ctransformers: OSError No such file or directory issue

#4 opened by lazyDataScientist

After running the code below I get the following error. Note: I am running this in a Colab environment.

from ctransformers import AutoModelForCausalLM

# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = AutoModelForCausalLM.from_pretrained("TheBloke/Mistral-7B-Instruct-v0.1-GGUF", model_file="mistral-7b-instruct-v0.1.Q2_K.gguf", model_type="mistral", gpu_layers=50)
---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-7-a38c39cda463> in <cell line: 4>()
      2 
      3 # Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
----> 4 llm = AutoModelForCausalLM.from_pretrained("TheBloke/Mistral-7B-Instruct-v0.1-GGUF", model_file="mistral-7b-instruct-v0.1.Q2_K.gguf", model_type="mistral", gpu_layers=50)

3 frames
/usr/local/lib/python3.10/dist-packages/ctransformers/hub.py in from_pretrained(cls, model_path_or_repo_id, model_type, model_file, config, lib, local_files_only, revision, hf, **kwargs)
    173             )
    174 
--> 175         llm = LLM(
    176             model_path=model_path,
    177             model_type=model_type,

/usr/local/lib/python3.10/dist-packages/ctransformers/llm.py in __init__(self, model_path, model_type, config, lib)
    244             model_type = "gguf"
    245 
--> 246         self._lib = load_library(lib, gpu=config.gpu_layers > 0)
    247         self._llm = self._lib.ctransformers_llm_create(
    248             model_path.encode(),

/usr/local/lib/python3.10/dist-packages/ctransformers/llm.py in load_library(path, gpu)
    124     if "cuda" in path:
    125         load_cuda()
--> 126     lib = CDLL(path)
    127 
    128     lib.ctransformers_llm_create.argtypes = [

/usr/lib/python3.10/ctypes/__init__.py in __init__(self, name, mode, handle, use_errno, use_last_error, winmode)
    372 
    373         if handle is None:
--> 374             self._handle = _dlopen(self._name, mode)
    375         else:
    376             self._handle = handle

OSError: libcudart.so.12: cannot open shared object file: No such file or directory
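
libcudart.so.12 is the CUDA 12 runtime library, so the OSError means the process cannot find the CUDA runtime when it tries to dlopen the GPU build of the library. A minimal sketch of two possible fixes in a Colab cell follows; the install command is an assumption on my part (it is the documented CUDA extra of ctransformers), not something taken from the traceback:

# Option 1 (sketch): install ctransformers with its CUDA extra, which should pull
# in the NVIDIA runtime wheels that provide libcudart.so.12.
!pip install ctransformers[cuda]

# Option 2: avoid the CUDA library entirely and run on CPU.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_file="mistral-7b-instruct-v0.1.Q2_K.gguf",
    model_type="mistral",
    gpu_layers=0,  # gpu_layers=0 means load_library never opens the CUDA runtime
)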

I'm getting a different error while running ctransformers on my laptop: "Model type 'mistral' is not supported."

I've used Falcon 7B and WizardLM 7B in the past with exactly the same setup, so I'm not sure whether ctransformers is actually compatible with this model or not :s
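
"Model type 'mistral' is not supported" usually means the installed ctransformers predates Mistral support. The cleaner fix is to upgrade the package, but since the Mistral GGUF files follow the llama architecture, a workaround sketch (assuming your ctransformers build can already read GGUF files) is to load the model as "llama":

from ctransformers import AutoModelForCausalLM

# Workaround sketch: Mistral GGUF checkpoints share the llama architecture, so a
# ctransformers build that rejects model_type="mistral" may still load the file
# when told it is a llama model.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_file="mistral-7b-instruct-v0.1.Q2_K.gguf",
    model_type="llama",
    gpu_layers=0,
)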

I have a similar issue.

RuntimeError: Failed to create LLM 'mistral' from 'Models\mistral-7b-v0.1.Q4_K_M.gguf'

After updating ctransformers, the issue was solved on my side (apologies for the delayed reply).
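
For reference, the update was presumably something along these lines (the exact command is an assumption; run it in a Colab cell, or drop the leading "!" in a terminal):

# Assumed upgrade command: pull the latest ctransformers, which includes Mistral/GGUF support.
!pip install --upgrade ctransformers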
