ModuleNotFoundError: No module named 'llama_inference_offload'
#6 opened by KongfuAi
How can I resolve the "ModuleNotFoundError: No module named 'llama_inference_offload'" error?
You need to install GPTQ-for-LLaMa inside text-generation-webui.
There are basic instructions in my README, and you can see more here: https://github.com/oobabooga/text-generation-webui/wiki/GPTQ-models-(4-bit-mode)
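If you want to confirm the module is visible before launching the webui, here is a minimal Python sketch. It assumes the `repositories/GPTQ-for-LLaMa` layout described in the wiki page linked above; the check itself is only an illustration, not part of the webui.

```python
# Sketch: check whether llama_inference_offload can be found, assuming
# GPTQ-for-LLaMa was cloned into text-generation-webui/repositories/.
import importlib.util
import sys
from pathlib import Path

repo_dir = Path("repositories/GPTQ-for-LLaMa")  # assumed clone location
if repo_dir.is_dir():
    # Make the cloned repo importable for this check.
    sys.path.insert(0, str(repo_dir))

if importlib.util.find_spec("llama_inference_offload") is None:
    print(f"llama_inference_offload not found - clone GPTQ-for-LLaMa into {repo_dir}")
else:
    print("llama_inference_offload is importable; GPTQ-for-LLaMa looks installed")
```

Run it from the text-generation-webui root; if it reports the module as missing, follow the install steps in the wiki page above.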