config.json
Do you have a unique config.json file for this? I have tried ones from other models but none work.
This is a GGML model so no config.json is required. What are you trying to do?
If you want to do GPU inference, then check out my 4bit GPTQ repo for this model: https://huggingface.co/TheBloke/alpaca-lora-65B-GPTQ-4bit
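In case it helps anyone else landing here: the GGML .bin file is self-contained, so a llama.cpp-based loader can be pointed straight at it with no config.json. Below is a minimal CPU-inference sketch using llama-cpp-python, assuming a GGML-era build of that library (current releases expect GGUF files instead); the model filename and prompt are only examples, not the exact files in this repo.

```python
# Minimal CPU inference against a GGML .bin with llama-cpp-python.
# Requires a GGML-era version of llama-cpp-python (newer releases load GGUF only).
from llama_cpp import Llama

llm = Llama(
    model_path="./alpaca-lora-65B.ggml.q4_0.bin",  # example filename; use the .bin you downloaded
    n_ctx=2048,    # context window
    n_threads=8,   # number of CPU threads
)

output = llm(
    "### Instruction:\nWhat is the capital of France?\n\n### Response:\n",
    max_tokens=64,
    stop=["###"],
)
print(output["choices"][0]["text"])
```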
I am attempting to run a CPU instance of it. I have had success with the model from Pi (Pi3141/alpaca-lora-30B-ggml), which includes a params.json file. I attempted to edit that for your model, but it isn't working. I'll be honest, my understanding of this is limited! I'm still learning about all the libraries and requirements to get it working, so I'm using oobabooga, which may be causing the issue with your model.
Never mind, it's working now: I just needed to move the file out of its subdirectory!
OK, glad it's working now!