How can I load this model locally?

#2
by seanjk - opened

I have already fine-tuned unsloth/gemma-7b-bnb-4bit, and I want to load this base model to compare it against my fine-tuned version. However, whatever method I use, even the Inference API on the model card, I only get the error: "No package metadata was found for bitsandbytes".
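That error usually means the `bitsandbytes` package is simply not installed in the local environment, since this checkpoint is stored in bnb 4-bit format. A minimal sketch of a pre-flight check (the helper name `has_package` is my own; the commented-out loading call follows the standard `transformers` API and is not from the original post):

```python
# Check whether bitsandbytes is installed before loading a bnb-4bit model.
# "No package metadata was found for bitsandbytes" is raised when it isn't.
import importlib.metadata


def has_package(name: str) -> bool:
    """Return True if package metadata for `name` can be found."""
    try:
        importlib.metadata.version(name)
        return True
    except importlib.metadata.PackageNotFoundError:
        return False


if not has_package("bitsandbytes"):
    print("Missing dependency; run: pip install -U bitsandbytes accelerate")

# With bitsandbytes installed, the model should load locally via transformers:
# from transformers import AutoModelForCausalLM, AutoTokenizer
# model_id = "unsloth/gemma-7b-bnb-4bit"
# tokenizer = AutoTokenizer.from_pretrained(model_id)
# model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```

Note that `bitsandbytes` requires a CUDA-capable GPU, which is also why the hosted Inference API widget may refuse to serve a 4-bit quantized checkpoint.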
