this model/quant is generating bad output

#1
by vasilee - opened

I loaded the cookbook in Google Colab, ran the first 2 cells (installation and model loading), skipped ahead to the inference cell, and got this:

[image attachment: screenshot of the bad output]

Update: the problem is not with this model (Phi-3.5-mini-instruct-bnb-4bit); it is with the other one (Phi-3.5-mini-instruct, without the bnb-4bit suffix). After updating the notebook to load Phi-3.5-mini-instruct-bnb-4bit instead, it worked fine.
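For anyone hitting the same thing, the fix amounts to pointing the notebook's load cell at the pre-quantized checkpoint name. A minimal sketch of that one-line change (the helper function and variable names here are my own, not from the cookbook):

```python
def ensure_4bit_name(name: str) -> str:
    """Append the -bnb-4bit suffix if it's missing, so the
    pre-quantized 4-bit checkpoint is loaded instead of the
    full-precision repo of the same name."""
    suffix = "-bnb-4bit"
    return name if name.endswith(suffix) else name + suffix

# The notebook originally pointed at the full-precision model:
model_name = ensure_4bit_name("Phi-3.5-mini-instruct")
print(model_name)  # Phi-3.5-mini-instruct-bnb-4bit
```

The resulting name is what you'd then pass to the notebook's model-loading call in place of the original string.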

[image attachment: screenshot of the working output]
