RuntimeError with Vicuna-7B, using with oobabooga text-generation-webui

#1
by syddharth - opened

One gets the following error when trying to use the LoRA with the Vicuna-7B model:
RuntimeError: Error(s) in loading state_dict for PeftModelForCausalLM:
size mismatch for base_model.model.lm_head.lora_B.llama-deus-7b-v3-lora.weight: copying a param with shape torch.Size([32000, 128]) from checkpoint, the shape in current model is torch.Size([32001, 128]).

Can the model only be used with Llama models?
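For context, the mismatch happens because Vicuna extends the base Llama vocabulary by one token (32000 → 32001), so the LoRA's `lm_head.lora_B` matrix, trained against the base vocab, no longer lines up row-for-row. A minimal sketch reproducing the same shape check with plain PyTorch (the shapes are taken from the traceback above; the layer name is illustrative):

```python
import torch
import torch.nn as nn

# LoRA B matrix as saved in the checkpoint, sized for the base
# Llama vocab of 32000 tokens (shape [32000, 128])
checkpoint_param = torch.zeros(32000, 128)

# Vicuna resizes the output head to 32001 (one extra token),
# so the corresponding parameter in the current model has one more row
head = nn.Linear(128, 32001, bias=False)  # weight shape: [32001, 128]

try:
    # strict loading compares shapes and raises on any mismatch,
    # producing the same "size mismatch" RuntimeError as above
    head.load_state_dict({"weight": checkpoint_param})
except RuntimeError as e:
    print("size mismatch reproduced:", "size mismatch" in str(e))
```

This is why the LoRA loads cleanly against llama-7b-hf (same 32000-token vocab) but not against Vicuna.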

Got it running with llama-7b-hf.

Sorry, didn't see this until just now. Yeah, I believe it would only work with the base model, but I've never actually tried merging a LoRA with a different fine-tune :o
