It seems that this is a llama-7b LoRA

#1
by AzzHug - opened

RuntimeError: Error(s) in loading state_dict for PeftModelForCausalLM:
size mismatch for base_model.model.model.layers.0.self_attn.q_proj.lora_A.weight: copying a param with shape torch.Size([8, 4096]) from checkpoint, the shape in current model is torch.Size([8, 5120]).

Perhaps you uploaded the wrong file? The checkpoint's hidden size (4096) matches llama-7b, while the model being loaded expects 5120, which is llama-13b.
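The mismatch can be diagnosed from the shapes alone: the second dimension of a `lora_A.weight` tensor is the base model's hidden size. A minimal sketch of that check (the function name and the size-to-model table are illustrative, based on the standard LLaMA family dimensions):

```python
# Standard LLaMA hidden sizes mapped to model variants
HIDDEN_TO_MODEL = {
    4096: "llama-7b",
    5120: "llama-13b",
    6656: "llama-30b",
    8192: "llama-65b",
}

def guess_base_model(shape):
    """Infer the base model from a lora_A.weight shape of (r, hidden_size)."""
    r, hidden = shape
    return HIDDEN_TO_MODEL.get(hidden, f"unknown (hidden_size={hidden})")

# Shape in the checkpoint vs. shape the current model expects:
print(guess_base_model((8, 4096)))  # llama-7b
print(guess_base_model((8, 5120)))  # llama-13b
```

So the adapter here was trained against llama-7b, and loading it on top of llama-13b produces exactly the `size mismatch` error above.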