How am I supposed to merge the LoRA weights into LLaMA?
These are LoRA weights; you should use them with the PEFT library in conjunction with the Llama-2-7b base model.