Adapter compatibility
#2 opened by yizirui
Hello! Thanks for sharing your amazing pre-trained adapters; they’ve been really helpful in our research! A quick question: are the adapters in this repo specific to the Llama2-7B model only? Should we fine-tune new adapters for other models like Open-Llama, Llama2-13B, or the new Llama3 family? Also, do you have plans to release more pre-trained adapters? Looking forward to your insights!
Hi, thanks for your interest. Yes, these adapters are specific to Llama-2-7B, which is our main focus. For other LLMs, you can use our fine-tuning code on GitHub for efficient tuning.
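For anyone wanting to fine-tune a new adapter for a different base model, here is a minimal sketch using the PEFT library, assuming the adapters follow the standard PEFT/LoRA format; the model ID and hyperparameters are illustrative placeholders, not the authors' actual settings (see their GitHub code for those):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

# Hypothetical base model; swap in Open-Llama, Llama2-13B, Llama3, etc.
base_model_id = "meta-llama/Llama-2-13b-hf"

model = AutoModelForCausalLM.from_pretrained(base_model_id)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)

# Attach a fresh LoRA adapter to the new base model.
# These hyperparameters are illustrative, not the authors' configuration.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# ...then train with your preferred Trainer or training loop.
```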
yuchen005 changed discussion status to closed
yuchen005 changed discussion status to open