
Why did you use CodeLlama?

by Undi95 - opened

Hello, just asking: why CodeLlama? Why not the OG Llama?
Are the results of the LoRA any good? I aim to build a lewd LLM, meant only for that: lewd.
Your project interests me.

I also wanted to use the original LLaMA 2 33B for finetuning, but Meta never released it, so the only LLaMA 2 model Meta has put out at that size is CodeLLaMA (34B). And, for some reason, the LLM community seems to think a language model trained on code is also good at RP. Maybe the logic picked up from code carries over to RP as well?
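Roughly, the training setup with PEFT looks something like the sketch below; the hyperparameters here (rank, alpha, target modules) are illustrative assumptions, not the exact config I used:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

# CodeLlama 34B is the only ~33B-class LLaMA 2 checkpoint Meta released
base = "codellama/CodeLlama-34b-hf"

tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(
    base,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Illustrative LoRA hyperparameters, not the precise training config
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small adapter matrices train
```

Since LoRA freezes the base weights and trains only the low-rank adapters, a finetune at 34B stays feasible without full-parameter training.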

Would it be possible to do the same training run on Llama 2 13B? We really need this, bro.
But I understand your point, thanks for the reply!
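In case it helps, the same recipe should in principle work on the 13B base with a one-line swap (assuming access to Meta's gated checkpoint):

```python
# Hypothetical swap: same LoRA recipe, smaller base model
base = "meta-llama/Llama-2-13b-hf"  # gated repo, requires accepting Meta's license
```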
