CodeLlama and LoRA

#17
by hk11 - opened

Does CodeLlama support LoRA? What should the target modules be?

config = LoraConfig( r=16, lora_alpha=32, target_modules= ?, lora_dropout=0.05, bias="none", task_type="CAUSAL_LM" )

I went ahead with removing target_modules from the LoraConfig:

config = LoraConfig( r=16, lora_alpha=32, lora_dropout=0.05, bias="none", task_type="CAUSAL_LM" )
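For reference, if target_modules is set explicitly, the usual choice for a Llama-family model like CodeLlama is the attention projection layers. A minimal sketch (the module names assume the Hugging Face Transformers Llama implementation):

```python
from peft import LoraConfig

config = LoraConfig(
    r=16,
    lora_alpha=32,
    # attention projections in the HF Llama implementation
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
```

When target_modules is omitted, PEFT falls back to its built-in defaults for the detected architecture, which for Llama-type models is a subset of these projections.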

When I printed the trainable parameters:

trainable params: 19660800 || all params: 33763631104 || trainable%: 0.05823070373989121
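That count is consistent with LoRA adapters on the attention q_proj and v_proj layers only. A quick arithmetic check, assuming CodeLlama-34B dimensions (hidden size 8192, 48 layers, grouped-query attention with 8 KV heads of head dim 128, so the v_proj output width is 1024):

```python
# LoRA adds r * (d_in + d_out) parameters per adapted linear layer
r = 16
hidden = 8192            # assumed CodeLlama-34B hidden size
kv_dim = 8 * 128         # assumed grouped-query-attention KV projection width
layers = 48              # assumed number of decoder layers

q_proj = r * (hidden + hidden)   # 8192 -> 8192 projection
v_proj = r * (hidden + kv_dim)   # 8192 -> 1024 projection
per_layer = q_proj + v_proj

print(per_layer * layers)  # 19660800, matching the printed trainable count
```

So the default configuration is adapting two projection matrices per layer, which is a common and workable starting point.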

Is this the right configuration to go with for fine-tuning?
