Target modules {'out_proj', 'Wqkv'} not found in the phi-2 model - how can I fix this error?

#115
by roy1109

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

Quantization + LoRA = QLoRA

bnb_4bit = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

bnb_8bit = BitsAndBytesConfig(
    load_in_8bit=True,
)

model_id = "microsoft/phi-2"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_4bit,  # choose either the 4-bit or the 8-bit config
    device_map="auto",
    trust_remote_code=True,
)

model  # print the model to see its module structure

# next code
from peft import LoraConfig, get_peft_model

config = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["Wqkv", "out_proj"],
)

model = get_peft_model(model, config)
model.print_trainable_parameters()

This is my error message:

ValueError Traceback (most recent call last)
in <cell line: 13>()
11 )
12
---> 13 model = get_peft_model(model, config)
14 model.print_trainable_parameters()

5 frames
/usr/local/lib/python3.10/dist-packages/peft/tuners/tuners_utils.py in inject_adapter(self, model, adapter_name)
303
304 if not is_target_modules_in_base_model:
--> 305 raise ValueError(
306 f"Target modules {peft_config.target_modules} not found in the base model. "
307 f"Please check the target modules and try again."

ValueError: Target modules {'out_proj', 'Wqkv'} not found in the base model. Please check the target modules and try again.

This code worked a few months ago, but now it doesn't work. How should I change the target modules?

Changing target_modules from ["Wqkv", "out_proj"] to ["q_proj", "k_proj", "v_proj", "dense"] should solve the problem.
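In other words, the LoRA cell from the question becomes the following (only target_modules changes; the rest of the LoraConfig is copied from the question unchanged):

from peft import LoraConfig, get_peft_model

config = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "dense"],
)

model = get_peft_model(model, config)
model.print_trainable_parameters()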

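If the module names change again in a future revision of the checkpoint, you can read them off the loaded model itself. A minimal diagnostic sketch, assuming the quantized model object loaded above:

# Print every leaf submodule of the loaded model together with its class name,
# so you can see which names are valid LoRA target_modules.
for name, module in model.named_modules():
    if len(list(module.children())) == 0:  # leaf modules only
        print(name, type(module).__name__)

Any of the printed names (for example, the linear projection layers inside the attention blocks) can be used in target_modules.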

thanks!
