Hijab2 / adapter_config.json
{
  "adapter_type": "lora",
  "lora_alpha": 16,
  "lora_dropout": 0.1,
  "lora_r": 8,
  "target_modules": ["q_proj", "v_proj"]
}
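
For context, these fields describe a standard LoRA hyperparameter set (rank 8, alpha 16, 10% dropout, applied to the query and value projections). A minimal sketch of how an equivalent configuration could be expressed with the Hugging Face PEFT library is shown below; the mapping of JSON keys to `LoraConfig` arguments and the base model name are assumptions for illustration, not taken from this repository.

```python
# Sketch: an equivalent LoRA setup via PEFT's LoraConfig.
# The base model identifier is a placeholder.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

lora_config = LoraConfig(
    r=8,                                  # "lora_r": rank of the low-rank update matrices
    lora_alpha=16,                        # "lora_alpha": scaling factor for the update
    lora_dropout=0.1,                     # "lora_dropout": dropout on the LoRA layers
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    task_type="CAUSAL_LM",
)

base_model = AutoModelForCausalLM.from_pretrained("your-base-model")  # placeholder
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # report how few parameters LoRA trains
```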