finCapstoneV.51 / adapter_config.json
{
"base_model_name_or_path": "meta-llama/Llama-3.2-3B-Instruct",
"peft_type": "LORA",
"r": 8,
"lora_alpha": 16,
"lora_dropout": 0.0,
"task_type": "CAUSAL_LM"
}
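
For reference, a minimal sketch of how an adapter described by this config could be attached to its base model with the PEFT library. The adapter repo id used below (Sandykgoyal/finCapstoneV.51) is an assumption inferred from this page's path, and the base model is the gated meta-llama/Llama-3.2-3B-Instruct checkpoint named in the config.

# Sketch: load the base model, then attach the LoRA adapter on top of it.
# The adapter repo id is an assumption based on this page's path.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.2-3B-Instruct"  # base model named in adapter_config.json
base_model = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach the LoRA weights (r=8, lora_alpha=16, lora_dropout=0.0) to the frozen base model.
model = PeftModel.from_pretrained(base_model, "Sandykgoyal/finCapstoneV.51")
model.eval()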