---
license: apache-2.0
library_name: adapter-transformers
---
This is a test model. The base model is Mistral-7B-v0.1, and the fine-tuning dataset is open-platypus_2.5w.
## Fine-Tuning Information

- GPU: RTX 4090 (single GPU / 24564MiB)
- model: meta-llama/Llama-2-13b-hf
- peft_type: LoRA
- lora_rank: 16
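
`lora_rank: 16` means each adapted weight matrix `W` is left frozen and a low-rank update `B @ A` (with inner dimension 16) is trained instead. A minimal numpy sketch of the idea, with illustrative matrix dimensions that are assumptions, not values taken from this model's config:

```python
import numpy as np

# Illustrative dimensions for one attention projection; the actual
# shapes depend on the base model's hidden size.
d_out, d_in, rank = 4096, 4096, 16

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))        # frozen base weight
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, rank))                   # trainable up-projection, zero-init

# Effective weight during fine-tuning: W + B @ A.
# With B initialized to zero, training starts from the base model exactly.
W_eff = W + B @ A

full_params = W.size
lora_params = A.size + B.size
print(f"full: {full_params:,}  lora: {lora_params:,}  "
      f"ratio: {lora_params / full_params:.4%}")
```

At rank 16 the trainable parameters per matrix are a fraction of a percent of the full weight, which is what makes fine-tuning a 13B model feasible on a single 24 GiB RTX 4090.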