---
license: mit
---

LoRA weights for [`lmsys/vicuna-7b-delta-v0`](https://huggingface.co/lmsys/vicuna-7b-delta-v0)


Trained on [`taesiri/webnlg-triplets-explanation-v1`](https://huggingface.co/datasets/taesiri/webnlg-triplets-explanation-v1) for 4 epochs.

Command:

```bash
WORLD_SIZE=2 CUDA_VISIBLE_DEVICES=0,1 torchrun --nproc_per_node=2 --master_port=1234 finetune.py \
    --base_model='./checkpoints/lmsys-vicuna-7B-HF' \
    --data_path 'taesiri/webnlg-triplets-explanation-v1' \
    --num_epochs=4 \
    --cutoff_len=512 \
    --group_by_length \
    --lora_target_modules='[q_proj,k_proj,v_proj,o_proj]' \
    --lora_r=8 \
    --micro_batch_size=8 \
    --batch_size=32
```
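The command above trains rank-8 LoRA adapters on the attention projections (`q_proj`, `k_proj`, `v_proj`, `o_proj`). As a minimal sketch of what that means (illustrative dimensions, not the actual model sizes): instead of updating a frozen weight matrix `W` directly, LoRA learns a low-rank product `B @ A` that is added to it.

```python
import numpy as np

# Illustrative LoRA sketch: d and the random data are made up for the example;
# only r=8 matches the --lora_r=8 setting in the command above.
d, r = 64, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))         # frozen base weight (e.g. a q_proj matrix)
A = rng.standard_normal((r, d)) * 0.01  # trainable low-rank factor, r x d
B = np.zeros((d, r))                    # trainable factor, initialized to zero

W_eff = W + B @ A  # effective weight used at inference

# Because B starts at zero, the adapted model is initially identical to the base,
# and only 2*d*r parameters per matrix are trained instead of d*d.
assert np.allclose(W_eff, W)
```

With `d = 4096` (Vicuna-7B's hidden size) and `r = 8`, each adapted matrix trains roughly 65K parameters instead of ~16.8M, which is why only the small LoRA weights are stored in this repo.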