---
license: mit
---

This is a test DPO fine-tune of Microsoft's phi-2.

Two DPO datasets were used:

- Intel/orca_dpo_pairs
- argilla/ultrafeedback-binarized-preferences-cleaned

Training ran for 1 epoch as a QLoRA with rank 64, delta 128.
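DPO trains the policy directly on preference pairs: it pushes the policy to favor the chosen response over the rejected one by more than the frozen reference model does. A minimal numeric sketch of the per-pair DPO loss in pure Python (the function name, `beta` value, and example log-probabilities are illustrative, not taken from this training run):

```python
import math

def dpo_loss(pi_logp_chosen, pi_logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    Inputs are summed token log-probabilities of the chosen and
    rejected responses under the policy (pi) and the frozen
    reference model (ref).
    """
    margin = beta * ((pi_logp_chosen - ref_logp_chosen)
                     - (pi_logp_rejected - ref_logp_rejected))
    # -log(sigmoid(margin)): small when the policy widens the
    # chosen-vs-rejected gap relative to the reference.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Policy prefers the chosen response more strongly than the
# reference does, so the loss drops below -log(0.5) ~= 0.693.
loss = dpo_loss(-10.0, -14.0, -11.0, -12.0, beta=0.1)
```

During fine-tuning this loss is averaged over preference pairs drawn from the two datasets above, with the QLoRA adapters as the only trainable parameters.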