---
library_name: peft
tags:
- alignment-handbook
- generated_from_trainer
- trl
- dpo
base_model: DUAL-GPO/zephyr-7b-dpo-new-lora-v1-merged
datasets:
- HuggingFaceH4/ultrafeedback_binarized
model-index:
- name: zephyr-7b-dpo-0k-15k-i1
  results: []
license: apache-2.0
---

# zephyr-7b-dpo-0k-15k-i1

This model is a LoRA fine-tuned version of [DUAL-GPO/zephyr-7b-dpo-new-lora-v1-merged](https://huggingface.co/DUAL-GPO/zephyr-7b-dpo-new-lora-v1-merged), trained with DPO (Direct Preference Optimization) on the [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized) dataset.