PeterV09 committed
Commit: 5a720df
Parent(s): 93fb4b1

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -11,7 +11,7 @@ language:
 # Model Card for Deita 7B V1.0 SFT (6k)
 
 Deita is an open-sourced project designed to facilitate **Automatic Data Selection** for instruction tuning in Large Language Models (LLMs).
-DeitaDeita 7B V1.0 SFT (6k) is a fine-tuned version of Mistral-7B-v0.1 that was trained on 6k automatically selected lightweight, high-quality alignment SFT data: [Deita 6K V0](https://huggingface.co/datasets/hkust-nlp/deita-6k-v0).
+Deita 7B V1.0 SFT (6k) is a fine-tuned version of Mistral-7B-v0.1 that was trained on 6k automatically selected lightweight, high-quality alignment SFT data: [Deita 6K V0](https://huggingface.co/datasets/hkust-nlp/deita-6k-v0).
 
 ## Model description
 
@@ -35,7 +35,7 @@ DeitaDeita 7B V1.0 SFT (6k) is a fine-tuned version of Mistral-7B-v0.1 that was
 | **Open-sourced Models based on Mistral-7B** | | | | | |
 | Mistral-7B-Instruct-v0.1 | -- | -- | 6.84 | 69.65 | 60.45 |
 | Zephyr-7B-sft | SFT | 200K SFT | 5.32 | 75.12 | 60.93 |
-| $\text{Zephyr-7B-}\beta$ | SFT + DPO | 200K SFT + 60K DPO | 7.34 | 90.60 | 66.36 |
+| Zephyr-7B-beta | SFT + DPO | 200K SFT + 60K DPO | 7.34 | 90.60 | 66.36 |
 | OpenChat-3.5 | C-RLFT | >70K C-RLFT | 7.81 | 88.51 | -- |
 | Starling-7B | C-RLFT + APA | >70K C-RLFT + 183K APA | 8.09 | 91.99 | -- |
 | Random | SFT | 10K SFT | 5.89 | 56.90 | 61.72 |
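
For readers of the updated card, a minimal sketch of loading the model and the selection dataset it references. The dataset id comes from the URL in the diff; the model repo id is an assumption inferred from the card title and is not stated in this commit.

```python
# Minimal sketch: load Deita 7B V1.0 SFT (6k) and the Deita 6K V0 dataset.
# The dataset id is taken from the URL in the README; the model id below
# is an ASSUMPTION inferred from the card title, not stated in this commit.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hkust-nlp/deita-7b-v1.0-sft"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The 6k automatically selected alignment examples the card links to;
# "train" is assumed to be the split name.
deita_6k = load_dataset("hkust-nlp/deita-6k-v0", split="train")
print(len(deita_6k), deita_6k.column_names)
```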