natolambert committed
Commit • eee6ee9
1 Parent(s): 0709c67
Update README.md
README.md CHANGED
@@ -44,23 +44,23 @@ All smaller DPO'd models have strong performance per model size in the category
 | Model | Size | Alignment | MT-Bench (score) | AlpacaEval (win rate %) |
 |-------------|-----|----|---------------|--------------|
 | **Tulu-v2-7b** 🐪 | **7B** | **dDPO** | **TODO** | **TODO** |
-| **Tulu-v2-13b** 🐪 | **13B** | **dDPO** | **TODO** | **TODO** |
-| **Tulu-v2-70b** 🐪 | **70B** | **dDPO** | **TODO** | **TODO** |
 | **Tulu-v2-dpo-7b** 🐪 | **7B** | **dDPO** | **TODO** | **TODO** |
-| **Tulu-v2-dpo-13b** 🐪 | **13B** | **dDPO** | **TODO** | **TODO** |
-| **Tulu-v2-dpo-70b** 🐪 | **70B** | **dDPO** | **TODO** | **TODO** |
 | StableLM-Tuned-α | 7B | dSFT | 2.75 | - |
 | MPT-Chat | 7B | dSFT | 5.42 | - |
 | Xwin-LMv0.1 | 7B | dPPO | 6.19 | 87.83 |
 | Mistral-Instructv0.1 | 7B | - | 6.84 | - |
 | Zephyr-7b-α | 7B | dDPO | 6.88 | - |
 | Zephyr-7b-β 🪁 | 7B | dDPO | 7.34 | 90.60 |
+| **Tulu-v2-13b** 🐪 | **13B** | **dDPO** | **TODO** | **TODO** |
+| **Tulu-v2-dpo-13b** 🐪 | **13B** | **dDPO** | **TODO** | **TODO** |
 | Falcon-Instruct | 40B | dSFT | 5.17 | 45.71 |
 | Guanaco | 65B | SFT | 6.41 | 71.80 |
 | Llama2-Chat | 70B | RLHF | 6.86 | 92.66 |
 | Vicuna v1.3 | 33B | dSFT | 7.12 | 88.99 |
 | WizardLM v1.0 | 70B | dSFT | 7.71 | - |
 | Xwin-LM v0.1 | 70B | dPPO | - | 95.57 |
+| **Tulu-v2-70b** 🐪 | **70B** | **dDPO** | **TODO** | **TODO** |
+| **Tulu-v2-dpo-70b** 🐪 | **70B** | **dDPO** | **TODO** | **TODO** |
 | GPT-3.5-turbo | - | RLHF | 7.94 | 89.37 |
 | Claude 2 | - | RLHF | 8.06 | 91.36 |
 | GPT-4 | - | RLHF | 8.99 | 95.28 |