Upload README.md
README.md CHANGED

@@ -330,7 +330,12 @@ And thank you again to a16z for their generous grant.
 # Original model card: Trong-Hieu Nguyen-Mau's V1Olet Merged DPO 7B v3
 
 
-
+We are ranked *4th* on the overall leaderboard and **1st** in the 7B leaderboard! 🔥🔥🔥
+
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/63c06fba8d1175e3399c16e6/yEPpr0V-D9V4m1a2pMuQs.png)
+
+DPO from the model ranked *6th* on the overall leaderboard and **1st** in the 7B leaderboard (12th December 2023) - v1olet/v1olet_marcoroni-go-bruins-merge-7B
+
+https://huggingface.co/v1olet/v1olet_marcoroni-go-bruins-merge-7B
 
 Training data:
 comparison_gpt4_en,en_orca_dpo
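The card above says the model was produced by applying DPO (Direct Preference Optimization) to v1olet/v1olet_marcoroni-go-bruins-merge-7B using the comparison_gpt4_en and en_orca_dpo preference data. This is not the authors' training code, but as a minimal illustration of the objective DPO optimizes, the loss for one preference pair can be sketched in plain Python (the `beta` value here is illustrative, not the value the authors used):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for a single (chosen, rejected) preference pair.

    Each argument is the summed log-probability of a completion under
    the trainable policy or the frozen reference model; beta controls
    how far the policy is allowed to drift from the reference.
    """
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)
    margin = chosen_reward - rejected_reward
    # -log(sigmoid(margin)): small when the policy prefers the chosen answer
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Policy already favors the chosen completion -> low loss.
low = dpo_loss(-5.0, -20.0, -10.0, -10.0)
# Policy favors the rejected completion -> high loss.
high = dpo_loss(-20.0, -5.0, -10.0, -10.0)
```

In practice a run like the one described in the card would use a full trainer (for example the `DPOTrainer` in Hugging Face's `trl` library) over a dataset of prompt/chosen/rejected triples, but the quantity being minimized is the pairwise loss above.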