Update README.md
README.md
CHANGED
@@ -7,8 +7,7 @@
 * [One of the best MoE models, reviewed by the Reddit community](https://www.reddit.com/r/LocalLLaMA/comments/1916896/llm_comparisontest_confirm_leaderboard_big_news/)
 * [Another review by the Reddit community](https://www.reddit.com/r/LocalLLaMA/comments/191mvlp/i_have_tried_mixtral_34bx2_moe_also_named_yi/)
 *
-* Highest score MoE Model (2024/1/10)
-* Average Score 76.66
+* Highest-scoring MoE model on the Open LLM Leaderboard (2024/1/10): [average score 76.66](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 
 This is my first English & Chinese MoE model, based on
 * [jondurbin/bagel-dpo-34b-v0.2]
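For context, a minimal sketch of loading and prompting the model this README describes, using the Hugging Face `transformers` API. The repo id below is a placeholder assumption (it does not appear in this diff); substitute the actual published model id. The dtype and device settings are only a starting point, since a 2x34B MoE needs substantial GPU memory.

```python
# Minimal loading/generation sketch -- not part of the README diff above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "cloudyu/Mixtral_34Bx2_MoE"  # placeholder: substitute the real model id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # half precision; ~120 GB of weights for 2x34B
    device_map="auto",           # shard across available GPUs
)

prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```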