OrionZheng committed
Commit: 983d53d
1 Parent(s): 105c3f9

Update README.md
README.md CHANGED
@@ -8,7 +8,7 @@ license: apache-2.0
 </p>
 <hr>
 
-# OpenMoE
+# OpenMoE-890B
 OpenMoE is a project aimed at igniting the open-source MoE community! We are releasing a family of open-sourced Mixture-of-Experts (MoE) Large Language Models.
 
 Our project began in the summer of 2023. On August 22, 2023, we released the first batch of intermediate checkpoints (OpenMoE-base&8B), along with the data and code [[Twitter]](https://twitter.com/xuefz/status/1693696988611739947?s=61&t=Xc2k2W7vU_hlpNizGDCmOw). Subsequently, the OpenMoE-8B training was completed in November, 2023. After that, we embarked on explorations on 34B scale model, which is still ongoing.