Muennighoff committed
Commit ed327d8
1 Parent(s): 8f8468c

Update README.md

Files changed (1)
README.md +1 -1
README.md CHANGED
@@ -13,7 +13,7 @@ co2_eq_emissions: 1
 
 # Model Summary
 
-> OLMoE is a Mixture-of-Experts LLM with 1.2B active and 6.9B total parameters. It yields state-of-the-art performance among models with a similar cost (1B) and is competitive with much larger models like Llama2-13B. OLMoE is 100% open-source.
+> OLMoE-1B-7B is a Mixture-of-Experts LLM with 1B active and 7B total parameters released in August 2024 (0824). It yields state-of-the-art performance among models with a similar cost (1B) and is competitive with much larger models like Llama2-13B. OLMoE is 100% open-source.
 
 - Code: https://github.com/allenai/OLMoE
 - Paper:
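
For reference, the model described in the updated summary is a standard causal LM that can be loaded through the Hugging Face `transformers` API. Below is a minimal sketch, assuming the checkpoint is published on the Hub under an id matching the "0824" tag in the diff (the exact repo id is an assumption, not stated in this commit) and that the installed `transformers` version includes OLMoE architecture support:

```python
# Minimal sketch of loading the model described in the updated summary.
# Assumption: the Hub repo id below is inferred from the "0824" tag in the
# diff and may differ from the actual release name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0824"  # hypothetical id inferred from the commit text

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # 7B total parameters, ~1B active per token

inputs = tokenizer("Mixture-of-Experts language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```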