Muennighoff committed
Commit e775da2
1 Parent(s): f8770d2

Clarify branches

Files changed (1)
1. README.md +8 -1
README.md CHANGED
@@ -13,7 +13,7 @@ co2_eq_emissions: 1
 
 # Model Summary
 
- > OLMoE is a ...
+ > OLMoE is a Mixture-of-Experts LLM with 1.2B active and 6.9B total parameters. It yields state-of-the-art performance among models with a similar cost (1B) and is competitive with much larger models like Llama2-13B. OLMoE is 100% open-source.
 
 Links to all resources & instructions to reproduce: https://github.com/allenai/OLMoE
 
@@ -44,6 +44,13 @@ out = list_repo_refs("OLMoE/OLMoE-1B-7B-0824")
 branches = [b.name for b in out.branches]
 ```
 
+ Important branches:
+ - `step1200000-tokens5033B`: Pretraining checkpoint used for annealing. There are a few more pretraining checkpoints after this one, but we did not use them.
+ - `main`: Checkpoint annealed from `step1200000-tokens5033B` for an additional 100B tokens. We use this checkpoint for finetuning our chat model.
+ - `fp32`: FP32 version of `main`. The model weights were stored in FP32 during training, but we did not observe any performance drop from casting them to BF16 after training, so we upload all weights in BF16. If you want the original FP32 checkpoint for `main`, you can use this branch; it yields slightly different results but should perform about the same on benchmarks.
+
+ The main branch contains the annealed checkpoint.
+
 # Citation
 
 ```bibtex
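
For reference, the branch-listing snippet that appears as context in the second hunk can be run on its own. A minimal sketch, assuming the `huggingface_hub` package is installed:

```python
from huggingface_hub import list_repo_refs

# List every branch of the repository referenced in the README snippet above.
out = list_repo_refs("OLMoE/OLMoE-1B-7B-0824")
branches = [b.name for b in out.branches]
print(branches)  # expected to include "main", "fp32", and the pretraining checkpoint branches
```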
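To load one of the branches described in the added text, the branch name can be passed as `revision` to `from_pretrained`. A minimal sketch; the model class and dtype choice are assumptions, since the commit itself only documents the branch names:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "OLMoE/OLMoE-1B-7B-0824"

# Assumption: the repo loads via the transformers Auto classes.
# revision="main" gives the annealed checkpoint, revision="fp32" the FP32 weights,
# and revision="step1200000-tokens5033B" the pretraining checkpoint used for annealing.
model = AutoModelForCausalLM.from_pretrained(
    REPO,
    revision="step1200000-tokens5033B",
    torch_dtype=torch.bfloat16,  # uploaded weights are BF16 except on the `fp32` branch
)
tokenizer = AutoTokenizer.from_pretrained(REPO)
```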