princeton-nlp committed on
Commit
9d56412
1 Parent(s): 5617df2

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -9,7 +9,9 @@ License: Must comply with license of Pythia since it's a model derived from Pyth

Sheared-Pythia-160m is a model pruned and further pre-trained from [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m). We dynamically load data from different domains in the Pile dataset to prune and continue pre-training the model. We use 0.4B tokens for pruning and 50B tokens for continued pre-training of the pruned model. This model can be loaded with HuggingFace via

+ ```
model = GPTNeoXForCausalLM.from_pretrained("princeton-nlp/Sheared-Pythia-140m")
+ ```

The model's overall performance is better than EleutherAI/pythia-160m.
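
For context, a minimal sketch of how the loading line from the diff above might be used end to end with the Transformers library. The repository id follows the snippet in the diff; the tokenizer source and generation settings are illustrative assumptions, not part of the commit.

```python
# Minimal sketch: load the pruned model and a tokenizer, then generate a short continuation.
# Assumes the repository id used in the diff above; adjust if the published repo differs.
from transformers import AutoTokenizer, GPTNeoXForCausalLM

repo_id = "princeton-nlp/Sheared-Pythia-140m"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = GPTNeoXForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Structured pruning of language models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```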