timinar committed on
Commit f8abda2
1 Parent(s): 8a6eadc

Update README.md

Files changed (1):
1. README.md +1 -0
README.md CHANGED
@@ -11,6 +11,7 @@ Our submission to the `strict-small` track of the [BabyLM challenge](https://bab
 Baby Llama is a 58M-parameter model, distilled from an ensemble consisting of LLaMA-360M and GPT2-705M, both trained on the `babylm_10M` dataset.
 
 See the associated paper (arXiv number **TBA**) for a detailed discussion of the training procedure and of the model performance.
+The training code is available at [https://github.com/timinar/BabyLlama](https://github.com/timinar/BabyLlama).
 
 ### Hyperparameters for the tasks that require fine-tuning
 
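
As a quick sanity check of the model described in this README, it can be loaded with 🤗 Transformers. This is a minimal sketch only: the repository id `timinar/baby-llama-58m` is an assumed placeholder and should be replaced with the actual Hub id of this model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub id for illustration only; use the id of this model repository.
model_id = "timinar/baby-llama-58m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to verify that the checkpoint loads and runs.
inputs = tokenizer("The baby llama", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```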