PY007 committed
Commit b2be58a
1 Parent(s): 430bd46

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -22,7 +22,7 @@ The TinyLlama project aims to **pretrain** a **1.1B Llama model on 3 trillion tokens**
 We adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.
 
 #### This Model
-This is an intermediate checkpoint with 50K steps and 105B tokens.
+This is an intermediate checkpoint with 995K steps and 2003B tokens.
 
 #### Releases Schedule
 We will be rolling out intermediate checkpoints following the below schedule. We also include some baseline models for comparison.
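
The updated figures imply roughly 2 million tokens per optimizer step (2003B / 995K ≈ 2.01M). Since the README states that TinyLlama adopts exactly the same architecture and tokenizer as Llama 2, any Llama-compatible loader should work with these intermediate checkpoints. Below is a minimal sketch using Hugging Face transformers; the repo id is a placeholder, not the actual Hub id of this checkpoint.

```python
# Minimal sketch (not part of this commit): loading a TinyLlama intermediate
# checkpoint with Hugging Face transformers. TinyLlama shares Llama 2's
# architecture and tokenizer, so the standard auto classes resolve it.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: substitute the actual Hub id of this checkpoint.
model_id = "TinyLlama/TinyLlama-1.1B-intermediate-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "The TinyLlama project aims to"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```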