AdamCodd committed: Update README.md (commit 1169945, parent 6b800ee)

Obviously, it's still not perfect (I won't lie, the original dataset was very fl

If you want to support me, you can [here](https://ko-fi.com/adamcodd).

## Token Distribution Analysis

We analyzed the token count distribution of the dataset using the LLAMA-3 tokenizer. This information is crucial for understanding the length of prompts and planning model training.
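
The counting step can be sketched as follows. The helper takes any encoder callable; the `meta-llama/Meta-Llama-3-8B` checkpoint id in the comment is an assumption on our part (the LLAMA-3 tokenizer repo is gated), not something stated in this card:

```python
def count_tokens(prompts, encode):
    """Return the token count of each prompt, using the supplied encoder callable."""
    return [len(encode(p)) for p in prompts]

# Assumed usage with the LLAMA-3 tokenizer (requires access to the gated repo):
#   from transformers import AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")
#   token_counts = count_tokens(prompts, tok.encode)
```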

### Key Statistics
- **Minimum tokens**: 164
- **Maximum tokens**: 3,285
- **Median (50th percentile)**: 274 tokens

### Decile Distribution

| Percentile | Token Count |
|------------|-------------|
| 10%        | 192         |
| 20%        | 209         |
| 30%        | 228         |
| 40%        | 249         |
| 50%        | 274         |
| 60%        | 302         |
| 70%        | 337         |
| 80%        | 386         |
| 90%        | 467         |
| 100%       | 3,285       |
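
A table like the one above can be recomputed with `numpy.percentile` once per-prompt token counts are in hand; a minimal sketch (the function name is ours, and the counts would come from tokenizing each prompt with the LLAMA-3 tokenizer):

```python
import numpy as np

def decile_table(token_counts):
    """Map each decile (10%..100%) to its token count, rounded down."""
    return {p: int(np.percentile(token_counts, p)) for p in range(10, 101, 10)}
```

For example, `decile_table(list(range(1, 101)))` maps every decile `p` to the value `p` itself.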

### Interpretation

1. **Range**: The dataset contains prompts ranging from 164 to 3,285 tokens, indicating significant variation in prompt lengths.

2. **Central Tendency**: The median token count is 274, meaning half of the prompts have 274 tokens or fewer.

3. **Distribution**:
   - 90% of prompts have 467 tokens or fewer.
   - There's a notable jump from the 90th percentile (467 tokens) to the maximum (3,285 tokens), suggesting some outliers with very high token counts.

4. **Implications for Training**:
   - A sequence length of 400-500 tokens would cover the majority of prompts.
   - Special handling may be needed for outliers with high token counts (e.g., truncation or splitting).

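The training implications above can be made concrete with two small helpers; `coverage` and `truncate_tokens` are hypothetical names of ours, and the 512-token default is an example budget rather than a recommendation from this card:

```python
def coverage(token_counts, max_len):
    """Fraction of prompts that fit within max_len tokens without truncation."""
    return sum(c <= max_len for c in token_counts) / len(token_counts)

def truncate_tokens(token_ids, max_len=512):
    """Head truncation for outlier prompts (splitting long prompts is an alternative)."""
    return token_ids[:max_len]
```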
## Acknowledgment and citation
```bibtex
@inproceedings{bien-etal-2020-recipenlg,