Text Generation · PyTorch · 12 languages · causal-lm · rwkv
BlinkDL committed
Commit 050fd2b
1 Parent(s): ef91075

Update README.md

Files changed (1): README.md +6 -0
README.md CHANGED
@@ -38,3 +38,9 @@ RWKV-5 trained on 100+ world languages (70% English, 15% multilang, 15% code).
 World = Some_Pile + Some_SlimPajama + Some_OSCAR + All_Wikipedia + All_ChatGPT_Data_I_can_find
 
 RWKV-5 training: set --my_testing "2r4" in latest RWKV-LM
+
+World v1 = 0.59T tokens
+
+World v2 = 1.12T tokens
+
+Imagine what happens when we use more data :)
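
For reference, --my_testing "2r4" is passed to RWKV-LM's training script. A minimal sketch of such an invocation follows; only --my_testing "2r4" is taken from this commit, and every other flag (data path, model size, batch and precision settings) is an illustrative assumption about RWKV-LM's command-line options, not something this README specifies:

# Minimal sketch; only --my_testing "2r4" comes from this README.
# All other flags are assumed examples of RWKV-LM train.py options.
python train.py \
  --my_testing "2r4" \
  --data_file /path/to/world.binidx --data_type binidx \
  --n_layer 24 --n_embd 2048 --ctx_len 4096 \
  --micro_bsz 8 --precision bf16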