Text Generation · PyTorch · 12 languages · causal-lm · rwkv
Commit 81a1a9b (parent d73a4fc), committed by BlinkDL

Update README.md

Files changed (1)
  README.md (+3, -3)
README.md CHANGED
@@ -27,6 +27,8 @@ datasets:
 
 Use https://huggingface.co/BlinkDL/rwkv-4-world for best models. I am still training RWKV-5.
 
+Use rwkv pip package 0.8.7+ for RWKV-5 inference.
+
 ## Model Description
 
 RWKV-5 trained on 100+ world languages (70% English, 15% multilang, 15% code).
@@ -35,6 +37,4 @@ World = Some_Pile + Some_RedPajama + Some_OSCAR + All_Wikipedia + All_ChatGPT_Da
 
 Training: set --my_testing "r" for latest RWKV-LM
 
-Inference: Use rwkv pip package 0.8.6+ for RWKV-5. Might overflow in fp16. Use fp32.
-
-inference algorithm reference: https://github.com/BlinkDL/ChatRWKV/blob/main/RWKV_v5_demo.py
+Inference algorithm reference: https://github.com/BlinkDL/ChatRWKV/blob/main/RWKV_v5_demo.py
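
For reference, a minimal inference sketch matching the updated instruction ("Use rwkv pip package 0.8.7+ for RWKV-5 inference"). It assumes the rwkv pip package and a locally downloaded RWKV-5 World checkpoint; the checkpoint path, strategy string, and sampling settings are illustrative placeholders, not taken from this commit.

```python
# Minimal sketch, assuming rwkv pip package 0.8.7+ and a local RWKV-5 World
# checkpoint. The path and sampling settings below are placeholders.
import os

# These flags are read by the rwkv package at import time.
os.environ['RWKV_JIT_ON'] = '1'   # TorchScript JIT for the model code
os.environ['RWKV_CUDA_ON'] = '0'  # '1' compiles the optional CUDA kernel

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# Hypothetical checkpoint path; the loader expects the name without '.pth'.
MODEL_PATH = '/path/to/RWKV-5-World-checkpoint'

# fp32 sidesteps the fp16 overflow noted for older package versions.
model = RWKV(model=MODEL_PATH, strategy='cpu fp32')

# World models use the multilingual "world" vocab bundled with the package.
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

args = PIPELINE_ARGS(temperature=1.0, top_p=0.7)
print(pipeline.generate("The Eiffel Tower is located in", token_count=64, args=args))
```

For the underlying algorithm rather than the packaged API, see the referenced RWKV_v5_demo.py in ChatRWKV, which implements RWKV-5 inference from scratch.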