xiaol committed
Commit cdc4023
1 Parent(s): 4d509e3

Update README.md

Files changed (1):
  1. README.md (+3 -2)
README.md CHANGED
@@ -13,6 +13,7 @@ In comparison with the previously released Mobius, the improvements include:
  * Significant performance improvement;
  * Multilingual support;
  * Stable support of 128K context length.
+ * Base model: [Mobius-mega-12B-128k-base](https://huggingface.co/TimeMobius/Moibus-mega-12B-128k-base)
 
 
  ## Usage
@@ -21,7 +22,7 @@ We encourage few-shot prompting with this model, although directly using User: xxx
  ## More details
  Mobius 12B 128k is based on the RWKV v5.2 architecture, a leading state-based RNN+CNN+Transformer hybrid large language model focused on the open-source community.
  * 10~100x lower training/inference cost;
- * state based, which means it is good at learning compressed features from language;
+ * state based, with selective memory, which makes it good at grokking;
  * community support.
 
  ## requirements
@@ -33,4 +34,4 @@ Mobius 12B 128k is based on the RWKV v5.2 architecture, a leading state-based RNN+CNN+Transformer hybrid large language model focused on the open-source community.
  ## future plan
  If you need a HF version, let us know.
 
- [Mobius-Chat-12B-128k](https://huggingface.co/TimeMobius/Mobius-Chat-12B-128k)
+ [Mobius-Chat-12B-128k](https://huggingface.co/TimeMobius/Mobius-Chat-12B-128k)
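
The Usage note in the diff above recommends few-shot prompting in the `User:`/`Assistant:` chat format. Below is a minimal sketch of what that could look like with the community `rwkv` pip package; the checkpoint filename, strategy string, sampling settings, and the exact chat template are illustrative assumptions, not documented behaviour of this specific checkpoint.

```python
# Minimal sketch (assumptions: local .pth path, "cuda fp16" strategy, User:/Assistant: template).
import os
os.environ["RWKV_JIT_ON"] = "1"
os.environ["RWKV_CUDA_ON"] = "0"

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

model = RWKV(model="Mobius-12B-128k.pth", strategy="cuda fp16")  # hypothetical local checkpoint
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")               # RWKV "world" tokenizer

# Two worked examples ("few shots") before the real question, in User:/Assistant: form.
prompt = (
    "User: What is the capital of France?\n\nAssistant: Paris.\n\n"
    "User: What is the capital of Japan?\n\nAssistant: Tokyo.\n\n"
    "User: What is the capital of Canada?\n\nAssistant:"
)

args = PIPELINE_ARGS(temperature=1.0, top_p=0.3)
print(pipeline.generate(prompt, token_count=64, args=args))
```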
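
The "state based" bullet refers to RWKV carrying a fixed-size recurrent state instead of a growing attention cache, which is what makes 128K-token contexts cheap at inference time. The sketch below feeds a long document once and reuses the cached state for several questions; the file name, prompt wording, and greedy decoding loop are assumptions for illustration, while `forward`, `encode`, and `decode` come from the `rwkv` package's API.

```python
# Sketch: encode a long document once, then answer questions from copies of the cached state.
import copy
import os
os.environ["RWKV_JIT_ON"] = "1"
os.environ["RWKV_CUDA_ON"] = "0"

from rwkv.model import RWKV
from rwkv.utils import PIPELINE

model = RWKV(model="Mobius-12B-128k.pth", strategy="cuda fp16")  # hypothetical local checkpoint
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

doc = open("long_report.txt").read()                      # hypothetical long input document
_, doc_state = model.forward(pipeline.encode(doc), None)  # encode once; keep only the state

def answer(question, state, max_tokens=128):
    """Greedy decoding continued from a copy of the cached document state."""
    state = copy.deepcopy(state)                          # do not mutate the shared state
    out, state = model.forward(
        pipeline.encode(f"\n\nUser: {question}\n\nAssistant:"), state
    )
    tokens = []
    for _ in range(max_tokens):
        token = int(out.argmax())
        if token == 0:                                    # assumed end-of-text token in the world vocab
            break
        tokens.append(token)
        out, state = model.forward([token], state)
    return pipeline.decode(tokens)

print(answer("Summarize the report in one sentence.", doc_state))
print(answer("List the three main risks it mentions.", doc_state))
```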