---
language:
- en
tags:
- pytorch
- text-generation
- causal-lm
- rwkv
license: apache-2.0
datasets:
- the_pile

---

# RWKV-4 "Raven"-series Models

## Model Description

These are RWKV-4-Pile models (3B/7B/14B) finetuned on Alpaca, CodeAlpaca, Guanaco, GPT4All, ShareGPT, and more. Context length is 8192 in v9.

Gradio Demo: https://huggingface.co/spaces/BlinkDL/Raven-RWKV-7B

Use https://github.com/BlinkDL/ChatRWKV to run them.

See https://github.com/BlinkDL/RWKV-LM for details on the RWKV Language Model (100% RNN).

Best prompt format for Raven models, where Bob is the user and Alice is the bot (NOTE: no space after the final "Alice:"). You can use \n within the xxxxxxxxxxx parts, but avoid \n\n.
```
Bob: xxxxxxxxxxxxxxxxxx\n\nAlice:
Bob: xxxxxxxxxxxxxxxxxx\n\nAlice: xxxxxxxxxxxxx\n\nBob: xxxxxxxxxxxxxxxx\n\nAlice:
```
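The prompt rules above can be sketched in a small helper. This is a minimal illustration, not part of ChatRWKV; the function name `build_raven_prompt` and its turn structure are assumptions made for this example. It only assembles the string: turns are separated by `\n\n`, inner double newlines are collapsed, and no space follows the final "Alice:".

```python
def build_raven_prompt(turns):
    """Assemble a Raven-style prompt (illustrative helper, not from ChatRWKV).

    turns: list of (user_msg, bot_reply) pairs; bot_reply is None for the
    final turn, where the model is expected to continue after "Alice:".
    """
    parts = []
    for user_msg, bot_reply in turns:
        # Avoid \n\n inside a message, as advised above; \n is fine.
        user_msg = user_msg.replace("\n\n", "\n")
        # No space after "Alice:" when it ends the prompt.
        parts.append(f"Bob: {user_msg}\n\nAlice:")
        if bot_reply is not None:
            parts.append(f" {bot_reply}\n\n")
    return "".join(parts)

# Example: a single-turn prompt is "Bob: Hi\n\nAlice:"
prompt = build_raven_prompt([("Hi", "Hello!"), ("How are you?", None)])
```

The model's completion after "Alice:" would then be appended (with a leading space) as the `bot_reply` of that turn before building the next prompt.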
New models will be named by language ratio, e.g. Eng99%-Other1%, Eng86%-Chn10%-JpnEspKor2%-Other2%, etc.
Language ratios are determined by the amount of ChatGPT data available in each language; please share more ChatGPT data to increase the ratio of your language.

Old models:
* RWKV-4-Raven-Eng : 99% English + 1% Multilang
* RWKV-4-Raven-EngAndMore : 96% English + 2% Chn Jpn + 2% Multilang (More Jpn than v6 "EngChnJpn")
* RWKV-4-Raven-ChnEng : 49% English + 50% Chinese + 1% Multilang

License: Apache 2.0