add RWKV-4-Raven-{7,14}B-v12-Eng98
Don't have the patience to quantize these right now, but uploading the float16/float32 versions so that people don't have to pay the high RAM cost of conversion.
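Since these uploads are not quantized, anyone who wants a smaller file can quantize the f16 version locally. Below is a minimal sketch assuming rwkv.cpp's `rwkv/quantize.py` CLI takes a source path, a destination path, and a target format name such as `Q5_1`; the exact arguments and format names depend on the rwkv.cpp revision, so check the script you have before running.

```python
# Hypothetical sketch: quantize an uploaded f16 model with rwkv.cpp's quantize.py.
# The CLI shape (src, dst, format name "Q5_1") is an assumption; verify it against
# the rwkv.cpp checkout you are using.
import subprocess

src = "RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-f16.bin"   # downloaded from this repo
dst = "RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-Q5_1.bin"  # quantized output

subprocess.run(
    ["python", "rwkv/quantize.py", src, dst, "Q5_1"],  # run from the rwkv.cpp checkout
    check=True,
)
```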
- README.md +2 -0
- RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f16.bin +3 -0
- RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32-00.bin +3 -0
- RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32-01.bin +3 -0
- RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-f16.bin +3 -0
- RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-f32.bin +3 -0
README.md
CHANGED
@@ -17,7 +17,9 @@ These models retain the original models' license (Apache 2.0).
| `RWKV-4-Raven-3B-v11-Eng99-20230425-ctx4096` | Yes | Yes | Yes | No | Yes | Yes | Yes |
| `RWKV-4-Raven-3B-v12-Eng98-20230520-ctx4096` | Yes | Yes | Yes | No | Yes | Yes | Yes |
| `RWKV-4-Raven-7B-v11x-Eng99-20230429-ctx8192` | Yes | Yes | Yes | No | Yes | Yes | Yes |
+| `RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192` | Yes | Yes | No | No | No | No | No |
| `RWKV-4-Raven-14B-v11x-Eng99-20230501-ctx8192` | Split | Yes | Yes | No | Yes | Yes | Yes |
+| `RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192` | Split | Yes | No | No | No | No | No |

The original PyTorch checkpoints (`.pth`) can be downloaded from the [`rwkv-4-raven`](https://huggingface.co/BlinkDL/rwkv-4-raven) repository.
All `f32` and `f16` models were converted directly from the PyTorch checkpoints using [rwkv.cpp `convert_pytorch_to_ggml.py`](https://github.com/saharNooby/rwkv.cpp/blob/1c363e6d5f4ec7817ceffeeb17bd972b1ce9d9d0/rwkv/convert_pytorch_to_ggml.py).
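For reference, the conversion mentioned in the README would have looked roughly like the sketch below, assuming the linked `convert_pytorch_to_ggml.py` takes the source `.pth`, the destination `.bin`, and a data type (`float16`/`float32`) as positional arguments; the pinned script is the authoritative interface.

```python
# Sketch of the .pth -> ggml conversion referenced above. The positional
# arguments and the data-type spelling ("float16" vs "FP16") vary between
# rwkv.cpp revisions; defer to the pinned convert_pytorch_to_ggml.py.
import subprocess

pth = "RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192.pth"      # from BlinkDL/rwkv-4-raven
out = "RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-f16.bin"  # file added in this commit

subprocess.run(
    ["python", "rwkv/convert_pytorch_to_ggml.py", pth, out, "float16"],
    check=True,
)
```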
RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f16.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2575177c69839531ad2b3814adf79e3840f42ef42551f7cb814a8e2e8e72be3e
+size 28301772069
RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32-00.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c469d1572c2077ab26e619d7801b3f349521039f2596ca05918e34803596d32f
+size 50000000000
RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32-01.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:25130ac0680cc244bf78633a6820393af2fff59584f5e6ae104fc925fe67ba11
+size 6594421029
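The 14B f32 model is uploaded as two parts (`-f32-00.bin`, roughly 50 GB, and `-f32-01.bin`, roughly 6.6 GB), which together roughly match the ~56 GB expected for 14B parameters at 4 bytes each. Assuming the split is a plain byte split, the parts can be rejoined by straight concatenation before loading:

```python
# Rejoin the split f32 parts by byte concatenation (assumes a plain `split`-style
# cut; equivalent to `cat ...-f32-00.bin ...-f32-01.bin > ...-f32.bin`).
parts = [
    "RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32-00.bin",
    "RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32-01.bin",
]
merged = "RWKV-4-Raven-14B-v12-Eng98-20230523-ctx8192-f32.bin"

with open(merged, "wb") as out:
    for part in parts:
        with open(part, "rb") as f:
            # stream in 16 MiB chunks to keep memory use small for ~50 GB files
            while chunk := f.read(1 << 24):
                out.write(chunk)
```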
RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-f16.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:624cda612086a7de65137280bbf899a5d28bf409f6d70c0926cb372f1a4c1a33
+size 14788238781
RWKV-4-Raven-7B-v12-Eng98-20230521-ctx8192-f32.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bbe04a1002daa6c6bd9106ba235524279c9e7840d09b152050cdcea9ff6dbf49
+size 29570620861