Update README.md
README.md
CHANGED
@@ -9,6 +9,9 @@ language:

(Maybe I'll change the waifu picture later.)

+> [!IMPORTANT]
+> [GGUF quants](https://huggingface.co/collections/xxx777xxxASD/chaoticsoliloquy-v2-4x8b-test-66377d4b9ff6e77b00fa791f)
+
An experimental RP-oriented MoE; the idea was to get a model equal to or better than Mixtral 8x7B and its finetunes in RP/ERP tasks.

The model has 25B parameters in total, of which ~13B are active.
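The 25B-total / ~13B-active split is consistent with a 4x8B MoE built from Llama-3-8B-class experts with top-2 routing. A minimal back-of-the-envelope sketch of that arithmetic, assuming standard Llama-3-8B shapes (hidden size 4096, MLP intermediate size 14336, 32 layers, 128256-token vocabulary, grouped-query attention) and that only the MLP blocks are expert-routed; these shapes are illustrative assumptions, not figures taken from this commit:

```python
# Back-of-the-envelope parameter count for a 4x8B MoE with top-2 routing.
# Assumes Llama-3-8B shapes and that only the MLPs are duplicated per
# expert; these are illustrative assumptions, not model-card numbers.

HIDDEN = 4096          # hidden size
INTERMEDIATE = 14336   # MLP intermediate size
LAYERS = 32
VOCAB = 128256
KV_DIM = 1024          # 8 KV heads * 128 head dim (grouped-query attention)
N_EXPERTS = 4
TOP_K = 2              # experts active per token

# Per-layer MLP (gate, up, down projections) -- duplicated per expert.
mlp_per_layer = 3 * HIDDEN * INTERMEDIATE
expert_params = LAYERS * mlp_per_layer                        # ~5.6B per expert

# Shared (non-expert) parameters: embeddings, LM head, attention.
attn_per_layer = 2 * HIDDEN * HIDDEN + 2 * HIDDEN * KV_DIM    # q,o + k,v
shared_params = 2 * VOCAB * HIDDEN + LAYERS * attn_per_layer  # ~2.4B

total = shared_params + N_EXPERTS * expert_params   # ~25B
active = shared_params + TOP_K * expert_params      # ~13.7B

print(f"total:  {total / 1e9:.1f}B parameters")
print(f"active: {active / 1e9:.1f}B parameters")
```

With these assumed shapes the script prints roughly 24.9B total and 13.7B active, matching the "25B total, ~13B active" claim above.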