---
tags:
- merge
license: apache-2.0
---
# What is it? A 2x7B MoE model for roleplay(?)
You may occasionally get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.
# This model is a Mixture of Experts (MoE) made with the following models:
- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
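A two-expert merge like the one above is typically built with mergekit's MoE mode. The author's actual config is not published, so the sketch below is purely illustrative: the `gate_mode`, `dtype`, and `positive_prompts` values are placeholder assumptions, not the settings used for this model.

```yaml
# Hypothetical mergekit-moe config (NOT the author's actual recipe).
base_model: udkai/Turdus        # assumed base; either expert could serve
gate_mode: hidden                # placeholder routing mode
dtype: bfloat16                  # placeholder precision
experts:
  - source_model: udkai/Turdus
    positive_prompts:
      - "roleplay"               # placeholder routing prompt
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "chat"                   # placeholder routing prompt
```

With mergekit installed, a config like this would be run via `mergekit-moe config.yml ./output-model` to produce the merged 2x7B model.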
# If you use it, please let me know whether it is good or not. Thank you :)
# GGUF version here:
https://huggingface.co/Alsebay/RainyMotip-2x7B-GGUF
# Want more quantizations?
[mradermacher](https://huggingface.co/mradermacher) made a full set of them. Check them out :)