---
tags:
- merge
license: apache-2.0
---
# Testing model. Maybe worse than other 2x7B models.

You will sometimes get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.
# This model is a Mixture of Experts (MoE) made with the following models:
- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
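Merges like this are usually produced with mergekit's `mergekit-moe` tool. As a rough sketch only (the `gate_mode`, `dtype`, and prompt hints below are assumptions, not the actual configuration used for this model):

```yaml
# Hypothetical mergekit-moe config; values are illustrative assumptions.
base_model: udkai/Turdus
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: udkai/Turdus
    positive_prompts:
      - "chat"
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "roleplay"
```

Running `mergekit-moe config.yml ./output` on a config like this builds the combined 2x7B model from the two 7B experts.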