---
license: cc-by-nc-4.0
tags:
- moe
- merge
base_model:
- Sao10K/Fimbulvetr-11B-v2
- BlueNipples/SnowLotus-v2-10.7B
---
# This is a MoE (Mixture of Experts) model based on these models:
- Sao10K/Fimbulvetr-11B-v2
- BlueNipples/SnowLotus-v2-10.7B
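
A minimal loading sketch with `transformers`, assuming the repo id is `Alsebay/SunnyRain-2x10.7B` (inferred from the GGUF repo name below) and that the merged weights fit your hardware; the id and generation settings are illustrative, not tested:

```python
# Minimal sketch: load the merged MoE checkpoint with transformers.
# Assumption: the repo id "Alsebay/SunnyRain-2x10.7B" matches this model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Alsebay/SunnyRain-2x10.7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~2 bytes/param; a 2x10.7B MoE needs a large GPU
    device_map="auto",           # spread layers across available devices
)

prompt = "You are a storyteller. Continue the scene:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```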
# You may want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard
# Done testing :D
I have tested only the quantized version; it is good enough for roleplay (RP), which is what I wanted when making this model. It does show some strange behaviors, though (maybe I don't have enough VRAM?).
# GGUF version?
[Alsebay/SunnyRain-2x10.7B-GGUF](https://huggingface.co/Alsebay/SunnyRain-2x10.7B-GGUF) 
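
A minimal sketch for running one of the GGUF quants locally with `llama-cpp-python`; the exact `.gguf` filename depends on which quant you download, so the path below is a placeholder:

```python
# Minimal sketch: run a downloaded GGUF quant with llama-cpp-python.
# The model_path is a hypothetical filename; point it at the quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./SunnyRain-2x10.7B.Q4_K_M.gguf",  # placeholder filename
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows; lower this if not
)

out = llm(
    "### Instruction:\nWrite a short RP opening scene.\n### Response:\n",
    max_tokens=128,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```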

Want more? Check this out; [he's doing a great job](https://huggingface.co/mradermacher/SunnyRain-2x10.7B-GGUF).