# SunnyRain-2x10.7B
---
license: cc-by-nc-4.0
tags:
  - moe
  - merge
base_model:
  - Sao10K/Fimbulvetr-11B-v2
  - BlueNipples/SnowLotus-v2-10.7B
---

This is a MoE (Mixture of Experts) model based on these models:

- Sao10K/Fimbulvetr-11B-v2
- BlueNipples/SnowLotus-v2-10.7B

You may want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard

Done testing :D

I have tested only the quantized version. It is good enough for roleplay (RP), which is what I wanted when making this model, but it shows some strange behaviors (maybe I don't have enough VRAM?).

GGUF version?

Alsebay/SunnyRain-2x10.7B-GGUF
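A minimal sketch of downloading and running the GGUF quantized version with llama.cpp. The exact `.gguf` filename and quant level are assumptions; check the file list of the GGUF repo for what is actually available.

```shell
# Download one quantized file from the GGUF repo
# (filename/quant level assumed -- see the repo's "Files" tab)
huggingface-cli download Alsebay/SunnyRain-2x10.7B-GGUF \
  SunnyRain-2x10.7B.Q4_K_M.gguf --local-dir .

# Run an interactive prompt with llama.cpp's llama-cli
# (-n limits generated tokens; -ngl offloads layers to GPU if built with one)
./llama-cli -m SunnyRain-2x10.7B.Q4_K_M.gguf -p "Hello" -n 128 -ngl 32
```

Lower quant levels (e.g. Q4) trade some quality for much lower VRAM use, which may matter given the behavior noted above.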

Want more? Check this out; he's doing a great job