
Testing is still needed, but 2-expert MoEs might be the sweet spot: they offer the potential benefits of a mixture-of-experts architecture with the resource efficiency of only two models running side by side.

Config:

```yaml
base_model: NousResearch/Nous-Hermes-2-SOLAR-10.7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: NousResearch/Nous-Hermes-2-SOLAR-10.7B
    positive_prompts: [""]
  - source_model: NousResearch/Nous-Hermes-2-SOLAR-10.7B
    positive_prompts: [""]
```
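As a sanity check, the 19.2B parameter count is consistent with the MoE sharing attention, embedding, and norm weights while duplicating only the feed-forward experts. A rough back-of-the-envelope sketch, assuming SOLAR-10.7B's published Llama-style configuration (48 layers, hidden size 4096, intermediate size 14336, grouped-query attention with 8 KV heads; these values are assumptions, not taken from this card):

```python
# Rough parameter count for a 2-expert MoE built from two copies of
# SOLAR-10.7B. Config values below are assumptions based on the
# published SOLAR-10.7B architecture.
vocab, hidden, layers = 32000, 4096, 48
intermediate = 14336
n_heads, n_kv_heads, head_dim = 32, 8, 128
n_experts = 2

embed = vocab * hidden * 2                # input embeddings + untied LM head
# q/o projections use all heads; k/v use the smaller KV-head count (GQA)
attn = layers * hidden * (2 * n_heads * head_dim + 2 * n_kv_heads * head_dim)
# each expert is a full SwiGLU MLP (gate/up/down); attention is shared
ffn = layers * n_experts * 3 * hidden * intermediate
router = layers * hidden * n_experts      # one linear gate per layer
norms = (2 * layers + 1) * hidden         # pre-attn, pre-MLP, and final norms

total = embed + attn + ffn + router + norms
print(f"{total / 1e9:.1f}B")              # ≈ 19.2B, matching the reported size
```

The FFN experts dominate (~16.9B of the total), which is why a 2-expert merge of a 10.7B model lands at ~19.2B rather than 21.4B: the attention stack and embeddings exist only once.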
Model size: 19.2B params · Tensor type: BF16 (Safetensors)