
# SuperBruphin-3x7B

This is an experimental Mixture of Experts (MoE) model created with mergekit (mixtral branch).

## Models Merged

The following models were included in the merge:

* [nbeerbower/bruphin-epsilon](https://huggingface.co/nbeerbower/bruphin-epsilon)
* [FelixChao/WestSeverus-7B-DPO-v2](https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2)
* [jondurbin/airoboros-m-7b-3.1.2](https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2)

## Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: nbeerbower/bruphin-epsilon
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: nbeerbower/bruphin-epsilon
    positive_prompts:
      - "Tell a story."
  - source_model: FelixChao/WestSeverus-7B-DPO-v2
    positive_prompts:
      - "Solve this problem."
  - source_model: jondurbin/airoboros-m-7b-3.1.2
    positive_prompts:
      - "Write a letter."
```