
# cnmoro/Felladrin-4x68m-moe

cnmoro/Felladrin-4x68m-moe is a Mixture of Experts (MoE) model built with mergekit's `mergekit-moe`. It uses Felladrin/Llama-68M-Chat-v1 as the base model and four copies of the same model as experts, with a randomly initialized gate (`gate_mode: random`).

## 🧩 Configuration

```yaml
base_model: Felladrin/Llama-68M-Chat-v1
gate_mode: random
dtype: bfloat16
experts:
  - source_model: Felladrin/Llama-68M-Chat-v1
    positive_prompts: [""]
  - source_model: Felladrin/Llama-68M-Chat-v1
    positive_prompts: [""]
  - source_model: Felladrin/Llama-68M-Chat-v1
    positive_prompts: [""]
  - source_model: Felladrin/Llama-68M-Chat-v1
    positive_prompts: [""]
```

Base and expert model: https://huggingface.co/Felladrin/Llama-68M-Chat-v1
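
## 💻 Usage

A minimal usage sketch (not part of the original card), assuming the merged checkpoint loads as a standard causal LM with `transformers` and that its tokenizer inherits the chat template of the base chat model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cnmoro/Felladrin-4x68m-moe"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Build a prompt from the tokenizer's own chat template (assumed to be
# inherited from Felladrin/Llama-68M-Chat-v1), so no format is hard-coded.
messages = [{"role": "user", "content": "What is a Mixture of Experts model?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```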

Model size: 111M parameters · Tensor type: BF16 · Format: Safetensors
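
The 111M total (versus 68M for a single expert) follows from the MoE layout: embeddings and attention weights are shared across experts, while each of the four experts keeps its own copy of the MLP weights. A short verification sketch, assuming `mergekit-moe` emitted a Mixtral-style architecture as it typically does for Llama experts:

```python
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "cnmoro/Felladrin-4x68m-moe"

# Inspect the merged architecture (assumption: Mixtral-style MoE config).
config = AutoConfig.from_pretrained(model_id)
print(config.model_type)                            # expected: "mixtral"
print(getattr(config, "num_local_experts", None))   # expected: 4

# Count parameters: shared attention/embeddings plus four per-expert MLPs.
model = AutoModelForCausalLM.from_pretrained(model_id)
total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e6:.0f}M parameters")             # ~111M per the card metadata
```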