
L3-RPExperiment-2x8B is a Mixture of Experts (MoE) made with the following models using LazyMergekit:

* [tannedbum/L3-Nymeria-8B](https://huggingface.co/tannedbum/L3-Nymeria-8B)
* [Casual-Autopsy/L3-Umbral-Mind-RP-v3-8B](https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v3-8B)

I have no idea what I'm doing. GGUF quants: https://huggingface.co/mradermacher/L3-RPExperiment-2x8B-GGUF (ty!)

## Configuration

```yaml
base_model: tannedbum/L3-Nymeria-8B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: tannedbum/L3-Nymeria-8B
  - source_model: Casual-Autopsy/L3-Umbral-Mind-RP-v3-8B
```
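
## Usage

A minimal sketch of loading the merge with 🤗 Transformers. The repo id below is a placeholder (swap in the actual Hub path of this model), and it assumes the merge keeps the standard Llama-3 chat template from its base model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual Hub path of this merge.
model_id = "your-username/L3-RPExperiment-2x8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```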