---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---
# Model Card for Lumosia-MoE-4x10.7
<!-- Provide a quick summary of what the model is/does. -->
This model is an extreme experiment: I wanted to test building an MoE out of multiple high-performing Solar models. Let me know what you think.
## Model Details
### Model Description
This MoE model is an extreme experiment: I wanted to test building an MoE out of multiple high-performing Solar models. Let me know what you think.
I am also thinking about fine-tuning on an RP dataset later on to give the model more direction.
- **Model type:** Mixture of Experts (4x SOLAR 10.7B experts)
### Model Sources [optional]
The experts were assembled from the following source models using this MoE merge configuration:
```yaml
model_name: Lumosia-MoE-4x10.7
base_model: DopeorNope/SOLARC-M-10.7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: DopeorNope/SOLARC-M-10.7B
    positive_prompts: [""]
  - source_model: maywell/PiVoT-10.7B-Mistral-v0.2-RP
    positive_prompts: [""]
  - source_model: kyujinpy/Sakura-SOLAR-Instruct
    positive_prompts: [""]
  - source_model: jeonsworld/CarbonVillain-en-10.7B-v1
    positive_prompts: [""]
```
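### How to Get Started with the Model
Below is a minimal usage sketch for loading the merged model with `transformers`. The repository id is a placeholder (the card only gives the model name, not the Hub namespace), and the SOLAR-Instruct-style `### User:` / `### Assistant:` prompt format is an assumption; adjust both to match the published model.

```python
# Minimal usage sketch. Assumptions: the merged MoE is published on the Hub,
# and it follows a SOLAR-Instruct-style prompt format.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/Lumosia-MoE-4x10.7"  # hypothetical repo id, replace with the real one

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype in the config above
    device_map="auto",
)

prompt = "### User:\nExplain what a Mixture of Experts model is.\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```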