# bigstral-12b-32k-8xMoE
Made with the MoE branch of mergekit (`mergekit-moe`) using the following config:
```yaml
base_model: abacusai/bigstral-12b-32k
gate_mode: random
dtype: bfloat16
experts_per_token: 2
experts:
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
```
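To reproduce the merge, the config above can be passed to the `mergekit-moe` CLI. This is a sketch, assuming mergekit is installed from source with its MoE support and the config is saved as `config.yaml`; the output directory name is illustrative:

```shell
# Install mergekit (the MoE merge method lives in the mergekit-moe entry point)
pip install git+https://github.com/arcee-ai/mergekit.git

# Clone eight copies of the base model into an 8-expert MoE.
# gate_mode: random initializes the router gates randomly, so no
# positive_prompts are needed for gate calibration.
mergekit-moe config.yaml ./bigstral-12b-32k-8xMoE
```

Since `gate_mode` is `random` and all experts are identical copies of the base model, the resulting MoE behaves like the base model until the routers are further trained.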