Most other configs use hidden-state gates (`gate_mode: hidden`) with empty positive prompts, which I don't think works; this config uses random gates instead.
```yaml
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
gate_mode: random
dtype: bfloat16
experts:
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
    positive_prompts: [""]
```
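As a rough illustration of what `gate_mode: random` does at inference time, here is a minimal NumPy sketch of top-k MoE routing with a randomly initialized gate. The dimensions, the top-k value, and the stand-in `expert` function are all hypothetical simplifications, not mergekit internals; the point is that with eight identical experts, the renormalized mixture collapses back to a single expert's output regardless of what the random router picks.

```python
import numpy as np

# Hypothetical small shapes for illustration only.
rng = np.random.default_rng(0)
hidden_dim, num_experts, top_k = 16, 8, 2

gate = rng.normal(size=(hidden_dim, num_experts))  # randomly initialized router
x = rng.normal(size=(hidden_dim,))                 # one token's hidden state

# Router: softmax over expert logits for this token.
logits = x @ gate
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Select the top-k experts and renormalize their weights to sum to 1.
top = np.argsort(probs)[-top_k:]
weights = probs[top] / probs[top].sum()

def expert(h):
    # Stand-in for an expert MLP; here every expert is the same function,
    # mirroring eight copies of the same TinyLlama FFN.
    return 2.0 * h

# Mixture of identical experts == any single expert's output.
out = sum(w * expert(x) for w in weights)
```

Because the weights sum to 1 and the experts are clones, `out` equals `expert(x)` exactly, which is why this merge should behave like the base model until the routers are trained.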