# L3-SaoMerge-Della-v1

L3-SaoMerge-Della-v1 is a DELLA merge of the following models, created with [mergekit](https://github.com/arcee-ai/mergekit):
## 🧩 Configuration

```yaml
models:
  - model: Sao10K/L3-8B-Lunaris-v1
    parameters:
      weight: 0.35
  - model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      weight: 0.2
  - model: Sao10K/L3-8B-Niitama-v1
    parameters:
      weight: 0.25
  - model: Sao10K/L3-8B-Tamamo-v1
    parameters:
      weight: 0.2
base_model: Sao10K/L3-8B-Stheno-v3.2
merge_method: della
parameters:
  density: 0.5
  epsilon: 0.1
  lambda: 1.0
dtype: bfloat16
```
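For readers curious what `density`, `epsilon`, and `lambda` control: below is a minimal, illustrative sketch of DELLA-style delta merging on flat NumPy arrays. This is a simplified assumption-laden toy, not mergekit's implementation — the real merge operates per-tensor over full model checkpoints, and the exact magnitude-ranked drop schedule differs. Here, each fine-tuned model's delta from the base is stochastically pruned (higher-magnitude entries are kept more often, with drop probabilities spread ±`epsilon` around `1 - density`), rescaled to stay unbiased, weighted, summed, and scaled by `lambda` before being added back to the base.

```python
import numpy as np

def della_merge_sketch(base, finetuned, weights,
                       density=0.5, epsilon=0.1, lam=1.0, seed=0):
    """Toy DELLA-style merge of flat parameter vectors (illustrative only)."""
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base, dtype=np.float64)
    n = base.size
    for ft, w in zip(finetuned, weights):
        delta = ft - base
        # Rank entries by magnitude: rank 0 = smallest |delta|.
        ranks = np.argsort(np.argsort(np.abs(delta)))
        # Drop probability varies linearly from (1-density)+epsilon for the
        # smallest-magnitude entry down to (1-density)-epsilon for the largest.
        drop_p = (1.0 - density) + epsilon * (1.0 - 2.0 * ranks / max(n - 1, 1))
        drop_p = np.clip(drop_p, 0.0, 1.0)
        keep = rng.random(n) > drop_p
        # Rescale kept entries by 1/keep_prob so the pruned delta is unbiased.
        pruned = np.where(keep, delta / np.maximum(1.0 - drop_p, 1e-8), 0.0)
        merged_delta += w * pruned
    return base + lam * merged_delta
```

With the weights above (0.35 + 0.2 + 0.25 + 0.2 = 1.0), the merged deltas form a convex combination of the four models' pruned task vectors, and `lambda: 1.0` applies them at full strength.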