Yeet_51b_200k / recipe.txt
Stack A
- model: aezakmi
  layer_range: [0, 15]
- model: megamerge
  layer_range: [5, 35]
- model: rpbird
  layer_range: [25, 55]
- model: aezakmi
  layer_range: [45, 60]
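Each stack splices overlapping slices of the three source models into one deeper model (90 layers per stack with 10-layer overlaps between adjacent slices, assuming mergekit's end-exclusive layer_range convention). A minimal sketch of how Stack A could be written as a mergekit passthrough config, assuming that tool was used; the short model names stand in for the full repository paths and the dtype is illustrative:

slices:
  - sources:
      - model: aezakmi
        layer_range: [0, 15]
  - sources:
      - model: megamerge
        layer_range: [5, 35]
  - sources:
      - model: rpbird
        layer_range: [25, 55]
  - sources:
      - model: aezakmi
        layer_range: [45, 60]
merge_method: passthrough
dtype: float16

Stack B and Stack C below follow the same slice pattern with the source models rotated.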
Stack B
- model: megamerge
  layer_range: [0, 15]
- model: rpbird
  layer_range: [5, 35]
- model: aezakmi
  layer_range: [25, 55]
- model: megamerge
  layer_range: [45, 60]
Stack C
- model: rpbird
  layer_range: [0, 15]
- model: aezakmi
  layer_range: [5, 35]
- model: megamerge
  layer_range: [25, 55]
- model: rpbird
  layer_range: [45, 60]
Linear merge
- model: StackA
  weight: 1.0
- model: StackB
  weight: 1.0
- model: StackC
  weight: 1.0
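The three stacks are then blended parameter by parameter. With equal weights, a linear merge amounts to an unweighted average of the three stacks (mergekit normalizes the weights by default). A minimal sketch of this final step as a mergekit linear config, assuming StackA, StackB, and StackC are the local output directories of the passthrough merges above; the dtype is illustrative:

models:
  - model: StackA
    parameters:
      weight: 1.0
  - model: StackB
    parameters:
      weight: 1.0
  - model: StackC
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16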