BlackSheep-X-Dolphin / mergekit_config.yml
A slightly unconventional merge to create something bad :)
#### BEST CONFIGURATION ####
slices:
  - sources:
      - layer_range: [0, 4]
        model: TroyDoesAI/BlackSheep
  - sources:
      - layer_range: [2, 6]
        model: cognitivecomputations/dolphin-2.9.3-llama-3-8b
  - sources:
      - layer_range: [4, 8]
        model: TroyDoesAI/BlackSheep
  - sources:
      - layer_range: [6, 10]
        model: cognitivecomputations/dolphin-2.9.3-llama-3-8b
  - sources:
      - layer_range: [8, 12]
        model: TroyDoesAI/BlackSheep
  - sources:
      - layer_range: [10, 14]
        model: cognitivecomputations/dolphin-2.9.3-llama-3-8b
  - sources:
      - layer_range: [12, 16]
        model: TroyDoesAI/BlackSheep
  - sources:
      - layer_range: [14, 18]
        model: cognitivecomputations/dolphin-2.9.3-llama-3-8b
merge_method: passthrough
dtype: float16
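
This is a passthrough interleave: each of the eight slices copies four decoder layers, alternating between TroyDoesAI/BlackSheep and cognitivecomputations/dolphin-2.9.3-llama-3-8b, with consecutive slices overlapping by two layers. Because passthrough concatenates slices instead of averaging weights, the output stacks 8 x 4 = 32 layers drawn from layers 0-17 of the two sources, in float16. A minimal sketch of running the config through mergekit's Python entry points follows; the paths are placeholders, and the calls mirror mergekit's documented programmatic example, so the exact API may differ between versions.

# Sketch: run this config with mergekit from Python.
# CONFIG_PATH and OUT_PATH are assumed placeholder paths, not part of the repo.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_PATH = "mergekit_config.yml"   # the YAML shown above
OUT_PATH = "./BlackSheep-X-Dolphin"   # where the merged weights are written

# Parse the YAML into mergekit's config object.
with open(CONFIG_PATH, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the passthrough merge and write the resulting model to OUT_PATH.
run_merge(
    merge_config,
    out_path=OUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU for the merge if present
        copy_tokenizer=True,             # carry a tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)

The same merge can likely be run from the shell with mergekit's CLI (mergekit-yaml, pointed at the config file and an output directory), which is how configs like this are usually executed.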