---
base_model:
- Sao10K/L3-8B-Niitama-v1
- Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- ArliAI/ArliAI-Llama-3-8B-Formax-v1.0
- nothingiisreal/L3-8B-Celeste-V1.2
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
- parts/celeniit14-20.sl
- Sao10K/L3-8B-Niitama-v1
- Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- ArliAI/ArliAI-Llama-3-8B-Formax-v1.0
- nothingiisreal/L3-8B-Celeste-V1.2
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
slices:
- sources:
  - layer_range: [0, 4]
    model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- sources:
  - layer_range: [1, 5]
    model: ArliAI/ArliAI-Llama-3-8B-Formax-v1.0
- sources:
  - layer_range: [4, 8]
    model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- sources:
  - layer_range: [5, 9]
    model: ArliAI/ArliAI-Llama-3-8B-Formax-v1.0
- sources:
  - layer_range: [8, 10]
    model: Sao10K/L3-8B-Niitama-v1
- sources:
  - layer_range: [6, 14]
    model: nothingiisreal/L3-8B-Celeste-V1.2
- sources:
  - layer_range: [0, 6]
    model: parts/celeniit14-20.sl
- sources:
  - layer_range: [20, 23]
    model: Sao10K/L3-8B-Niitama-v1
- sources:
  - layer_range: [22, 26]
    model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- sources:
  - layer_range: [22, 28]
    model: nothingiisreal/L3-8B-Celeste-V1.2
- sources:
  - layer_range: [25, 27]
    model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
- sources:
  - layer_range: [28, 30]
    model: Sao10K/L3-8B-Niitama-v1
- sources:
  - layer_range: [25, 32]
    model: nothingiisreal/L3-8B-Celeste-V1.2
parameters:
  int8_mask: true
merge_method: passthrough
dtype: bfloat16
```
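The passthrough method simply stacks the listed layer slices in order, so the merged model's depth is the sum of the slice lengths rather than the 32 layers of a standard Llama-3-8B. A quick sketch (layer ranges copied from the config above; the slice for `parts/celeniit14-20.sl` contributes its own 6 layers) illustrates the resulting depth:

```python
# Layer ranges copied verbatim from the merge configuration above.
# Passthrough merging concatenates these slices, so the merged model's
# total layer count is the sum of (end - start) over all slices.
slices = [
    (0, 4), (1, 5), (4, 8), (5, 9), (8, 10), (6, 14), (0, 6),
    (20, 23), (22, 26), (22, 28), (25, 27), (28, 30), (25, 32),
]

total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 56 layers in the merged model
```

Note that `layer_range: [0, 4]` is half-open in mergekit: it takes layers 0 through 3.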