---
license: apache-2.0
tags:
- merge
- mergekit
---

# tinyllama_frankenmerge

This model is a merge of the following models, made with [mergekit](https://github.com/cg123/mergekit):

* [TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T)
* [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T)

## 🧩 Configuration

```yml
slices:
  - sources:
      - model: TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T
        layer_range: [0, 16]
  - sources:
      - model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
        layer_range: [6, 22]
merge_method: passthrough
dtype: float16
```
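
To make the `passthrough` merge concrete, here is a small illustrative sketch (not part of mergekit) of how the slices above are stacked: each `layer_range` is taken as end-exclusive, so the merged model concatenates layers 0–15 of the first checkpoint with layers 6–21 of the second, duplicating the overlapping range. The `build_layer_stack` helper and the shortened model names are hypothetical, for illustration only.

```python
def build_layer_stack(slices):
    """Concatenate layer slices in order, as a passthrough merge does."""
    stack = []
    for s in slices:
        model, (start, end) = s["model"], s["layer_range"]
        # layer_range is assumed end-exclusive here
        stack.extend((model, i) for i in range(start, end))
    return stack

# Shortened names standing in for the two TinyLlama checkpoints above
slices = [
    {"model": "TinyLlama-2.5T", "layer_range": (0, 16)},
    {"model": "TinyLlama-3T", "layer_range": (6, 22)},
]

stack = build_layer_stack(slices)
print(len(stack))  # 32 layers in the merged model (vs. 22 in each source)

# Layer indices present in both slices (taken once from each checkpoint)
overlap = ({i for m, i in stack if m == "TinyLlama-2.5T"}
           & {i for m, i in stack if m == "TinyLlama-3T"})
print(sorted(overlap))  # [6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
```

This is why the frankenmerge is deeper than either parent: 16 + 16 = 32 transformer layers, with the middle span contributed twice, once from each checkpoint.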