---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Chaifighter Latte 14B
## The Deets
### Mergekit
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
### Merge Method
This model was merged using the passthrough merge method.
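A passthrough merge does not average weights; it stacks selected layer ranges from the source models end to end to build a deeper model. A toy sketch of the idea (the model names and layer counts here are made up for illustration):

```python
# Toy illustration of a passthrough merge: layer ranges from different
# source models are concatenated in order, with no weight averaging.
model_a_layers = [f"A{i}" for i in range(4)]  # hypothetical 4-layer model A
model_b_layers = [f"B{i}" for i in range(4)]  # hypothetical 4-layer model B

# Analogous to two slices: take A layers [0, 3), then B layers [1, 4).
merged = model_a_layers[0:3] + model_b_layers[1:4]
print(merged)  # ['A0', 'A1', 'A2', 'B1', 'B2', 'B3']
```

The merged stack keeps each source layer's weights untouched; only the depth and ordering of the resulting model change.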
### Models Merged

The following models were included in the merge:

* [SanjiWatsuki/Kunoichi-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-7B)
* [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
* [Sao10K/Frostwind-v2.1-m7](https://huggingface.co/Sao10K/Frostwind-v2.1-m7)
* [Gryphe/MythoMist-7b](https://huggingface.co/Gryphe/MythoMist-7b)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: float32
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 16]
    model: D:/MLnonsense/models/SanjiWatsuki_Kunoichi-7B
- sources:
  - layer_range: [0, 8]
    model: mega\Kuno-Fimbul-splice
- sources:
  - layer_range: [16, 32]
    model: D:/MLnonsense/models/Sao10K_Fimbulvetr-11B-v2
- sources:
  - layer_range: [0, 8]
    model: mega\Fimbul-Frosty-Mytho-splice
- sources:
  - layer_range: [16, 32]
    model: mega\Frosty-Mytho
```
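A quick sanity check of the merged model's depth, assuming mergekit's `layer_range` is half-open (so `[a, b]` contributes `b - a` layers): the five slices stack to 64 transformer layers, double a Mistral-7B-class model's 32, which is roughly where the "14B" in the name comes from.

```python
# Sum the layers contributed by each slice in the config above,
# assuming each layer_range [a, b] contributes b - a layers.
layer_ranges = [(0, 16), (0, 8), (16, 32), (0, 8), (16, 32)]
total_layers = sum(b - a for a, b in layer_ranges)
print(total_layers)  # 64
```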