---
base_model:
- tiiuae/falcon-180B
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# largefalcon
This is a frankenmerge (layer-stacked self-merge) of [tiiuae/falcon-180B](https://huggingface.co/tiiuae/falcon-180B) created using [mergekit](https://github.com/cg123/mergekit). It stacks seven overlapping 20-layer slices of the base model, each offset by 10 layers from the last, expanding the original 80-layer network to 140 layers; the slice arithmetic is sketched below.
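As a quick sanity check on that layer count, here is an illustrative sketch (not part of the merge itself) that reproduces the stacking implied by the configuration in the next section:

```python
# Illustrative only: compute the layer stack implied by the passthrough
# config below, which concatenates overlapping slices of falcon-180B.
slices = [(start, start + 20) for start in range(0, 70, 10)]  # [0,20] ... [60,80]
merged_layers = [layer for start, end in slices for layer in range(start, end)]

print(len(slices))         # 7 slices of 20 layers each
print(len(merged_layers))  # 140 layers in largefalcon vs. 80 in falcon-180B
```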
## Merge Details
The following YAML configuration was used to produce this model:
```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 20]
    model: tiiuae/falcon-180B
- sources:
  - layer_range: [10, 30]
    model: tiiuae/falcon-180B
- sources:
  - layer_range: [20, 40]
    model: tiiuae/falcon-180B
- sources:
  - layer_range: [30, 50]
    model: tiiuae/falcon-180B
- sources:
  - layer_range: [40, 60]
    model: tiiuae/falcon-180B
- sources:
  - layer_range: [50, 70]
    model: tiiuae/falcon-180B
- sources:
  - layer_range: [60, 80]
    model: tiiuae/falcon-180B
```
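## Usage
A minimal, untested sketch of loading the merged model with `transformers`; the repo id (`monology/largefalcon`) and the sharding settings are assumptions to adapt to your hardware. Note that the result is substantially larger than the 180B base model, so multi-GPU or offloaded loading is required in practice.

```python
# A minimal sketch, assuming the merged weights live at monology/largefalcon.
# device_map="auto" requires the `accelerate` package to be installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "monology/largefalcon"  # assumed repo id for this merge

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge dtype above
    device_map="auto",          # shard across available GPUs / offload to CPU
)

inputs = tokenizer("The falcon soared over", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```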