---
base_model:
- lizpreciatior/lzlv_70b_fp16_hf
- 152334H/miqu-1-70b-sf
tags:
- mergekit
- merge
---
# This is a merge test, do not use (probably)
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
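For intuition, a linear merge is just a weighted average of the corresponding parameter tensors from each model. The sketch below illustrates that idea; it is not mergekit's actual implementation, and it assumes both checkpoints share the same architecture and parameter names. With the weights used here (1 and 0.7) and normalized weights, each merged tensor works out to roughly (1·miqu + 0.7·lzlv) / 1.7.

```python
# Illustrative sketch of a linear (weighted-average) merge -- not mergekit's
# implementation. Assumes both state dicts share identical parameter names.
import torch

def linear_merge(state_dicts, weights):
    """Return the normalized weighted average of matching tensors."""
    total = sum(weights)
    merged = {}
    for name in state_dicts[0]:
        acc = sum(w * sd[name].float() for sd, w in zip(state_dicts, weights))
        merged[name] = (acc / total).to(torch.float16)  # float16, as in the config below
    return merged

# Toy tensors standing in for the full 70B checkpoints.
miqu = {"layer.weight": torch.randn(4, 4)}
lzlv = {"layer.weight": torch.randn(4, 4)}
merged = linear_merge([miqu, lzlv], weights=[1.0, 0.7])
```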
### Models Merged
The following models were included in the merge:
* [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf)
* [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: 152334H/miqu-1-70b-sf
    parameters:
      weight: 1
  - model: lizpreciatior/lzlv_70b_fp16_hf
    parameters:
      weight: 0.7
merge_method: linear
dtype: float16
```
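The merge itself can be reproduced by feeding this config to mergekit's `mergekit-yaml` entry point along with an output directory. If you do want to try the merged weights despite the warning above, they load like any other Llama-style checkpoint. The snippet below is a hypothetical usage sketch; the local path `./miqu-lzlv` is illustrative and could be replaced with the Hub repo id.

```python
# Hypothetical usage sketch -- the path "./miqu-lzlv" is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./miqu-lzlv")
model = AutoModelForCausalLM.from_pretrained(
    "./miqu-lzlv",
    torch_dtype=torch.float16,  # matches the merge's dtype
    device_map="auto",
)

prompt = "Write a short greeting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```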