---
license: apache-2.0
---
This is a test merge of several Gemma-7b finetunes using task_arithmetic. Testing confirms the merged model works correctly.
Merge config:
```yaml
models:
  - model: gemma-7b-it-fp16
    parameters:
      weight: 1
  - model: CorticalStack_gemma-7b-ultrachat-sft
    parameters:
      weight: 1
  - model: cloudyu_google-gemma-7b-it-dpo-v1
    parameters:
      weight: 1
  - model: abideen_gemma-7b-openhermes
    parameters:
      weight: 1
merge_method: task_arithmetic
base_model: gemma-7b-base
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
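For intuition, task arithmetic builds a "task vector" (finetune minus base) per model and adds their weighted sum back onto the base; with `normalize: true`, the weights are rescaled to sum to 1. A minimal toy sketch with NumPy arrays standing in for model tensors (two hypothetical finetunes, not the actual checkpoints above):

```python
import numpy as np

# Toy stand-ins for model weight tensors (hypothetical values).
base = np.array([1.0, 2.0, 3.0])
finetunes = [np.array([1.5, 2.0, 3.0]),
             np.array([1.0, 2.5, 3.0])]
weights = [1.0, 1.0]

# normalize: true -> rescale weights so they sum to 1.
total = sum(weights)
weights = [w / total for w in weights]

# Task vector = finetune - base; merged = base + weighted sum of task vectors.
merged = base + sum(w * (ft - base) for w, ft in zip(weights, finetunes))
print(merged)  # [1.25 2.25 3.  ]
```

The real merge is produced by pointing mergekit's `mergekit-yaml` CLI at the config above; the snippet only illustrates the arithmetic it performs per tensor.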