---
base_model:
- CorticalStack/pastiche-crown-clown-7b-dare-dpo
library_name: transformers
tags:
- mergekit
- merge
---
# Clown-DPO-Extended
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method, which copies layers directly from the source model(s) without averaging or interpolating weights.
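In this config, each duplicated layer has its `o_proj` and `down_proj` outputs scaled to 0, so the copy initially contributes nothing to the residual stream and acts as an identity layer. The following toy sketch (plain Python, not mergekit code; the block structure is a deliberately simplified stand-in for a real transformer layer) illustrates why:

```python
# Toy illustration: zeroing the attention output projection (o_proj) and the
# MLP down projection (down_proj) makes a residual block behave as identity.

def residual_block(x, attn_out_scale, mlp_down_scale):
    """Minimal stand-in for a transformer layer: x + attn(x), then + mlp(...).
    The two scale arguments mimic the `scale` filters on o_proj and down_proj."""
    attn = [attn_out_scale * (0.5 * v) for v in x]   # pretend attention output
    h = [a + b for a, b in zip(x, attn)]             # first residual add
    mlp = [mlp_down_scale * (2.0 * v) for v in h]    # pretend MLP output
    return [a + b for a, b in zip(h, mlp)]           # second residual add

x = [1.0, -2.0, 3.0]
# Duplicated layer with scale 0 on o_proj / down_proj: contributes nothing.
assert residual_block(x, 0, 0) == x
# A normal layer (scale 1 everywhere) transforms the input.
assert residual_block(x, 1, 1) != x
```

This is the same idea used in depth-upscaling self-merges: the duplicated layers start out inert and only matter after further fine-tuning.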
### Models Merged
The following models were included in the merge:
* [CorticalStack/pastiche-crown-clown-7b-dare-dpo](https://huggingface.co/CorticalStack/pastiche-crown-clown-7b-dare-dpo)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [0, 4]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [3, 4]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [4, 8]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [7, 8]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [8, 12]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [11, 12]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [12, 16]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [15, 16]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [16, 20]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [19, 20]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [20, 24]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [23, 24]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [24, 28]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [27, 28]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [28, 32]
- sources:
  - model: CorticalStack/pastiche-crown-clown-7b-dare-dpo
    layer_range: [31, 32]
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
merge_method: passthrough
dtype: bfloat16
```
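The slices above interleave eight 4-layer blocks from the 32-layer base model with a duplicate of each block's final layer. A short sketch of the resulting depth (plain Python mirroring the layer ranges in the config):

```python
# Layer ranges copied from the config above: each 4-layer block is followed
# by a duplicate of its last layer (with o_proj/down_proj scaled to 0).
slices = []
for start in range(0, 32, 4):
    slices.append((start, start + 4))      # main block, e.g. layers 0-3
    slices.append((start + 3, start + 4))  # duplicated last layer, e.g. layer 3

total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 40 layers in the merged model, up from 32 in the base
```

Passthrough simply concatenates these slices, so the merged model is a depth-upscaled version of the base: 32 original layers plus 8 initially-inert duplicates.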