---
license: apache-2.0
tags:
  - mergekit
  - merge
---

# Nontoxic-PiVoT-Bagel-RP-34b

This is a merge of pre-trained language models created using mergekit.

GGUFs by mradermacher if you want 'em:

https://huggingface.co/mradermacher/Nontoxic-PiVoT-Bagel-RP-34b-GGUF

## Merge Details

Just uploading one of my stew's merge pieces here for convenience. If you want to use it, you can. Vicuna should work well as the prompt format.
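If you're wiring this up yourself, the usual Vicuna-1.1-style template looks roughly like the sketch below. This is an illustrative helper, not an official template from this repo; the system line and turn markers are the commonly used ones, and `vicuna_prompt` is a hypothetical name. Check against whatever your frontend expects.

```python
# Minimal sketch of a Vicuna-style prompt (system line + USER/ASSISTANT turns).
# The default system message here is the one commonly paired with Vicuna models;
# adjust it (and spacing) to match your inference frontend.
DEFAULT_SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's "
    "questions."
)

def vicuna_prompt(user_message: str, system: str = DEFAULT_SYSTEM) -> str:
    # Generation should continue right after "ASSISTANT:".
    return f"{system} USER: {user_message} ASSISTANT:"

print(vicuna_prompt("Write a haiku about rain."))
```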

### Merge Method

This model was merged using the SLERP merge method.
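For intuition, SLERP interpolates along the arc between two weight vectors rather than along the straight line, which preserves magnitude better than plain averaging. The NumPy sketch below is illustrative only (mergekit's real implementation works per-tensor and applies the `t` schedule from the config; the function name here is mine).

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight vectors."""
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)              # angle between the two vectors
    if abs(theta) < eps:                # nearly parallel: fall back to lerp
        return (1 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # halfway along the arc; stays on the unit circle
```

In the config below, `t` is not a single number but a schedule: the five values per filter are interpolated across the layer stack, so attention and MLP weights blend between the two parents with different ratios at different depths, and everything else uses a flat 0.5.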

### Models Merged

The following models were included in the merge:

https://huggingface.co/jondurbin/nontoxic-bagel-34b-v0.2

https://huggingface.co/maywell/PiVoT-SUS-RP

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: PiVoT-SUS-RP
        layer_range: [0, 60]
      - model: nontoxic-bagel-34b-v0.2
        layer_range: [0, 60]
merge_method: slerp
base_model: nontoxic-bagel-34b-v0.2
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
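To reproduce the merge locally, you can feed that YAML to mergekit's CLI. The command below is a sketch assuming mergekit is installed (`pip install mergekit`) and both parent models are resolvable locally or on the Hub; the config filename and output directory are placeholders.

```shell
# Save the YAML above as config.yml, then run the merge:
mergekit-yaml config.yml ./Nontoxic-PiVoT-Bagel-RP-34b
```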