Pearl-7B-0210-ties / mergekit_config.yml
models:
  - model: OpenPipe/mistral-ft-optimized-1227
  - model: louisbrulenaudet/Pearl-7B-slerp
    parameters:
      density: 0.5
      weight: 0.4
  - model: WizardLM/WizardMath-7B-V1.1
    parameters:
      density: 0.5
      weight: 0.2
  - model: cognitivecomputations/WestLake-7B-v2-laser
    parameters:
      density: 0.5
      weight: 0.2
  - model: CultriX/NeuralTrix-7B-dpo
    parameters:
      density: 0.5
      weight: 0.2
merge_method: ties
base_model: OpenPipe/mistral-ft-optimized-1227
parameters:
  normalize: true
  int8_mask: true
dtype: float16
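
For reference, this configuration performs a TIES merge: each non-base model's parameter deltas relative to OpenPipe/mistral-ft-optimized-1227 are trimmed to the fraction given by density, sign conflicts are resolved, and the surviving deltas are combined using the given weight coefficients (normalize: true rescales the weights to sum to 1, and int8_mask stores the intermediate sign masks in int8 to save memory). Below is a minimal sketch of running this merge via mergekit's documented Python entry points, assuming the YAML above is saved as mergekit_config.yml; the output path is a placeholder:

import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML config above into mergekit's validated config object.
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the TIES merge and write the merged model to disk.
run_merge(
    merge_config,
    out_path="./Pearl-7B-0210-ties",  # placeholder output directory
    options=MergeOptions(
        cuda=False,            # set True to run the merge on GPU
        copy_tokenizer=True,   # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)

The one-line CLI equivalent is: mergekit-yaml mergekit_config.yml ./Pearl-7B-0210-ties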