---
base_model:
- Test157t/Pasta-Lake-7b
- Test157t/Prima-LelantaclesV4-7b-16k
library_name: transformers
tags:
- mergekit
- merge
license: other
---
Update: Getting surprisingly good results at 16384 context, which is unexpected given that the Mistral models going into this merge normally work at around 8192 context.
Thanks to @Lewdiculous for the quants: https://huggingface.co/Lewdiculous/Prima-LelantaclesV5-7b-GGUF
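As a rough usage sketch (not part of the original card), one of the linked GGUF quants can be loaded with llama-cpp-python at the 16384 context mentioned above. The quant filename, prompt, and sampling settings below are assumptions; use whichever file you download from the GGUF repo.

```python
# Minimal sketch: running a GGUF quant of this merge at 16384 context
# with llama-cpp-python. Filename and sampling settings are assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="Prima-LelantaclesV5-7b.Q4_K_M.gguf",  # assumed quant filename
    n_ctx=16384,       # extended context the update above reports working
    n_gpu_layers=-1,   # offload all layers if a GPU is available
)

out = llm(
    "Write a short scene set beside a mountain lake.",
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```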
This model was merged using the DARE TIES merge method, with Test157t/Prima-LelantaclesV4-7b-16k as the base.
The following models were included in the merge:
- Test157t/Pasta-Lake-7b
- Test157t/Prima-LelantaclesV4-7b-16k
Configuration
The following YAML configuration was used to produce this model:
```yaml
merge_method: dare_ties
base_model: Test157t/Prima-LelantaclesV4-7b-16k
parameters:
  normalize: true
models:
  - model: Test157t/Pasta-Lake-7b
    parameters:
      weight: 1
  - model: Test157t/Prima-LelantaclesV4-7b-16k
    parameters:
      weight: 1
dtype: float16
```
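To reproduce the merge, the config above can be saved to a file and run either with the `mergekit-yaml` CLI or through mergekit's Python entry points. The sketch below uses the Python route; the file and output paths and the `MergeOptions` values are assumptions, not the exact settings used for this release.

```python
# Sketch only: re-running the DARE TIES merge from the YAML config above.
# Paths and MergeOptions values are assumptions.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML shown in the Configuration section (saved locally).
with open("prima-lelantacles-v5.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Prima-LelantaclesV5-7b",  # assumed output directory
    options=MergeOptions(
        cuda=False,           # set True to merge on a GPU
        copy_tokenizer=True,  # copy the tokenizer into the output folder
        lazy_unpickle=True,   # reduce peak RAM while reading shards
    ),
)
```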