---
base_model:
- grimjim/Mistral-7B-Instruct-demi-merge-v0.2-7B
- grimjim/kukulemon-7B
library_name: transformers
tags:
- mergekit
- merge
license: cc-by-nc-4.0
---
# kukulemon-32K-7B
This is an attempt at a merge capable of a functional 32K context length while being derived from kukulemon-7B. The working 32K context window has been folded in via a merge with Mistral 7B v0.2 models.

Although the resulting model natively supports the Alpaca prompt format, I've also tested it successfully with ChatML prompts. In my most recent testing, medium temperature (around 1) with low min-P (e.g., 0.01) worked well with ChatML prompts.
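Below is a minimal sketch of those sampling settings with `transformers`. It assumes this repo's model ID is `grimjim/kukulemon-32K-7B`, that your `transformers` version supports the `min_p` generation argument, and the hand-built ChatML prompt is purely illustrative.

```python
# Sketch only: model ID and prompt contents are assumptions, not guarantees.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "grimjim/kukulemon-32K-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# ChatML-style prompt, built by hand for illustration.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nSummarize the plot of Hamlet in two sentences.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,  # medium temperature, as suggested above
    min_p=0.01,       # low min-P, as suggested above (needs a recent transformers release)
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```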
This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
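As a rough illustration of the idea (not mergekit's actual implementation), SLERP interpolates between two weight tensors along an arc rather than a straight line, so the blended tensor keeps a sensible magnitude. The `slerp` helper below is a toy sketch under that assumption.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Toy spherical linear interpolation between two weight tensors.

    Illustrative only; mergekit's real SLERP handles edge cases and
    per-tensor details differently.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight vectors.
    omega = torch.acos(torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        mixed = (1 - t) * a_flat + t * b_flat
    else:
        sin_omega = torch.sin(omega)
        mixed = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
              + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)
```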
### Models Merged
The following models were included in the merge:
* grimjim/kukulemon-7B
* grimjim/Mistral-7B-Instruct-demi-merge-v0.2-7B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: grimjim/kukulemon-7B
        layer_range: [0, 32]
      - model: grimjim/Mistral-7B-Instruct-demi-merge-v0.2-7B
        layer_range: [0, 32]
# or, the equivalent models: syntax:
# models:
merge_method: slerp
base_model: grimjim/kukulemon-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
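For intuition, the short `t` lists above act as per-layer blend schedules. The sketch below assumes (this is an assumption about mergekit's behavior, not its code) that each anchor list is spread across the 32 layers by linear interpolation, giving every layer its own interpolation factor for `self_attn` and `mlp` tensors.

```python
# Hypothetical illustration of how a short anchor list could map to 32 layers.
def expand_gradient(anchors, num_layers=32):
    """Linearly interpolate a short anchor list over num_layers values."""
    if num_layers == 1:
        return [anchors[0]]
    out = []
    for i in range(num_layers):
        pos = i / (num_layers - 1) * (len(anchors) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(anchors) - 1)
        frac = pos - lo
        out.append(anchors[lo] * (1 - frac) + anchors[hi] * frac)
    return out

print(expand_gradient([0, 0.5, 0.3, 0.7, 1]))  # self_attn schedule
print(expand_gradient([1, 0.5, 0.7, 0.3, 0]))  # mlp schedule
```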