# munin-neuralbeagle-7b
This is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit).
## Merge Details

### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [danish-foundation-models/munin-7b-alpha](https://huggingface.co/danish-foundation-models/munin-7b-alpha) as the base model.
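In rough terms, DARE randomly drops a fraction (1 − `density`) of each fine-tune's parameter deltas and rescales the survivors, and TIES resolves sign conflicts among the remaining deltas before adding them back to the base. A minimal single-tensor sketch of that idea (not mergekit's actual implementation; the `density` and `weight` values mirror the configuration below, and the TIES sign election is trivial here because only one donor model is merged):

```python
import torch

def dare_ties_single_tensor(base: torch.Tensor,
                            finetuned: torch.Tensor,
                            density: float = 0.53,
                            weight: float = 0.6) -> torch.Tensor:
    # Task vector: what the fine-tune changed relative to the base.
    delta = finetuned - base
    # DARE: keep each delta entry with probability `density` ...
    mask = torch.bernoulli(torch.full_like(delta, density))
    # ... and rescale the kept entries so the expected update is unchanged.
    delta = delta * mask / density
    # With several donor models, TIES would keep only the deltas that agree
    # with the majority sign per parameter before averaging them.
    return base + weight * delta
```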
### Models Merged
The following models were included in the merge:

* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: danish-foundation-models/munin-7b-alpha
    # No parameters necessary for base model
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.53
      weight: 0.6
merge_method: dare_ties
base_model: danish-foundation-models/munin-7b-alpha
parameters:
  int8_mask: true
dtype: bfloat16
```
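As a sketch of how the merge could be reproduced, assuming mergekit is installed (`pip install mergekit`) and the configuration above is saved as `config.yaml`, the Python API behind the `mergekit-yaml` CLI can be invoked directly (the output directory name here is an assumption):

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above.
with open("config.yaml", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to a local directory.
run_merge(
    config,
    out_path="./munin-neuralbeagle-7b",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
    ),
)
```

Equivalently, the CLI can be pointed at the same file: `mergekit-yaml config.yaml ./munin-neuralbeagle-7b --cuda`.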