# munin-neuralbeagle-7b
The model is based on danish-foundation-models/munin-7b-alpha, with mlabonne/NeuralBeagle14-7B merged into it using the configuration outlined below. As of 28 January 2024, it is ranked 2nd on the Mainland Scandinavian NLG leaderboard (after GPT-3.5). This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the DARE TIES merge method, with danish-foundation-models/munin-7b-alpha as the base.
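For intuition, here is a minimal, self-contained sketch of the DARE step only (the TIES sign-election step and mergekit's actual implementation are omitted); the function name and toy tensors are illustrative, while `density` and `weight` mirror the values in the configuration below:

```python
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    """DARE: randomly keep a `density` fraction of the delta (fine-tuned
    minus base) weights and rescale the survivors by 1/density, so the
    expected contribution of the delta is preserved."""
    mask = (torch.rand_like(delta) < density).to(delta.dtype)
    return delta * mask / density

# Toy example: merge one weight tensor with density 0.53 and weight 0.6,
# matching the NeuralBeagle14-7B parameters in the configuration below.
base = torch.randn(4, 4)                 # stands in for a munin-7b-alpha weight
tuned = base + 0.1 * torch.randn(4, 4)   # stands in for NeuralBeagle14-7B
merged = base + 0.6 * dare_sparsify(tuned - base, density=0.53)
```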
### Models Merged
The following models were included in the merge:
- mlabonne/NeuralBeagle14-7B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: danish-foundation-models/munin-7b-alpha
    # No parameters necessary for base model
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.53
      weight: 0.6
merge_method: dare_ties
base_model: danish-foundation-models/munin-7b-alpha
parameters:
  int8_mask: true
dtype: bfloat16
```
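To reproduce the merge, the configuration can be run through mergekit. Below is a minimal sketch, assuming the YAML above is saved as `config.yaml` and mergekit is installed (`pip install mergekit`); the import paths follow mergekit's documented Python usage but may vary across versions, and the output path is illustrative:

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML merge configuration shown above.
with open("config.yaml", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the DARE TIES merge and write the merged model to disk.
run_merge(
    merge_config,
    out_path="./munin-neuralbeagle-7b",
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```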
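Once downloaded, the merged model can be loaded like any Hugging Face checkpoint. A minimal inference sketch using transformers; the Danish prompt is illustrative, and no chat template is applied:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RJuro/munin-neuralbeagle-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Hvad er hovedstaden i Danmark?"  # "What is the capital of Denmark?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```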