---
language:
- da
- sv
license: cc-by-4.0
library_name: transformers
tags:
- merge
- mergekit
base_model:
- danish-foundation-models/munin-7b-alpha
- timpal0l/Mistral-7B-v0.1-flashback-v2
---
# Danish-Swedish Merged Model
This is a merge of the following models, both based on [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1):

- [danish-foundation-models/munin-7b-alpha](https://huggingface.co/danish-foundation-models/munin-7b-alpha), with continued pretraining on Danish data;
- [timpal0l/Mistral-7B-v0.1-flashback-v2](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2), with continued pretraining on Swedish data.
## Model Details
- Merged by: Dan Saattrup Nielsen
- Model type: Decoder model, based on [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- Language(s): Danish and Swedish
- License: CC-BY-4.0
- Merge configuration:
```python
dict(
    models=[
        dict(
            model="danish-foundation-models/munin-7b-alpha",
            parameters=dict(weight=1.0),
        ),
        dict(
            model="timpal0l/Mistral-7B-v0.1-flashback-v2",
            parameters=dict(weight=1.0),
        ),
    ],
    merge_method="ties",
    base_model="mistralai/Mistral-7B-v0.1",
    parameters=dict(
        int8_mask=True,
        normalize=True,
    ),
    dtype="bfloat16",
)
```
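To give an intuition for what `merge_method="ties"` does in the configuration above, here is a toy sketch of the TIES merge step on plain Python lists. This is a hypothetical simplification: the real implementation in mergekit operates on full model tensors and also trims low-magnitude entries before sign election, which is omitted here.

```python
def ties_merge(base, models, weights=None):
    """Toy TIES merge of parameter vectors (plain Python lists).

    Each model contributes a task vector (its delta from the shared
    base model). Per parameter, the dominant sign is elected, only
    deltas agreeing with that sign are kept, and the kept deltas are
    averaged (roughly what normalize=True does) before being added
    back onto the base.
    """
    if weights is None:
        weights = [1.0] * len(models)
    # Task vectors: each model's delta from the shared base.
    deltas = [[m - b for m, b in zip(model, base)] for model in models]
    merged = []
    for i, b in enumerate(base):
        vals = [w * d[i] for w, d in zip(weights, deltas)]
        # Elect the dominant sign by summed weighted mass.
        sign = 1.0 if sum(vals) >= 0 else -1.0
        # Keep only deltas agreeing with the elected sign.
        kept = [(v, w) for v, w in zip(vals, weights) if v * sign > 0]
        if kept:
            merged_delta = sum(v for v, _ in kept) / sum(w for _, w in kept)
        else:
            merged_delta = 0.0
        merged.append(b + merged_delta)
    return merged

# Illustrative stand-ins for base, munin-7b-alpha, and flashback-v2:
base = [0.0, 1.0, -1.0]
munin = [0.5, 1.0, -2.0]
flashback = [0.3, 0.6, 0.0]
print(ties_merge(base, [munin, flashback]))
```

Where the two fine-tunes pull a parameter in opposite directions (the last entry above), only the deltas matching the elected sign survive, which is the conflict-resolution step that distinguishes TIES from a plain weighted average.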