---
license: apache-2.0
pipeline_tag: text-generation
language:
- da
tags:
- pretrained
inference:
  parameters:
    temperature: 0.7
datasets:
- DDSC/partial-danish-gigaword-no-twitter
base_model: mistralai/Mistral-7B-v0.1
---
# Model Card for Munin 7B Alpha
The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, based on Mistral-7B-v0.1.
It has been trained on Danish Gigaword using continual pretraining.
For full details of this model, please read our release blog post.
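As a quick orientation, here is a minimal text-generation sketch using the `transformers` library. The repo id `danish-foundation-models/munin-7b-alpha` is an assumption (use the actual Hub id of this model), and the sampling temperature simply mirrors the `inference` parameter in the metadata above.

```python
# Minimal sketch, assuming the model is hosted on the Hugging Face Hub under
# the (assumed) repo id "danish-foundation-models/munin-7b-alpha".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "danish-foundation-models/munin-7b-alpha"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Munin 7B Alpha is a base model, so it continues text rather than
# following instructions.
prompt = "Danmark er et land i"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.7,  # matches the inference temperature in the metadata
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```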
## Notice
Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.
## The Danish Foundation Models Team
- From the Center for Humanities Computing at Aarhus University:
  - Kenneth Enevoldsen (kenneth.enevoldsen@cas.au.dk)
  - Lasse Hansen (lasse.hansen@clin.au.dk)
  - Kristoffer Laigaard Nielbo (kln@cas.au.dk)
- From the Alexandra Institute:
  - Peter Bjørn Jørgensen (peter.jorgensen@alexandra.dk)
  - Rasmus Larsen (rasmus.larsen@alexandra.dk)
  - Dan Saattrup Nielsen (dan.nielsen@alexandra.dk)