---
license: apache-2.0
pipeline_tag: text-generation
language:
  - da
tags:
  - pretrained
inference:
  parameters:
    temperature: 0.7
datasets:
  - DDSC/partial-danish-gigaword-no-twitter
base_model: mistralai/Mistral-7B-v0.1
---

# Model Card for Munin 7B Alpha

The Munin 7B Alpha Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters, based on Mistral-7B-v0.1.

It has been trained on Danish Gigaword using continual pretraining.

For full details of this model, please read our release blog post.
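
Below is a minimal usage sketch with the `transformers` library. The repository id `danish-foundation-models/munin-7b-alpha`, the prompt, and the generation settings are illustrative assumptions rather than an official snippet; adjust them to your setup.

```python
# Minimal sketch: load the model and generate a Danish text completion.
# The repository id below is assumed -- substitute the actual Hub id if it differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "danish-foundation-models/munin-7b-alpha"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision so the 7B model fits on a single GPU
    device_map="auto",           # requires the `accelerate` package
)

# This is a pretrained base model, so prompt it with text to continue,
# not with chat-style instructions.
prompt = "Danmark er et land i Skandinavien, som"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,  # matches the inference temperature in the metadata above
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```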

## Notice

Munin 7B Alpha is, like Mistral 7B, a pretrained base model and therefore does not have any moderation mechanisms.

## The Danish Foundation Models Team

### With Support From

Danish e-infrastructure Consortium and the Danish Defence.