Heidrun Logo

Model description

Heidrun-Mistral-7B-base is a generative text model based on Mistral-7B. It has been further pretrained on a subset of the Danish corpus from Wikipedia, Wikibooks and small parts of Hestenettet for 2 epochs.

It is a foundational/completion model with potential for further finetuning.

For inference or chatting please check out Heidrun-Mistral-7B-chat.
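Since this is a completion model, it can be used directly for plain-text continuation. Below is a minimal sketch using the Transformers library, assuming `transformers` and `torch` are installed; the model id is taken from this card, and the weights are downloaded on first use. The Danish prompt is an illustrative example, not from the card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Mabeck/Heidrun-Mistral-7B-base"

def complete(prompt: str, max_new_tokens: int = 50) -> str:
    """Generate a plain-text continuation (no chat template)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 weights
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example Danish prompt; the base model continues the text.
    print(complete("Danmark er et land i"))
```

Note that the base model will not follow instructions reliably; for that, use the chat variant mentioned above.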

Previous version

Please note that this has been updated since the original release. The old version can be found under branch v0.1.

Uploaded model

  • Developed by: Mabeck
  • Finetuned from model: mistralai/Mistral-7B-v0.1

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.

Model size: 7.24B parameters (BF16, safetensors).