---
language:
- en
- da
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: mistralai/Mistral-7B-v0.1
datasets:
- wikimedia/wikipedia
license: mit
---
<img src="https://huggingface.co/Mabeck/Heidrun-Mistral-7B-chat/resolve/main/heidrun.jpeg" alt="Heidrun Logo" width="400">

# Model description

Heidrun-Mistral-7B-base is a generative text model based on [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1). It has been further pretrained for 2 epochs on a Danish corpus comprising subsets of Wikipedia, Wikibooks, and small parts of Hestenettet.

It is a foundational/completion model with potential for further finetuning.

For inference or chatting please check out [Heidrun-Mistral-7B-chat](https://huggingface.co/Mabeck/Heidrun-Mistral-7B-chat).
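Since this is a base/completion model, it can be used for plain text continuation with the standard `transformers` API. A minimal sketch follows; the repository id `Mabeck/Heidrun-Mistral-7B-base` is assumed from the model name, so adjust it if the repo is named differently:

```python
# Minimal completion sketch for Heidrun-Mistral-7B-base.
# MODEL_ID is assumed from the model name; adjust to the actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Mabeck/Heidrun-Mistral-7B-base"  # assumed repo id

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Continue a prompt with the base model (no chat template)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # A Danish prompt; the model simply continues the text.
    print(generate("Danmark er et land i"))
```

Note that the base model has no chat template, so prompts are continued verbatim rather than answered conversationally.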

# Previous version

Please note that this model has been updated since its original release. The old version is available under the branch `v0.1`.
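If you need the pre-update weights, the `revision` argument of `from_pretrained` selects a specific branch. A short sketch (the repository id is assumed from the model name):

```python
# Load the old release from branch v0.1.
# The repository id is assumed from the model name; adjust if needed.
from transformers import AutoModelForCausalLM

REVISION = "v0.1"  # branch holding the previous version

if __name__ == "__main__":
    model = AutoModelForCausalLM.from_pretrained(
        "Mabeck/Heidrun-Mistral-7B-base", revision=REVISION
    )
```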


# Uploaded model

- **Developed by:** Mabeck
- **Finetuned from model:** mistralai/Mistral-7B-v0.1

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)