---
language:
- sv
- en
license: mit
tags:
- pretrained
pipeline_tag: text-generation
widget:
- text: Jag tycker att det är roligt med
---
# 🐈⬛ Mistral-7B-v0.1-flashback-v2

Mistral-7B-v0.1-flashback-v2 continues the pretraining of the base Mistral-7B-v0.1 model on roughly 40 GB of forum threads from the Swedish website flashback.org.
Training was a full finetune (all parameters updated) for one epoch.
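
## Usage

A minimal text-generation sketch with 🤗 Transformers. The repo id below is an assumption for illustration; replace it with the actual repository path where the model is published.

```python
# Minimal generation example; repo id is hypothetical, adjust to the real model path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mistral-7B-v0.1-flashback-v2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 7B model within a single-GPU memory budget
    device_map="auto",
)

# The widget example prompt from the model card metadata.
prompt = "Jag tycker att det är roligt med"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```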