---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---
## Lumimaid 0.2
<img src="https://cdn-uploads.huggingface.co/production/uploads/63ab1241ad514ca8d1430003/ep3ojmuMkFS-GmgRuI9iB.png" alt="Image" style="display: block; margin-left: auto; margin-right: auto; width: 65%;">
<div style="text-align: center; font-size: 30px;">
<a href="https://huggingface.co/NeverSleep/Lumimaid-v0.2-8B">8b</a> -
<a href="https://huggingface.co/NeverSleep/Lumimaid-v0.2-12B">[12b]</a> -
<a href="https://huggingface.co/NeverSleep/Lumimaid-v0.2-70B">70b</a> -
<a href="https://huggingface.co/NeverSleep/Lumimaid-v0.2-123B">123b</a>
</div>
### This model is based on: [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407)
Wandb: https://wandb.ai/undis95/Lumi-Mistral-Nemo?nw=nwuserundis95
Lumimaid 0.1 -> 0.2 is a HUGE step up dataset-wise.

As some people have told us our models are sloppy, Ikari decided to say fuck it and nuke out all the chats with the most slop.

Our dataset has stayed the same since day one: we added data over time, cleaned it, and repeated. After not releasing a model for a while because we were never satisfied, we think it's time to come back!
## Credits:
- Undi
- IkariDev
## Training data used:
We will list all the datasets we used here, please be patient while we gather them all back kek.
## Prompt template: Mistral
```
<s>{system prompt} [INST] {input} [/INST] {output}</s>
```
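A minimal sketch of building a prompt in this format with the `transformers` library, assuming the tokenizer shipped with the repo includes the Mistral chat template (the 12B repo name is taken from the links above; the example messages are placeholders). `apply_chat_template` inserts the `<s>`, `[INST]` and `[/INST]` markers for you, so you don't have to assemble the string by hand:

```python
from transformers import AutoTokenizer

# Assumption: the repo's tokenizer carries the Mistral chat template.
tokenizer = AutoTokenizer.from_pretrained("NeverSleep/Lumimaid-v0.2-12B")

messages = [
    {"role": "system", "content": "You are a helpful roleplay assistant."},  # placeholder system prompt
    {"role": "user", "content": "Hello!"},                                   # placeholder user turn
]

# Render the conversation into the model's expected prompt string
# without tokenizing, leaving it open for the model's reply.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```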
## Others
Undi: If you want to support us, you can [here](https://ko-fi.com/undiai).
IkariDev: Visit my [retro/neocities style website](https://ikaridevgit.github.io/) please kek