---
license: apache-2.0
---
I'm back! :D
A mergekit-made MoE built entirely from Apache-2.0 licensed models, so this lil guy is commercially usable, unlike my prior models.
Models used:
- cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
- lvkaokao/mistral-7b-finetuned-orca-dpo-v2
- Herman555/Hexoteric-AshhLimaRP-Mistral-7B-GGUF
- jondurbin/bagel-dpo-7b-v0.1
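
The actual merge config isn't published in this card, but a `mergekit-moe` recipe over these four experts generally looks like the sketch below. The base model choice, `gate_mode`, dtype, and `positive_prompts` here are illustrative assumptions, not the settings actually used:

```yaml
# Hypothetical mergekit-moe config -- base model, gate mode, and
# routing prompts are assumptions, not the real merge settings.
base_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
gate_mode: hidden          # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
    positive_prompts:
      - "help me with a general question"
  - source_model: lvkaokao/mistral-7b-finetuned-orca-dpo-v2
    positive_prompts:
      - "reason through this step by step"
  - source_model: Herman555/Hexoteric-AshhLimaRP-Mistral-7B-GGUF
    positive_prompts:
      - "write a story or roleplay"
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "follow these instructions precisely"
```

With mergekit installed, a config like this is built with `mergekit-moe config.yaml ./output-model`.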