MoE-Test-4x7b
---
license: apache-2.0
---

I'm back! :D

A mergekit-made MoE built entirely from Apache-2.0-licensed models, so this lil guy is commercially usable, unlike my prior models.

Models used:

  • cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
  • lvkaokao/mistral-7b-finetuned-orca-dpo-v2
  • Herman555/Hexoteric-AshhLimaRP-Mistral-7B-GGUF
  • jondurbin/bagel-dpo-7b-v0.1
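
For reference, a mergekit-moe merge of the four models above is driven by a YAML config along these lines. This is a minimal sketch, not the config actually used: the choice of base model, `gate_mode`, and the `positive_prompts` routing hints are all illustrative assumptions.

```yaml
# Hypothetical mergekit-moe config for a 4x7b merge like this one.
base_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser  # assumption: any of the four could be the base
gate_mode: hidden        # assumption: mergekit-moe's hidden-state gating
dtype: bfloat16
experts:
  - source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
    positive_prompts:    # illustrative routing prompts, not the real ones
      - "follow the instructions"
  - source_model: lvkaokao/mistral-7b-finetuned-orca-dpo-v2
    positive_prompts:
      - "reason step by step"
  - source_model: Herman555/Hexoteric-AshhLimaRP-Mistral-7B-GGUF
    positive_prompts:
      - "write a story"
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "chat casually"
```

With a config like this saved as `config.yaml`, the merged model is produced by running `mergekit-moe config.yaml ./output-model`.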