---
license: cc-by-nc-4.0
---

# Mixtraln't 4x7B

Oh boy, a new model architecture in Transformers! Time to do profane things with it.

What if instead of training a MoE from scratch, we took some pre-trained Mistral models and shoved them in a little clown car? Let's find out.
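Roughly, the clown-car assembly looks something like this. This is a minimal sketch using Transformers' Mixtral classes, not the actual merge script; the donor repo ids are placeholders.

```python
# Minimal sketch: build a Mixtral config sized to match the donors, copy the
# shared trunk (embeddings, attention, norms) from one donor, then drop each
# donor's MLP into its own expert slot. Donor names are hypothetical.
import torch
from transformers import MistralForCausalLM, MixtralConfig, MixtralForCausalLM

donor_names = [f"donor-mistral-{i}" for i in range(4)]  # hypothetical repo ids
donors = [
    MistralForCausalLM.from_pretrained(name, torch_dtype=torch.bfloat16)
    for name in donor_names
]

base = donors[0].config
moe = MixtralForCausalLM(
    MixtralConfig(
        vocab_size=base.vocab_size,
        hidden_size=base.hidden_size,
        intermediate_size=base.intermediate_size,
        num_hidden_layers=base.num_hidden_layers,
        num_attention_heads=base.num_attention_heads,
        num_key_value_heads=base.num_key_value_heads,
        max_position_embeddings=base.max_position_embeddings,
        rope_theta=base.rope_theta,
        num_local_experts=len(donors),  # the "4x" in 4x7B
        num_experts_per_tok=2,
    )
)

with torch.no_grad():
    # Everything except the MLPs transfers key-for-key from the first donor.
    trunk = {k: v for k, v in donors[0].state_dict().items() if ".mlp." not in k}
    moe.load_state_dict(trunk, strict=False)
    # Mistral's gate/up/down_proj map onto Mixtral's per-expert w1/w3/w2.
    for layer_idx, layer in enumerate(moe.model.layers):
        for expert_idx, donor in enumerate(donors):
            src = donor.model.layers[layer_idx].mlp
            dst = layer.block_sparse_moe.experts[expert_idx]
            dst.w1.weight.copy_(src.gate_proj.weight)
            dst.w3.weight.copy_(src.up_proj.weight)
            dst.w2.weight.copy_(src.down_proj.weight)
```

Note this leaves the routers randomly initialized, which is where the gate hack below comes in.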

Uses parts from the following models:

Works and generates coherent text. The big question here is whether the hack I used to populate the MoE gates works well enough to take advantage of all of the experts. Let's find out!
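The card doesn't spell out what that hack is, so purely as an illustration: here is one training-free way a router *could* be populated, continuing the sketch above. The donor-flavored prompts are invented for the example.

```python
# Illustrative, not necessarily the hack used here: run a prompt flavored for
# each donor through a donor trunk, and use the mean hidden state entering
# each layer as that expert's row of the gate matrix.
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(donor_names[0])  # hypothetical repo id
expert_prompts = [
    "Write a short story about a clown car.",  # flavor for expert 0
    "Solve for x: 2x + 3 = 11.",               # expert 1
    "### Instruction:\nSummarize this text.",  # expert 2
    "<|im_start|>user\nHello!<|im_end|>",      # expert 3
]

with torch.no_grad():
    for expert_idx, prompt in enumerate(expert_prompts):
        ids = tokenizer(prompt, return_tensors="pt")
        # hidden_states[i] is the input to decoder layer i (index 0 = embeddings)
        hidden_states = donors[0].model(**ids, output_hidden_states=True).hidden_states
        for layer_idx, layer in enumerate(moe.model.layers):
            vec = hidden_states[layer_idx].mean(dim=(0, 1))
            layer.block_sparse_moe.gate.weight[expert_idx].copy_(vec)
```

The idea is that tokens whose hidden states point in a donor-ish direction get routed toward that donor's expert; whether that actually happens is exactly the open question above.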

Prompt format: maybe alpaca??? or chatml??? life is full of mysteries
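If you want to experiment, here are the standard shapes of both candidates. Neither is confirmed for this checkpoint.

```python
# The two usual templates, written out for poking at the model. These are the
# generic Alpaca and ChatML formats, not anything checkpoint-specific.
ALPACA = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

CHATML = (
    "<|im_start|>system\n{system}<|im_end|>\n"
    "<|im_start|>user\n{instruction}<|im_end|>\n"
    "<|im_start|>assistant\n"
)

prompt = ALPACA.format(instruction="Why did the clown car cross the road?")
```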