---
license: cc-by-nc-4.0
language:
  - en
base_model: mistralai/Mixtral-8x7B-v0.1
---

Frost2

GGUF: https://huggingface.co/Sao10K/Frostwind-Mixtral-v1-GGUF

This is Frostwind-10.7B-v1, but trained on Mixtral instead of Solar. More info there.

Experimental. May or may not be good; Mixtral training is... difficult to work with.

Trained with Alpaca Instruct Format.
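
For reference, here is a minimal sketch of the standard single-turn Alpaca instruct layout. The exact preamble wording used for this finetune isn't spelled out in the card, so treat this as the common default template rather than a confirmed training detail:

```python
# Sketch of the standard Alpaca instruct prompt layout (assumed default
# preamble; the card does not state the exact wording used in training).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Format a single-turn prompt in Alpaca instruct style."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

if __name__ == "__main__":
    print(build_prompt("Write a short poem about winter."))
```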


Why Frostwind v1 instead of v2? This was requested by someone.

Inherits all the 'flaws' and 'strengths' of the initial Frostwind.

Pretty smart, I think, from initial testing.

Less terse than the Solar variant, but this is probably due to Mixtral being more verbose than base Solar? Idk, hard to explain.


I really appreciate your feedback / supportive comments. They keep me going.


Support me here :)