|
--- |
|
license: cc-by-nc-4.0 |
|
language: |
|
- en |
|
base_model: mistralai/Mixtral-8x7B-v0.1 |
|
--- |
|
|
|
![Frost2](https://huggingface.co/Sao10K/Frostwind-Mixtral-v1/resolve/main/mraww.png) |
|
|
|
GGUF: https://huggingface.co/Sao10K/Frostwind-Mixtral-v1-GGUF |
|
|
|
[Frostwind-10.7B-v1](https://huggingface.co/Sao10K/Frostwind-10.7B-v1) but on Mixtral. More info there. |
|
|
|
Experimental. May or may not be good; Mixtral is... difficult to train. |
|
|
|
Trained with Alpaca Instruct Format. |
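

For reference, the standard Alpaca instruct template looks like this (exact spacing may vary depending on the training setup):

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```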
|
|
|
*** |
|
|
|
> Why Frostwind v1 instead of v2? This was requested by someone. |
|
|
|
Inherits all the 'flaws' and 'strengths' of the initial Frostwind. |
|
|
|
Pretty smart, I think, from initial testing. |
|
|
|
Less terse than the Solar variant, but that is probably due to Mixtral being more verbose than base Solar? Idk, hard to explain. |
|
|
|
*** |
|
|
|
I really appreciate your feedback / supportive comments. They keep me going. |
|
|
|
*** |
|
|
|
Support me [here](https://ko-fi.com/sao10k) :) |