---
license: cc-by-nc-4.0
language:
- en
base_model: mistralai/Mixtral-8x7B-v0.1
---

![Frost2](https://huggingface.co/Sao10K/Frostwind-Mixtral-v1/resolve/main/mraww.png)

GGUF: https://huggingface.co/Sao10K/Frostwind-Mixtral-v1-GGUF

[Frostwind-10.7B-v1](https://huggingface.co/Sao10K/Frostwind-10.7B-v1) but on Mixtral. More info there.

Experimental. May or may not be good; Mixtral training is... difficult to work with.

Trained with Alpaca Instruct Format.
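
For reference, a minimal sketch of prompting the model with the standard single-turn Alpaca template. The card doesn't spell out the exact preamble, so the one below is the common Alpaca default, and the example instruction is purely illustrative:

```python
# Minimal sketch, assuming the standard single-turn Alpaca template and the
# usual transformers loading path; not an official snippet from this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Sao10K/Frostwind-Mixtral-v1"

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",   # requires the accelerate package
    torch_dtype="auto",
)

# Hypothetical instruction, just to show the template in use.
prompt = ALPACA_TEMPLATE.format(instruction="Write a short scene set in a frozen forest.")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```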

***

> Why Frostwind v1 instead of v2? This was requested by someone.

Inherits all the 'flaws' and 'strengths' of the initial Frostwind.

Pretty smart, I think, from initial testing.

Less terse than the Solar variant, but this is probably due to Mixtral being more verbose than base Solar? Idk, hard to explain.

***

I really appreciate your feedback / supportive comments. They keep me going.

***

Support me [here](https://ko-fi.com/sao10k) :)