mradermacher committed
Commit: 81f5a6f
Parent: b0b562b

auto-patch README.md

Files changed (1): README.md (+16 -16)
README.md CHANGED
@@ -48,8 +48,8 @@ tags:
 <!-- ### vocab_type: -->
 static quants of https://huggingface.co/LeroyDyer/Mixtral_AI_Ultron
 
-
 <!-- provided-files -->
+weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -62,21 +62,21 @@ more details, including on how to concatenate multi-part files.
 
 | Link | Type | Size/GB | Notes |
 |:-----|:-----|--------:|:------|
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q2_K.gguf) | Q2_K | 2.8 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.IQ3_XS.gguf) | IQ3_XS | 3.1 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.IQ3_S.gguf) | IQ3_S | 3.3 | beats Q3_K* |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.IQ3_M.gguf) | IQ3_M | 3.4 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
-| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-i1-GGUF/resolve/main/Mixtral_AI_Ultron.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q2_K.gguf) | Q2_K | 2.8 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.IQ3_XS.gguf) | IQ3_XS | 3.1 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.IQ3_S.gguf) | IQ3_S | 3.3 | beats Q3_K* |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.IQ3_M.gguf) | IQ3_M | 3.4 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
+| [GGUF](https://huggingface.co/mradermacher/Mixtral_AI_Ultron-GGUF/resolve/main/Mixtral_AI_Ultron.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
 
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant
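
For readers of the table above: each link resolves directly to a single GGUF file in the Mixtral_AI_Ultron-GGUF repository, so the files can also be fetched programmatically. A minimal sketch using the huggingface_hub Python library, assuming the package is installed; the choice of the Q4_K_S quant is purely illustrative, beyond the table's own "fast, recommended" note:

    from huggingface_hub import hf_hub_download

    # Download one of the quants listed in the README table.
    # The filename is an illustrative choice (any filename from the
    # table's /resolve/main/ URLs would work the same way).
    path = hf_hub_download(
        repo_id="mradermacher/Mixtral_AI_Ultron-GGUF",
        filename="Mixtral_AI_Ultron.Q4_K_S.gguf",
    )
    print(f"GGUF file saved to: {path}")

This mirrors what the table's /resolve/main/ URLs do in a browser, with the added benefit of local caching by the hub library.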