Quantized using this PR: https://github.com/leejet/stable-diffusion.cpp/pull/447

### Files:

#### Mixed Types:

- [sd3.5_large_turbo-q2_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-Turbo-GGUF-mixed-sdcpp/blob/main/sd3.5_large_turbo-q2_k_4_0.gguf): Smallest quantization yet. Use this if you can't afford anything bigger.
- [sd3.5_large_turbo-q3_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-Turbo-GGUF-mixed-sdcpp/blob/main/sd3.5_large_turbo-q3_k_4_0.gguf): Smaller than q4_0, with acceptable degradation.
- [sd3.5_large_turbo-q4_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-Turbo-GGUF-mixed-sdcpp/blob/main/sd3.5_large_turbo-q4_k_4_0.gguf): Exactly the same size as q4_0, but with slightly less degradation. Recommended.
- [sd3.5_large_turbo-q4_k_4_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-Turbo-GGUF-mixed-sdcpp/blob/main/sd3.5_large_turbo-q4_k_4_1.gguf): Smaller than q4_1, with comparable degradation. Recommended.
- [sd3.5_large_turbo-q4_k_5_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-Turbo-GGUF-mixed-sdcpp/blob/main/sd3.5_large_turbo-q4_k_5_0.gguf): Smaller than q5_0, with comparable degradation. Recommended.

#### Legacy types:

- [sd3.5_large_turbo-q4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-Turbo-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large_turbo-q4_0.gguf): Same size as q4_k_4_0. Not recommended (use q4_k_4_0 instead).

(I wanted to upload more, but it's not working anymore; maybe I hit a rate limit.)
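The size claims above can be sanity-checked from GGML's block layouts: q4_0 packs 32 weights into 18 bytes (18 × 8 / 32 = 4.5 bits per weight), and q4_k also works out to 4.5 bits per weight, which is why a q4_k/q4_0 mix lands on the same file size as pure q4_0. A minimal sketch of the arithmetic (bits-per-weight figures taken from ggml/llama.cpp's quantization docs; the ~8-billion-parameter count for SD3.5 Large's diffusion transformer is an assumption for illustration):

```python
# Rough quantized-size estimator from GGML block formats.
# bits-per-weight values follow ggml's block layouts, e.g. q4_0 stores
# 32 weights in 18 bytes -> 18 * 8 / 32 = 4.5 bits per weight.
BITS_PER_WEIGHT = {
    "q2_k": 2.5625,
    "q3_k": 3.4375,
    "q4_0": 4.5,
    "q4_1": 5.0,
    "q4_k": 4.5,
    "q5_0": 5.5,
    "q8_0": 8.5,
    "f16": 16.0,
}

def estimate_gib(n_params: float, quant: str) -> float:
    """Approximate size in GiB of n_params weights at the given quant type."""
    return n_params * BITS_PER_WEIGHT[quant] / 8 / 2**30

# ~8e9 parameters for SD3.5 Large's transformer is an assumed round figure.
for q in ("q2_k", "q3_k", "q4_0", "q4_k", "q8_0", "f16"):
    print(f"{q:>5}: ~{estimate_gib(8e9, q):.1f} GiB")
```

Since q4_k and q4_0 share 4.5 bits per weight, any tensor-by-tensor mix of the two keeps the total byte count unchanged, matching the "exactly the same size as q4_0" note above.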

### Outputs:

| Name | Sample 1 | Sample 2 | Sample 3 |
| ---- | -------- | -------- | -------- |
| q2_k_4_0 |  |  |  |
| q3_k_4_0 |  |  |  |
| q4_0 |  |  |  |
| q4_k_4_0 |  |  |  |
| q4_k_4_1 |  |  |  |
| q4_1 |  |  |  |
| q4_k_5_0 |  |  |  |
| q5_0 |  |  |  |
| q8_0 |  |  |  |
| f16(sft) |  |  |  |
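A hedged sketch of how one of these files might be used with stable-diffusion.cpp's `sd` CLI. Flag names (`-m`, `--clip_l`, `--clip_g`, `--t5xxl`, `--steps`, `--cfg-scale`, `-o`) follow the upstream stable-diffusion.cpp README and may differ between versions; the encoder file names are placeholders you must supply yourself. The script only prints the assembled command so you can review it before running:

```shell
# Sketch: invoking stable-diffusion.cpp with the recommended q4_k_4_0 file.
# Turbo models are distilled for few steps and no CFG, hence --steps 4 and
# --cfg-scale 1.0 (assumed sensible defaults, not values from this repo).
MODEL="sd3.5_large_turbo-q4_k_4_0.gguf"
CMD="./sd -m $MODEL \
  --clip_l clip_l.safetensors --clip_g clip_g.safetensors \
  --t5xxl t5xxl_fp16.safetensors \
  -p 'a lighthouse at dusk, photorealistic' \
  --steps 4 --cfg-scale 1.0 -o output.png"
echo "$CMD"   # review, then run once the binary and encoders are in place
```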