bartowski committed on
Commit
fa47290
1 Parent(s): f3a68e2

Llamacpp quants

.gitattributes CHANGED
@@ -33,3 +33,19 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+Faro-Yi-9B-200K-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
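Entries like the ones above are normally produced by `git lfs track <pattern>`, which appends one attribute line per pattern to `.gitattributes`. A minimal sketch of how each added line is formed (the helper name is my own, not part of this repo):

```python
def lfs_track_line(pattern: str) -> str:
    # Mirrors what `git lfs track <pattern>` appends to .gitattributes:
    # route matching files through the LFS filter for checkout/diff/merge,
    # and mark them as binary (-text) so Git skips line-ending conversion.
    return f"{pattern} filter=lfs diff=lfs merge=lfs -text"

print(lfs_track_line("Faro-Yi-9B-200K-Q8_0.gguf"))
# → Faro-Yi-9B-200K-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
```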
Faro-Yi-9B-200K-IQ3_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9136df962258fb0966625888529ed3eec0ed9837c6f6bd87ca2ecfda7579a3f5
+size 4055461952
Faro-Yi-9B-200K-IQ3_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e46c2ebf3c7eb8c4f9716cbc34895a36795792a40e7008750ce956baf789a43e
+size 3912577088
Faro-Yi-9B-200K-IQ4_NL.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:74436ed3d26145bb38fef2b9fac67e9759dfae69856bb0dea87fe9229bdf84bd
+size 5083394112
Faro-Yi-9B-200K-IQ4_XS.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:182716a863d23c8a5b364874f350a5ed3dcc65499ab8751f6561c9095ce80d27
+size 4827279424
Faro-Yi-9B-200K-Q2_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9eabdfc6638b9b10ea0be068b64b165b0a7b338cb5b93a8f4da46d99079b42e7
+size 3354325056
Faro-Yi-9B-200K-Q3_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fdb3380e8851261908d777d888be8adf87216efdae7da2e958b8b61e7d2b6679
+size 4690751552
Faro-Yi-9B-200K-Q3_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:76d3d1b523b1c42c48f3ff2ead3b3441001b9416f695b3165c05381163df273c
+size 4324405312
Faro-Yi-9B-200K-Q3_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:45095068895ebf2782320fdf21dec700093aea07edfef8b0995a74f9f8c5c8a5
+size 3899207744
Faro-Yi-9B-200K-Q4_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a6f3b78a2d3c5783fff13110a8acba11c11a520525eaafe07364a6dc88734abf
+size 5036994624
Faro-Yi-9B-200K-Q4_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:82c982c4d815a7a510aa2bd8a4f995699961f29d58af8c2549f0c4811d317fd2
+size 5328957504
Faro-Yi-9B-200K-Q4_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:07cfae867b0fb3f411feeb5ccc432e191245051ad3c70734a6671890270f1fe7
+size 5071859776
Faro-Yi-9B-200K-Q5_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:54e573942949448c9849ccf3155f5ce889a5bbf5b5ddbc928e9739263c8799e8
+size 6107852864
Faro-Yi-9B-200K-Q5_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ff755592f8e81088dcaa6e0694151cb0bbf28626c664193401553ef3888e6912
+size 6258257984
Faro-Yi-9B-200K-Q5_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:47188a4109feaa7905888bc0ca563cecd1cd1f86e39a404501698f1b2a078d05
+size 6107852864
Faro-Yi-9B-200K-Q6_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fc8368d532af2473d4a6b862fd1a2e3bf2d23afa0315363cefb22949943898d2
+size 7245639744
Faro-Yi-9B-200K-Q8_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5a031fd48280b887c04b09cb7a079751a50d53762e4636f9a4cbf33d4be749b1
+size 9383915584
README.md ADDED
@@ -0,0 +1,40 @@
+---
+license: mit
+datasets:
+- wenbopan/Fusang-v1
+- wenbopan/OpenOrca-zh-20k
+language:
+- zh
+- en
+quantized_by: bartowski
+pipeline_tag: text-generation
+---
+
+## Llamacpp Quantizations of Faro-Yi-9B-200K
+
+Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b2536">b2536</a> for quantization.
+
+Original model: https://huggingface.co/wenbopan/Faro-Yi-9B-200K
+
+Download a single file (not the whole branch) from below:
+
+| Filename | Quant type | File Size | Description |
+| -------- | ---------- | --------- | ----------- |
+| [Faro-Yi-9B-200K-Q8_0.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q8_0.gguf) | Q8_0 | 9.38GB | Extremely high quality; generally unneeded, but the max available quant. |
+| [Faro-Yi-9B-200K-Q6_K.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q6_K.gguf) | Q6_K | 7.24GB | Very high quality, near perfect, *recommended*. |
+| [Faro-Yi-9B-200K-Q5_K_M.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q5_K_M.gguf) | Q5_K_M | 6.25GB | High quality, very usable. |
+| [Faro-Yi-9B-200K-Q5_K_S.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q5_K_S.gguf) | Q5_K_S | 6.10GB | High quality, very usable. |
+| [Faro-Yi-9B-200K-Q5_0.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q5_0.gguf) | Q5_0 | 6.10GB | High quality, older format, generally not recommended. |
+| [Faro-Yi-9B-200K-Q4_K_M.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q4_K_M.gguf) | Q4_K_M | 5.32GB | Good quality, uses about 4.83 bits per weight. |
+| [Faro-Yi-9B-200K-Q4_K_S.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q4_K_S.gguf) | Q4_K_S | 5.07GB | Slightly lower quality with small space savings. |
+| [Faro-Yi-9B-200K-IQ4_NL.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-IQ4_NL.gguf) | IQ4_NL | 5.08GB | Decent quality, similar to Q4_K_S; a newer quantization method. |
+| [Faro-Yi-9B-200K-IQ4_XS.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-IQ4_XS.gguf) | IQ4_XS | 4.82GB | Decent quality, newer method with similar performance to Q4. |
+| [Faro-Yi-9B-200K-Q4_0.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q4_0.gguf) | Q4_0 | 5.03GB | Decent quality, older format, generally not recommended. |
+| [Faro-Yi-9B-200K-Q3_K_L.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q3_K_L.gguf) | Q3_K_L | 4.69GB | Lower quality but usable; good when RAM is limited. |
+| [Faro-Yi-9B-200K-Q3_K_M.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q3_K_M.gguf) | Q3_K_M | 4.32GB | Even lower quality. |
+| [Faro-Yi-9B-200K-IQ3_M.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-IQ3_M.gguf) | IQ3_M | 4.05GB | Medium-low quality, newer method with decent performance. |
+| [Faro-Yi-9B-200K-IQ3_S.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-IQ3_S.gguf) | IQ3_S | 3.91GB | Lower quality, newer method with decent performance; recommended over the Q3 quants. |
+| [Faro-Yi-9B-200K-Q3_K_S.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q3_K_S.gguf) | Q3_K_S | 3.89GB | Low quality, not recommended. |
+| [Faro-Yi-9B-200K-Q2_K.gguf](https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/blob/main/Faro-Yi-9B-200K-Q2_K.gguf) | Q2_K | 3.35GB | Extremely low quality, *not* recommended. |
+
+Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
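The table links above point at Hugging Face `/blob/` preview pages; fetching the raw file uses the Hub's `/resolve/` URL scheme instead. A minimal sketch of building such a direct-download URL (the helper name is my own):

```python
def gguf_download_url(repo_id: str, filename: str) -> str:
    # Hugging Face serves raw file contents from the "resolve" path,
    # whereas "blob" renders an HTML preview page.
    return f"https://huggingface.co/{repo_id}/resolve/main/{filename}"

url = gguf_download_url("bartowski/Faro-Yi-9B-200K-GGUF",
                        "Faro-Yi-9B-200K-Q4_K_M.gguf")
print(url)
# → https://huggingface.co/bartowski/Faro-Yi-9B-200K-GGUF/resolve/main/Faro-Yi-9B-200K-Q4_K_M.gguf
```

The same download can also be done with the `huggingface_hub` library or a plain `curl -L` against that URL.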