Update README.md
README.md CHANGED
@@ -39,12 +39,13 @@ Original model: https://huggingface.co/microsoft/Phi-3-medium-4k-instruct
 
 ## Available sizes
 
-| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) |
+| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | Description |
 | ----- | ---- | ------- | ------ | ------ | ------ | ------------ |
-| [
-| [
-| [
-| [
+| [8_0](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/8_0) | 8.0 | 8.0 | 14.0 GB | 16.4 GB | Max quality that ExLlamaV2 can produce, **recommended**. |
+| [6_5](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/6_5) | 6.5 | 8.0 | 12.5 GB | 14.9 GB | Near unquantized performance at vastly reduced size, **recommended**. |
+| [5_0](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/5_0) | 5.0 | 6.0 | 10.0 GB | 12.4 GB | Slightly lower quality vs 6.5. |
+| [4_25](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/4_25) | 4.25 | 6.0 | 8.8 GB | 11.2 GB | GPTQ equivalent bits per weight. |
+| [3_5](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/3_5) | 3.5 | 6.0 | 7.6 GB | 10.0 GB | Lower quality, not recommended. |
 
 ## Download instructions
 
@@ -65,13 +66,13 @@ To download a specific branch, use the `--revision` parameter. For example, to d
 Linux:
 
 ```shell
-huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6_5
+huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6_5
 ```
 
 Windows (which apparently doesn't like _ in folders sometimes?):
 
 ```shell
-huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6.5
+huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6.5
 ```
 
 Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
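Reading the new table: the 8_0 branch needs roughly 14.0 GB of VRAM at 4k context and 16.4 GB at 16k, so it is the natural pick for 16 GB+ cards, while the smaller branches trade quality for headroom. As a sketch (not in the commit itself), the same `huggingface-cli` pattern from the README works for any branch in the table, for example 8_0:

```shell
# Sketch: fetch the max-quality 8_0 branch instead of 6_5,
# following the same huggingface-cli pattern shown in the diff above.
huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 8_0 --local-dir Phi-3-medium-4k-instruct-exl2-8_0
```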
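Also not part of the commit, but since each size in the table is an ordinary git branch of the same repo, a plain git clone is another way to grab one; this is only a sketch and assumes git and git-lfs are already installed:

```shell
# Assumption: git and git-lfs are installed. HF model repos are regular git repos,
# so a single quant branch can be cloned directly into a local folder.
git lfs install
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2 Phi-3-medium-4k-instruct-exl2-6_5
```

The `huggingface-cli` route above is lighter since it skips git history; the clone is mainly handy if you want to pull later updates to the branch with `git pull`.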