bartowski committed on
Commit
58085a0
1 Parent(s): 6c59d4a

Update README.md

Files changed (1)
  1. README.md +8 -7
README.md CHANGED
@@ -39,12 +39,13 @@ Original model: https://huggingface.co/microsoft/Phi-3-medium-4k-instruct
 
 ## Available sizes
 
- | Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
+ | Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | Description |
  | ----- | ---- | ------- | ------ | ------ | ------ | ------------ |
- | [6_5](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/6_5) | 6.5 | 8.0 | 12.0 GB | 14.7 GB | 18.4 GB | Near unquantized performance at vastly reduced size, **recommended**. |
- | [5_0](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/5_0) | 5.0 | 6.0 | 9.8 GB | 12.4 GB | 16.1 GB | Slightly lower quality vs 6.5. |
- | [4_25](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/4_25) | 4.25 | 6.0 | 8.7 GB | 11.3 GB | 15.0 GB | GPTQ equivalent bits per weight. |
- | [3_5](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/3_5) | 3.5 | 6.0 | 7.6 GB | 10.1 GB | 13.8 GB | Lower quality, not recommended. |
+ | [8_0](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/8_0) | 8.0 | 8.0 | 14.0 GB | 16.4 GB | Max quality that ExLlamaV2 can produce, **recommended**. |
+ | [6_5](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/6_5) | 6.5 | 8.0 | 12.5 GB | 14.9 GB | Near unquantized performance at vastly reduced size, **recommended**. |
+ | [5_0](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/5_0) | 5.0 | 6.0 | 10.0 GB | 12.4 GB | Slightly lower quality vs 6.5. |
+ | [4_25](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/4_25) | 4.25 | 6.0 | 8.8 GB | 11.2 GB | GPTQ equivalent bits per weight. |
+ | [3_5](https://huggingface.co/bartowski/Phi-3-medium-4k-instruct-exl2/tree/3_5) | 3.5 | 6.0 | 7.6 GB | 10.0 GB | Lower quality, not recommended. |
 
 ## Download instructions
 
@@ -65,13 +66,13 @@ To download a specific branch, use the `--revision` parameter. For example, to d
 Linux:
 
 ```shell
- huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6_5 --local-dir-use-symlinks False
+ huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6_5
 ```
 
 Windows (which apparently doesn't like _ in folders sometimes?):
 
 ```shell
- huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6.5 --local-dir-use-symlinks False
+ huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 6_5 --local-dir Phi-3-medium-4k-instruct-exl2-6.5
 ```
 
 Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
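
For reference, the same download pattern applies to the 8_0 branch added in this commit; a minimal sketch, with the local folder name chosen here purely for illustration:

```shell
# Sketch: fetch the newly added 8_0 branch with the same command pattern as above
# (the folder name "Phi-3-medium-4k-instruct-exl2-8_0" is just an example)
huggingface-cli download bartowski/Phi-3-medium-4k-instruct-exl2 --revision 8_0 --local-dir Phi-3-medium-4k-instruct-exl2-8_0
```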