bartowski committed on
Commit
d6e74e0
1 Parent(s): 16012f3

Update README.md

Files changed (1)
  1. README.md +12 -12
README.md CHANGED
@@ -18,17 +18,17 @@ Original model: https://huggingface.co/internlm/internlm2-chat-20b
 
 | Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
 | ------ | ---- | ------------ | ---- | ---- | ---- | ----------- |
-| [6_5](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-test-exl2/tree/6_5) | 6.5 | 8.0 | 19.6 GB | 21.0 GB | 23.0 GB | Near unquantized performance at vastly reduced size, **recommended**. |
-| [4_25](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-test-exl2/tree/4_25) | 4.25 | 6.0 | 13.8 GB | 15.2 GB | 17.2 GB | GPTQ equivalent bits per weight, slightly higher quality. |
-| [3_5](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-test-exl2/tree/3_5) | 3.5 | 6.0 | 12.4 GB | 13.8 GB | 15.8 GB | Lower quality, only use if you have to. |
-| [3_0](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-test-exl2/tree/3_0) | 3.0 | 6.0 | 11.1 GB | 12.5 GB | 15.5 GB | Very low quality. Usable on 12GB. |
+| [6_5](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-exl2/tree/6_5) | 6.5 | 8.0 | 19.6 GB | 21.0 GB | 23.0 GB | Near unquantized performance at vastly reduced size, **recommended**. |
+| [4_25](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-exl2/tree/4_25) | 4.25 | 6.0 | 13.8 GB | 15.2 GB | 17.2 GB | GPTQ equivalent bits per weight, slightly higher quality. |
+| [3_5](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-exl2/tree/3_5) | 3.5 | 6.0 | 12.4 GB | 13.8 GB | 15.8 GB | Lower quality, only use if you have to. |
+| [3_0](https://huggingface.co/Bartowski/internlm2-chat-20b-llama-exl2/tree/3_0) | 3.0 | 6.0 | 11.1 GB | 12.5 GB | 15.5 GB | Very low quality. Usable on 12GB. |
 
 ## Download instructions
 
 With git:
 
 ```shell
-git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/internlm2-chat-20b-llama-test-exl2 internlm2-chat-20b-llama-test-exl2-6_5
+git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/internlm2-chat-20b-llama-exl2 internlm2-chat-20b-llama-exl2-6_5
 ```
 
 With huggingface hub (credit to TheBloke for instructions):
@@ -37,11 +37,11 @@ With huggingface hub (credit to TheBloke for instructions):
 pip3 install huggingface-hub
 ```
 
-To download the `main` (only useful if you only care about measurement.json) branch to a folder called `internlm2-chat-20b-llama-test-exl2`:
+To download the `main` (only useful if you only care about measurement.json) branch to a folder called `internlm2-chat-20b-llama-exl2`:
 
 ```shell
-mkdir internlm2-chat-20b-llama-test-exl2
-huggingface-cli download bartowski/internlm2-chat-20b-llama-test-exl2 --local-dir internlm2-chat-20b-llama-test-exl2 --local-dir-use-symlinks False
+mkdir internlm2-chat-20b-llama-exl2
+huggingface-cli download bartowski/internlm2-chat-20b-llama-exl2 --local-dir internlm2-chat-20b-llama-exl2 --local-dir-use-symlinks False
 ```
 
 To download from a different branch, add the `--revision` parameter:
@@ -49,15 +49,15 @@ To download from a different branch, add the `--revision` parameter:
 Linux:
 
 ```shell
-mkdir internlm2-chat-20b-llama-test-exl2-6_5
-huggingface-cli download bartowski/internlm2-chat-20b-llama-test-exl2 --revision 6_5 --local-dir internlm2-chat-20b-llama-test-exl2-6_5 --local-dir-use-symlinks False
+mkdir internlm2-chat-20b-llama-exl2-6_5
+huggingface-cli download bartowski/internlm2-chat-20b-llama-exl2 --revision 6_5 --local-dir internlm2-chat-20b-llama-exl2-6_5 --local-dir-use-symlinks False
 ```
 
 Windows (which apparently doesn't like _ in folders sometimes?):
 
 ```shell
-mkdir internlm2-chat-20b-llama-test-exl2-6.5
-huggingface-cli download bartowski/internlm2-chat-20b-llama-test-exl2 --revision 6_5 --local-dir internlm2-chat-20b-llama-test-exl2-6.5 --local-dir-use-symlinks False
+mkdir internlm2-chat-20b-llama-exl2-6.5
+huggingface-cli download bartowski/internlm2-chat-20b-llama-exl2 --revision 6_5 --local-dir internlm2-chat-20b-llama-exl2-6.5 --local-dir-use-symlinks False
 ```
 
 Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
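
As an aside for readers applying the updated README: the download commands above follow a simple naming convention (folder = repo name + `-` + branch, with `_` swapped for `.` on Windows since the underscore can cause trouble there). The helper below is a purely illustrative sketch of that convention, not part of the repo or of huggingface-hub:

```python
def local_dir_for(repo_id: str, branch: str, windows: bool = False) -> str:
    """Illustrative helper: build the local folder name used in the
    README's download commands. Not an official API.

    repo_id: e.g. "bartowski/internlm2-chat-20b-llama-exl2"
    branch:  e.g. "6_5"; the README swaps "_" for "." on Windows.
    """
    name = repo_id.split("/")[-1]
    suffix = branch.replace("_", ".") if windows else branch
    return f"{name}-{suffix}"

print(local_dir_for("bartowski/internlm2-chat-20b-llama-exl2", "6_5"))
# internlm2-chat-20b-llama-exl2-6_5
print(local_dir_for("bartowski/internlm2-chat-20b-llama-exl2", "6_5", windows=True))
# internlm2-chat-20b-llama-exl2-6.5
```

The resulting string is what you would pass to `--local-dir` (or use as the clone target directory).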