shaowenchen committed
Commit 7cb8d58
1 Parent(s): a8360cd

Update README.md

Files changed (1)
  1. README.md +30 -30
README.md CHANGED
@@ -2,57 +2,57 @@
  inference: false
  language:
  - zh
- license: apache-2.0
+ license: other
  model_creator: ziqingyang
  model_link: https://huggingface.co/ziqingyang/chinese-alpaca-2-7b
  model_name: chinese-alpaca-2-7b
  model_type: llama
  pipeline_tag: text-generation
  quantized_by: shaowenchen
+ tasks:
+ - text2text-generation
  tags:
  - meta
  - gguf
  - llama
- - llama-2
- - alpaca
  - alpaca-2
  - chinese
  ---

  ## Provided files

- | Name | Quant method | Size |
- | ------------------------------------------------------------------------------------------------------------------------------------- | ------------ | ------ |
- | [chinese-alpaca-2-7b.Q2_K.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q2_K.gguf) | Q2_K | 2.7 GB |
- | [chinese-alpaca-2-7b.Q3_K.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q3_K.gguf) | Q3_K | 3.2 GB |
- | [chinese-alpaca-2-7b.Q3_K_L.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q3_K_L.gguf) | Q3_K_L | 3.5 GB |
- | [chinese-alpaca-2-7b.Q3_K_S.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q3_K_S.gguf) | Q3_K_S | 2.9 GB |
- | [chinese-alpaca-2-7b.Q4_0.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q4_0.gguf) | Q4_0 | 3.7 GB |
- | [chinese-alpaca-2-7b.Q4_1.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q4_1.gguf) | Q4_1 | 4.1 GB |
- | [chinese-alpaca-2-7b.Q4_K.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q4_K.gguf) | Q4_K | 3.9 GB |
- | [chinese-alpaca-2-7b.Q4_K_S.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q4_K_S.gguf) | Q4_K_S | 3.7 GB |
- | [chinese-alpaca-2-7b.Q5_0.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q5_0.gguf) | Q5_0 | 4.5 GB |
- | [chinese-alpaca-2-7b.Q5_1.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q5_1.gguf) | Q5_1 | 4.9 GB |
- | [chinese-alpaca-2-7b.Q5_K.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q5_K.gguf) | Q5_K | 4.6 GB |
- | [chinese-alpaca-2-7b.Q5_K_S.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q5_K_S.gguf) | Q5_K_S | 4.5 GB |
- | [chinese-alpaca-2-7b.Q6_K.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q6_K.gguf) | Q6_K | 5.3 GB |
- | [chinese-alpaca-2-7b.Q8_0.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.Q8_0.gguf) | Q8_0 | 6.9 GB |
- | [chinese-alpaca-2-7b.gguf](https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/blob/main/chinese-alpaca-2-7b.gguf) | full | 13 GB |
+ | Name | Quant method | Size |
+ | ------------------------------- | ------------ | ------ |
+ | chinese-alpaca-2-7b.Q2_K.gguf | Q2_K | 2.7 GB |
+ | chinese-alpaca-2-7b.Q3_K.gguf | Q3_K | 3.2 GB |
+ | chinese-alpaca-2-7b.Q3_K_L.gguf | Q3_K_L | 3.5 GB |
+ | chinese-alpaca-2-7b.Q3_K_S.gguf | Q3_K_S | 2.9 GB |
+ | chinese-alpaca-2-7b.Q4_0.gguf | Q4_0 | 3.7 GB |
+ | chinese-alpaca-2-7b.Q4_1.gguf | Q4_1 | 4.1 GB |
+ | chinese-alpaca-2-7b.Q4_K.gguf | Q4_K | 3.9 GB |
+ | chinese-alpaca-2-7b.Q4_K_S.gguf | Q4_K_S | 3.7 GB |
+ | chinese-alpaca-2-7b.Q5_0.gguf | Q5_0 | 4.5 GB |
+ | chinese-alpaca-2-7b.Q5_1.gguf | Q5_1 | 4.9 GB |
+ | chinese-alpaca-2-7b.Q5_K.gguf | Q5_K | 4.6 GB |
+ | chinese-alpaca-2-7b.Q5_K_S.gguf | Q5_K_S | 4.5 GB |
+ | chinese-alpaca-2-7b.Q6_K.gguf | Q6_K | 5.3 GB |
+ | chinese-alpaca-2-7b.Q8_0.gguf | Q8_0 | 6.9 GB |
+ | chinese-alpaca-2-7b.gguf | full | 13 GB |

  ## Provided images

- | Name | Quant method | Size |
- | ----------------------------------------------------------------------------------------------------------------------------------- | ------------ | ------- |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q2_K](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q2_K | 3.68 GB |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q3_K](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q3_K | 4.16 GB |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q3_K_L](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q3_K_L | 4.46 GB |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q3_K_S](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q3_K_S | 3.81 GB |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q4_0](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q4_0 | 4.7 GB |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q4_K](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q4_K | 4.95 GB |
- | [shaowenchen/chinese-alpaca-2-7b-gguf:Q4_K_S](https://hub.docker.com/repository/docker/shaowenchen/chinese-alpaca-2-7b-gguf/general) | Q4_K_S | 4.73 GB |
+ | Name | Quant method | Size |
+ | --------------------------------------------- | ------------ | ------- |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q2_K` | Q2_K | 3.68 GB |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q3_K` | Q3_K | 4.16 GB |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q3_K_L` | Q3_K_L | 4.46 GB |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q3_K_S` | Q3_K_S | 3.81 GB |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q4_0` | Q4_0 | 4.7 GB |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q4_K` | Q4_K | 4.95 GB |
+ | `shaowenchen/chinese-alpaca-2-7b-gguf:Q4_K_S` | Q4_K_S | 4.73 GB |

  ```
  docker run --rm -p 8000:8000 shaowenchen/chinese-alpaca-2-7b-gguf:Q2_K
  ```

- and open http://localhost:8000/docs to view the API documentation.
+ and you can open http://localhost:8000/docs to view the Swagger UI.
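
The `Provided files` table in the diff above only lists the GGUF quantizations and their sizes. Below is a minimal sketch of fetching one of them and running it locally; the choice of the Q4_K file, the `resolve/main` direct-download URL, and the llama.cpp `./main` binary with its `-m`, `-p`, and `-n` flags are assumptions about the reader's setup, not something the README specifies.

```
# Download one quantization directly (the table links point at blob pages;
# swapping blob/main for resolve/main yields the raw file).
wget https://huggingface.co/shaowenchen/chinese-alpaca-2-7b-GGUF/resolve/main/chinese-alpaca-2-7b.Q4_K.gguf

# Run it with a locally built llama.cpp: -m selects the model file,
# -p supplies the prompt (here a short Chinese request), -n caps the
# number of generated tokens.
./main -m chinese-alpaca-2-7b.Q4_K.gguf -p "介绍一下杭州" -n 128
```

Any other quantization from the table can be substituted for the Q4_K file; the smaller quants trade answer quality for lower memory use.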
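
For the Docker images, the README shows only the `docker run` command and points at http://localhost:8000/docs. The sketch below starts the smallest image and sends one request with `curl`; the `/v1/completions` path and the JSON body assume the container wraps an OpenAI-compatible llama.cpp server, which the README does not state, so confirm the actual routes in the Swagger UI first.

```
# Start the Q2_K image in the background (same image and port as the README).
docker run -d --rm --name chinese-alpaca-2 -p 8000:8000 shaowenchen/chinese-alpaca-2-7b-gguf:Q2_K

# Assumed OpenAI-compatible completion endpoint; verify the real path and
# request schema at http://localhost:8000/docs before relying on it.
curl -s http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"prompt": "你好，请介绍一下你自己。", "max_tokens": 128}'

# Stop the container when done (it is removed automatically because of --rm).
docker stop chinese-alpaca-2
```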