mradermacher committed on
Commit
6f1e139
1 Parent(s): a0e1fe2

auto-patch README.md

Files changed (1):
README.md (+5 −1)
README.md CHANGED

@@ -3,6 +3,10 @@ base_model: MaziyarPanahi/Goku-8x22B-v0.1
 datasets:
 - philschmid/guanaco-sharegpt-style
 language:
+- fr
+- it
+- de
+- es
 - en
 library_name: transformers
 license: apache-2.0
@@ -13,6 +17,7 @@ tags:
 - moe
 - mixtral
 - sharegpt
+- axolotl
 ---
 ## About
 
@@ -51,7 +56,6 @@ more details, including on how to concatenate multi-part files.
 | [PART 1](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q6_K.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q6_K.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q6_K.gguf.part3of3) | Q6_K | 115.6 | very good quality |
 | [PART 1](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q8_0.gguf.part1of4) [PART 2](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q8_0.gguf.part2of4) [PART 3](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q8_0.gguf.part3of4) [PART 4](https://huggingface.co/mradermacher/Goku-8x22B-v0.1-GGUF/resolve/main/Goku-8x22B-v0.1.Q8_0.gguf.part4of4) | Q8_0 | 149.5 | fast, best quality |
 
-
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
 
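The hunk context above refers to the README's note on concatenating multi-part files: the Q6_K and Q8_0 quants ship as numbered `.partNofM` pieces that must be joined, in order, into a single `.gguf` before use. A minimal sketch of the pattern with `cat` (shown here on tiny stand-in parts, since the real parts are tens of gigabytes each; the filenames below other than the commented real ones are placeholders):

```shell
# Create three tiny stand-in parts to demonstrate the reassembly pattern.
printf 'first '  > model.gguf.part1of3
printf 'second ' > model.gguf.part2of3
printf 'third'   > model.gguf.part3of3

# Concatenate the parts in order into a single file. For the real Q6_K quant
# the same pattern would be:
#   cat Goku-8x22B-v0.1.Q6_K.gguf.part1of3 \
#       Goku-8x22B-v0.1.Q6_K.gguf.part2of3 \
#       Goku-8x22B-v0.1.Q6_K.gguf.part3of3 > Goku-8x22B-v0.1.Q6_K.gguf
cat model.gguf.part1of3 model.gguf.part2of3 model.gguf.part3of3 > model.gguf

cat model.gguf
```

Order matters: concatenating the parts out of sequence produces a corrupt file, so double-check the `partNofM` suffixes before joining. The part files can be deleted once the combined `.gguf` loads successfully.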