TheBloke committed on
Commit
2245a79
1 Parent(s): 0fbb5be

Update README.md

Files changed (1)
  1. README.md +13 -5
README.md CHANGED
@@ -7,15 +7,23 @@ This repo contains the weights of the Koala 13B model produced at Berkeley. It i
 
 This version has then been converted to HF format.
 
-## Other Koala repos
+## My Koala repos
+I have the following Koala model repositories available:
 
-I have also made these other Koala models available:
-* [GPTQ quantized 4bit 13B model in GGML format for llama.cpp](https://huggingface.co/TheBloke/koala-13B-GPTQ-4bit-128g-GGML)
-* [GPTQ quantized 4bit 7B model in `pt` and `safetensors` formats](https://huggingface.co/TheBloke/koala-7B-4bit-128g)
+**13B models:**
+* [Unquantized 13B model in HF format](https://huggingface.co/TheBloke/koala-13B-HF)
+* [GPTQ quantized 4bit 13B model in `pt` and `safetensors` formats](https://huggingface.co/TheBloke/koala-13B-GPTQ-4bit-128g)
+* [GPTQ quantized 4bit 13B model in GGML format for `llama.cpp`](https://huggingface.co/TheBloke/koala-13B-GPTQ-4bit-128g-GGML)
+
+**7B models:**
 * [Unquantized 7B model in HF format](https://huggingface.co/TheBloke/koala-7B-HF)
 * [Unquantized 7B model in GGML format for llama.cpp](https://huggingface.co/TheBloke/koala-7b-ggml-unquantized)
+* [GPTQ quantized 4bit 7B model in `pt` and `safetensors` formats](https://huggingface.co/TheBloke/koala-7B-GPTQ-4bit-128g)
+* [GPTQ quantized 4bit 7B model in GGML format for `llama.cpp`](https://huggingface.co/TheBloke/koala-7B-GPTQ-4bit-128g-GGML)
 
-The following commands were run to produce this repo:
+## How the Koala delta weights were merged
+
+The Koala delta weights were merged using the following commands:
 ```
 git clone https://github.com/young-geng/EasyLM
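The EasyLM commands in the diff are truncated here. Conceptually, merging delta weights means adding each released delta tensor to the corresponding tensor of the base LLaMA checkpoint. The following is a minimal sketch of that idea using hypothetical toy NumPy arrays — it is not the actual EasyLM implementation, which operates on full LLaMA checkpoints:

```python
import numpy as np

# Hypothetical base and delta weights keyed by parameter name.
# Real Koala merging is done by EasyLM over full LLaMA checkpoints.
base_weights = {"layer0.w": np.array([[1.0, 2.0], [3.0, 4.0]])}
delta_weights = {"layer0.w": np.array([[0.1, -0.2], [0.0, 0.5]])}

def merge_deltas(base, delta):
    """Element-wise sum of matching tensors: merged = base + delta."""
    return {name: base[name] + delta[name] for name in base}

merged = merge_deltas(base_weights, delta_weights)
print(merged["layer0.w"])  # each entry is the base value plus its delta
```

Releasing deltas rather than full weights is how the Berkeley team avoided redistributing the original LLaMA weights directly; the merge step reconstructs the usable model locally.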