pek111 committed on
Commit 5a4176e
Parent: 47d013c

Update README.md

Files changed (1)
  1. README.md +13 -13
README.md CHANGED
@@ -93,19 +93,19 @@ Refer to the Provided Files table below to see what files use which methods, and

  | Name | Quant method | Bits | Size | Use case |
  | ---- | ---- | ---- | ---- | ---- |
- | [tc-instruct-dpo.Q2_K.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q2_K.gguf) | Q2_K | 2 | 2.88 GB | smallest, significant quality loss - not recommended for most purposes |
- | [tc-instruct-dpo.Q3_K_S.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q3_K_S.gguf) | Q3_K_S | 3 | 2.96 GB | very small, high quality loss |
- | [tc-instruct-dpo.Q3_K_M.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q3_K_M.gguf) | Q3_K_M | 3 | 3.29 GB | very small, high quality loss |
- | [tc-instruct-dpo.Q3_K_L.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q3_K_L.gguf) | Q3_K_L | 3 | 3.57 GB | small, substantial quality loss |
- | [tc-instruct-dpo.Q4_0.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q4_0.gguf) | Q4_0 | 4 | 3.84 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
- | [tc-instruct-dpo.Q4_K_S.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q4_K_S.gguf) | Q4_K_S | 4 | 3.87 GB | small, greater quality loss |
- | [tc-instruct-dpo.Q4_K_M.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q4_K_M.gguf) | Q4_K_M | 4 | 4.08 GB | medium, balanced quality - recommended |
- | [tc-instruct-dpo.Q5_0.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q5_0.gguf) | Q5_0 | 5 | 4.67 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
- | [tc-instruct-dpo.Q5_K_S.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q5_K_S.gguf) | Q5_K_S | 5 | 4.67 GB | large, low quality loss - recommended |
- | [tc-instruct-dpo.Q5_K_M.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q5_K_M.gguf) | Q5_K_M | 5 | 4.79 GB | large, very low quality loss - recommended |
- | [tc-instruct-dpo.Q6_K.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q6_K.gguf) | Q6_K | 6 | 5.55 GB | very large, extremely low quality loss |
- | [tc-instruct-dpo.Q8_0.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q8_0.gguf) | Q8_0 | 8 | 7.19 GB | very large, extremely low quality loss - not recommended |
- | [tc-instruct-dpo.QF16.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/blob/main/tc-instruct-dpo.Q8_0.gguf) | F16 | 16 | 13.53 GB | largest, original quality - not recommended |
+ | [tc-instruct-dpo.Q2_K.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q2_K.gguf) | Q2_K | 2 | 2.88 GB | smallest, significant quality loss - not recommended for most purposes |
+ | [tc-instruct-dpo.Q3_K_S.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q3_K_S.gguf) | Q3_K_S | 3 | 2.96 GB | very small, high quality loss |
+ | [tc-instruct-dpo.Q3_K_M.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q3_K_M.gguf) | Q3_K_M | 3 | 3.29 GB | very small, high quality loss |
+ | [tc-instruct-dpo.Q3_K_L.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q3_K_L.gguf) | Q3_K_L | 3 | 3.57 GB | small, substantial quality loss |
+ | [tc-instruct-dpo.Q4_0.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q4_0.gguf) | Q4_0 | 4 | 3.84 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
+ | [tc-instruct-dpo.Q4_K_S.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q4_K_S.gguf) | Q4_K_S | 4 | 3.87 GB | small, greater quality loss |
+ | [tc-instruct-dpo.Q4_K_M.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q4_K_M.gguf) | Q4_K_M | 4 | 4.08 GB | medium, balanced quality - recommended |
+ | [tc-instruct-dpo.Q5_0.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q5_0.gguf) | Q5_0 | 5 | 4.67 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [tc-instruct-dpo.Q5_K_S.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q5_K_S.gguf) | Q5_K_S | 5 | 4.67 GB | large, low quality loss - recommended |
+ | [tc-instruct-dpo.Q5_K_M.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q5_K_M.gguf) | Q5_K_M | 5 | 4.79 GB | large, very low quality loss - recommended |
+ | [tc-instruct-dpo.Q6_K.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q6_K.gguf) | Q6_K | 6 | 5.55 GB | very large, extremely low quality loss |
+ | [tc-instruct-dpo.Q8_0.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.Q8_0.gguf) | Q8_0 | 8 | 7.19 GB | very large, extremely low quality loss - not recommended |
+ | [tc-instruct-dpo.F16.gguf](https://huggingface.co/pek111/TC-instruct-DPO-GGUF/resolve/main/tc-instruct-dpo.F16.gguf) | F16 | 16 | 13.53 GB | largest, original quality - not recommended |

  ## How to download GGUF files
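The new table replaces the `/blob/` viewer links with `/resolve/` links, which serve the raw GGUF files and can therefore be fetched directly. As a minimal sketch (assuming the `huggingface_hub` Python package is installed; the repo ID and the Q4_K_M filename are taken from the table above), one of the quantized files could be downloaded like this:

```python
# Minimal sketch: fetch one quantized GGUF file from the repo listed in the table.
# Assumes `huggingface_hub` is installed (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

# Q4_K_M is the variant the table marks as "recommended".
local_path = hf_hub_download(
    repo_id="pek111/TC-instruct-DPO-GGUF",
    filename="tc-instruct-dpo.Q4_K_M.gguf",
)
print(f"Downloaded to {local_path}")
```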