apepkuss79 committed on
Commit e69d490 · verified · 1 parent: 6eb0046

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +3 -1
README.md CHANGED
@@ -75,7 +75,9 @@ language:
  | Name | Quant method | Bits | Size | Use case |
  | ---- | ---- | ---- | ---- | ----- |
  | [Mistral-Large-Instruct-2407-Q2_K.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q2_K.gguf) | Q2_K | 2 | 45.2 GB| smallest, significant quality loss - not recommended for most purposes |
- | [Mistral-Large-Instruct-2407-Q3_K_L.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_L.gguf) | Q3_K_L | 3 | 3.83 GB| small, substantial quality loss |
+ | [Mistral-Large-Instruct-2407-Q3_K_L-00001-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_L-00001-of-00003.gguf) | Q3_K_L | 3 | 29.9 GB| small, substantial quality loss |
+ | [Mistral-Large-Instruct-2407-Q3_K_L-00002-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_L-00002-of-00003.gguf) | Q3_K_L | 3 | 29.9 GB| small, substantial quality loss |
+ | [Mistral-Large-Instruct-2407-Q3_K_L-00003-of-00003.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_L-00003-of-00003.gguf) | Q3_K_L | 3 | 4.70 GB| small, substantial quality loss |
  | [Mistral-Large-Instruct-2407-Q3_K_M.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_M.gguf) | Q3_K_M | 3 | 3.52 GB| very small, high quality loss |
  | [Mistral-Large-Instruct-2407-Q3_K_S.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q3_K_S.gguf) | Q3_K_S | 3 | 3.17 GB| very small, high quality loss |
  | [Mistral-Large-Instruct-2407-Q4_0.gguf](https://huggingface.co/second-state/Mistral-Large-Instruct-2407-GGUF/blob/main/Mistral-Large-Instruct-2407-Q4_0.gguf) | Q4_0 | 4 | 4.11 GB| legacy; small, very high quality loss - prefer using Q3_K_M |
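The rows added by this commit replace the single Q3_K_L file with a three-part split GGUF, so all three shards must be fetched before the model can be loaded. A minimal sketch of generating the shard file names (the `gguf_shard_names` helper is hypothetical, not part of this repo; the name pattern follows the `-0000N-of-0000M.gguf` convention visible in the table above):

```python
def gguf_shard_names(stem: str, quant: str, parts: int) -> list[str]:
    """Build the file names of a split GGUF quant, e.g.
    <stem>-<quant>-00001-of-00003.gguf ... -00003-of-00003.gguf."""
    return [
        f"{stem}-{quant}-{i:05d}-of-{parts:05d}.gguf"
        for i in range(1, parts + 1)
    ]

names = gguf_shard_names("Mistral-Large-Instruct-2407", "Q3_K_L", 3)
print(names[0])  # Mistral-Large-Instruct-2407-Q3_K_L-00001-of-00003.gguf

# Each shard could then be fetched with huggingface_hub, e.g.:
# from huggingface_hub import hf_hub_download
# for name in names:
#     hf_hub_download(
#         repo_id="second-state/Mistral-Large-Instruct-2407-GGUF",
#         filename=name,
#     )
```

Loaders that understand split GGUF (such as recent llama.cpp builds) only need to be pointed at the first shard, provided the remaining parts sit in the same directory.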