
Exllama v2 Quantizations of Mistral-22B-v0.2

Using turboderp's ExLlamaV2 v0.0.18 for quantization.

The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)

Each branch contains an individual bits per weight, with the main one containing only the meaurement.json for further conversions.

Original model: https://huggingface.co/Vezora/Mistral-22B-v0.2

Prompt Format

### System: {system_prompt}
### Human: {prompt}
### Assistant:
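
As a reference, here is a minimal Python sketch that assembles a prompt in this format (the function and variable names are illustrative, not part of the model):

```python
def build_prompt(system_prompt: str, prompt: str) -> str:
    # Assemble the ### System / ### Human / ### Assistant template shown above.
    return (
        f"### System: {system_prompt}\n"
        f"### Human: {prompt}\n"
        f"### Assistant:"
    )

print(build_prompt("You are a helpful assistant.", "Summarize exl2 quantization in one sentence."))
```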

Available sizes

| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
| ------ | ---- | ------------ | --------- | ---------- | ---------- | ----------- |
| 8_0 | 8.0 | 8.0 | 23.5 GB | 26.0 GB | 29.5 GB | Near unquantized performance, max quality ExLlamaV2 can create. |
| 6_5 | 6.5 | 8.0 | 19.4 GB | 21.9 GB | 25.4 GB | Near unquantized performance at vastly reduced size, recommended. |
| 5_0 | 5.0 | 6.0 | 15.5 GB | 18.0 GB | 21.5 GB | Smaller size, lower quality, still very high performance, recommended. |
| 4_25 | 4.25 | 6.0 | 13.3 GB | 15.8 GB | 19.3 GB | GPTQ-equivalent bits per weight, slightly higher quality. |
| 3_5 | 3.5 | 6.0 | 11.6 GB | 14.1 GB | 17.6 GB | Lower quality, only use if you have to. |
| 3_0 | 3.0 | 6.0 | 9.8 GB | 12.3 GB | 15.8 GB | Very low quality. Usable on 12 GB with low context, or 16 GB with 32k. |
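
If you want to pick a branch programmatically, a rough helper using the VRAM figures from the table above might look like the following (values are approximate and assume the listed context sizes; this is just a convenience sketch, not part of the repo):

```python
# Approximate VRAM requirements (GB) from the table above, keyed by branch and context length.
VRAM_GB = {
    "8_0":  {4096: 23.5, 16384: 26.0, 32768: 29.5},
    "6_5":  {4096: 19.4, 16384: 21.9, 32768: 25.4},
    "5_0":  {4096: 15.5, 16384: 18.0, 32768: 21.5},
    "4_25": {4096: 13.3, 16384: 15.8, 32768: 19.3},
    "3_5":  {4096: 11.6, 16384: 14.1, 32768: 17.6},
    "3_0":  {4096: 9.8,  16384: 12.3, 32768: 15.8},
}

def largest_fitting_branch(vram_gb: float, context: int = 4096) -> str | None:
    # Return the highest-quality branch that fits in the given VRAM, or None if nothing fits.
    for branch in ["8_0", "6_5", "5_0", "4_25", "3_5", "3_0"]:
        if VRAM_GB[branch][context] <= vram_gb:
            return branch
    return None

print(largest_fitting_branch(24.0, context=4096))  # -> "8_0"
```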

Download instructions

With git:

git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Mistral-22B-v0.2-exl2

With huggingface-hub (credit to TheBloke for instructions):

pip3 install huggingface-hub

To download the main branch (only useful if you only care about the measurement.json) to a folder called Mistral-22B-v0.2-exl2:

mkdir Mistral-22B-v0.2-exl2
huggingface-cli download bartowski/Mistral-22B-v0.2-exl2 --local-dir Mistral-22B-v0.2-exl2 --local-dir-use-symlinks False
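
If you prefer to do this from Python, the same download can be done with huggingface_hub's snapshot_download (the local_dir path here is just an example):

```python
from huggingface_hub import snapshot_download

# Downloads the main branch (measurement.json only) to ./Mistral-22B-v0.2-exl2.
# Pass revision="6_5" (or another branch name) to grab a specific quantization instead.
snapshot_download(
    repo_id="bartowski/Mistral-22B-v0.2-exl2",
    local_dir="Mistral-22B-v0.2-exl2",
)
```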

To download from a different branch, add the --revision parameter:

Linux:

mkdir Mistral-22B-v0.2-exl2-6_5
huggingface-cli download bartowski/Mistral-22B-v0.2-exl2 --revision 6_5 --local-dir Mistral-22B-v0.2-exl2-6_5 --local-dir-use-symlinks False

Windows (which apparently doesn't like _ in folders sometimes?):

mkdir Mistral-22B-v0.2-exl2-6.5
huggingface-cli download bartowski/Mistral-22B-v0.2-exl2 --revision 6_5 --local-dir Mistral-22B-v0.2-exl2-6.5 --local-dir-use-symlinks False
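
Once downloaded, the quant can be loaded with the ExLlamaV2 Python API. A minimal sketch, assuming ExLlamaV2 is installed and the model was downloaded to ./Mistral-22B-v0.2-exl2-6_5 (sampler settings are arbitrary, and the exact API may differ slightly between ExLlamaV2 versions):

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point the config at the downloaded quantization and load the model across available GPUs.
config = ExLlamaV2Config()
config.model_dir = "Mistral-22B-v0.2-exl2-6_5"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

# Arbitrary sampling settings for illustration.
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

prompt = "### System: You are a helpful assistant.\n### Human: Hello!\n### Assistant:"
print(generator.generate_simple(prompt, settings, 200))
```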