---
base_model:
- TheBloke/Llama-2-13B-fp16
tags:
- mergekit
- merge
license: cc-by-nc-4.0
quantized_by: bartowski
pipeline_tag: text-generation
---

## Exllama v2 Quantizations of LLaMA2-13B-Estopia

Using turboderp's ExLlamaV2 v0.0.13 for quantization.

Each branch contains a different bits-per-weight quantization. The `main` branch contains only the `measurement.json` (used for further conversions), so download one of the other branches for the model itself (see the table below).

Original model: https://huggingface.co/KoboldAI/LLaMA2-13B-Estopia

No GQA - VRAM requirements will be higher

| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | Description |
| ------ | ---- | ------------ | --------- | ---------- | ----------- |
| [6_5](https://huggingface.co/bartowski/LLaMA2-13B-Estopia-exl2/tree/6_5) | 6.5 | 8.0 | 14.4 GB | 24.0 GB | Near-unquantized performance at vastly reduced size, **recommended**. |
| [5_0](https://huggingface.co/bartowski/LLaMA2-13B-Estopia-exl2/tree/5_0) | 5.0 | 6.0 | 12.1 GB | 21.7 GB | Slightly lower quality than 6.5; can fit on a 12 GB card at reduced context. |
| [4_25](https://huggingface.co/bartowski/LLaMA2-13B-Estopia-exl2/tree/4_25) | 4.25 | 6.0 | 10.9 GB | 20.5 GB | GPTQ-equivalent bits per weight. |
| [3_75](https://huggingface.co/bartowski/LLaMA2-13B-Estopia-exl2/tree/3_75) | 3.75 | 6.0 | 10.1 GB | 19.7 GB | Lower quality but still generally usable. |
| [3_0](https://huggingface.co/bartowski/LLaMA2-13B-Estopia-exl2/tree/3_0) | 3.0 | 6.0 | 9.1 GB | 18.7 GB | Very low quality, not recommended unless you have to. |

VRAM requirements are listed for both 4k and 16k context because, without GQA, the difference between them is massive (9.6 GB).

## Download instructions

With git:

```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/LLaMA2-13B-Estopia-exl2 LLaMA2-13B-Estopia-exl2-6_5
```

With huggingface hub (credit to TheBloke for instructions):

```shell
pip3 install huggingface-hub
```

To download the `main` branch (only useful if you only care about the `measurement.json`) to a folder called `LLaMA2-13B-Estopia-exl2`:

```shell
mkdir LLaMA2-13B-Estopia-exl2
huggingface-cli download bartowski/LLaMA2-13B-Estopia-exl2 --local-dir LLaMA2-13B-Estopia-exl2 --local-dir-use-symlinks False
```

To download from a different branch, add the `--revision` parameter.

Linux:

```shell
mkdir LLaMA2-13B-Estopia-exl2-6_5
huggingface-cli download bartowski/LLaMA2-13B-Estopia-exl2 --revision 6_5 --local-dir LLaMA2-13B-Estopia-exl2-6_5 --local-dir-use-symlinks False
```

Windows (which sometimes has trouble with `_` in folder names):

```shell
mkdir LLaMA2-13B-Estopia-exl2-6.5
huggingface-cli download bartowski/LLaMA2-13B-Estopia-exl2 --revision 6_5 --local-dir LLaMA2-13B-Estopia-exl2-6.5 --local-dir-use-symlinks False
```

For a scripted download and a quick loading example in Python, see the sketches at the end of this card.

Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
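The CLI commands above can also be scripted with the `huggingface_hub` Python package (installed by the same `pip3 install huggingface-hub`). A minimal sketch; the branch and folder name simply mirror the Linux example above:

```python
from huggingface_hub import snapshot_download

# Download the 6_5 branch into a local folder (equivalent to the CLI example above).
snapshot_download(
    repo_id="bartowski/LLaMA2-13B-Estopia-exl2",
    revision="6_5",
    local_dir="LLaMA2-13B-Estopia-exl2-6_5",
    local_dir_use_symlinks=False,
)
```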
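Once a branch is downloaded, it can be loaded with the `exllamav2` Python package. A minimal generation sketch, assuming the 6_5 branch sits in `LLaMA2-13B-Estopia-exl2-6_5` as above; the context length, prompt, and sampler values are illustrative only, not recommendations:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "LLaMA2-13B-Estopia-exl2-6_5"
config.prepare()
config.max_seq_len = 4096  # 4k context keeps VRAM use near the table's first column

model = ExLlamaV2(config)
model.load()  # model.load(gpu_split=[...]) can spread the weights across GPUs

tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)  # no GQA, so the cache dominates VRAM at long context
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Once upon a time,", settings, num_tokens=200))
```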