---
license: apache-2.0
base_model: 01-ai/Yi-1.5-34B
tags:
- generated_from_trainer
- axolotl
datasets:
- cognitivecomputations/Dolphin-2.9
- teknium/OpenHermes-2.5
- m-a-p/CodeFeedback-Filtered-Instruction
- cognitivecomputations/dolphin-coder
- cognitivecomputations/samantha-data
- microsoft/orca-math-word-problems-200k
- Locutusque/function-calling-chatml
- internlm/Agent-FLAN
quantized_by: bartowski
pipeline_tag: text-generation
---

## Exllama v2 Quantizations of dolphin-2.9.1-yi-1.5-34b

Using turboderp's ExLlamaV2 v0.0.21 for quantization.

Each branch contains a quantization at a different bits per weight (see the table below). The `main` branch contains only the `measurement.json`, which is used for further conversions; download one of the other branches to get the model itself.

Original model: https://huggingface.co/cognitivecomputations/dolphin-2.9.1-yi-1.5-34b

## Prompt format

```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```

## Available sizes

| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
| ------ | ---- | ------------ | --------- | ---------- | ---------- | ----------- |
| [8_0](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/8_0) | 8.0 | 8.0 | 34.9 GB | 37.6 GB | 41.6 GB | Maximum quality producible by ExLlamaV2; generally unneeded, but gives peak performance. |
| [6_5](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/6_5) | 6.5 | 8.0 | 28.9 GB | 31.6 GB | 35.6 GB | Near-unquantized performance at vastly reduced size, **recommended**. |
| [5_0](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/5_0) | 5.0 | 8.0 | 22.6 GB | 25.3 GB | 29.3 GB | Very high quality; usable at 4k context on 24 GB. |
| [4_25](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/4_25) | 4.25 | 6.0 | 19.5 GB | 22.2 GB | 26.2 GB | GPTQ-equivalent bits per weight, slightly higher quality. |
| [3_5](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/3_5) | 3.5 | 6.0 | 16.5 GB | 19.2 GB | 23.2 GB | Lower quality; only use if you have to. |
| [3_0](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/3_0) | 3.0 | 6.0 | 14.3 GB | 17.0 GB | 21.0 GB | Very low quality; usable with 16 GB of VRAM. |

## Download instructions

With git:

```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2 dolphin-2.9.1-yi-1.5-34b-exl2-6_5
```

With the huggingface-hub CLI (credit to TheBloke for the instructions):

```shell
pip3 install huggingface-hub
```

To download a specific branch, use the `--revision` parameter. For example, to download the 6.5 bpw branch:

Linux:

```shell
huggingface-cli download bartowski/dolphin-2.9.1-yi-1.5-34b-exl2 --revision 6_5 --local-dir dolphin-2.9.1-yi-1.5-34b-exl2-6_5 --local-dir-use-symlinks False
```

Windows (which apparently sometimes has trouble with `_` in folder names, hence the `.` below):

```shell
huggingface-cli download bartowski/dolphin-2.9.1-yi-1.5-34b-exl2 --revision 6_5 --local-dir dolphin-2.9.1-yi-1.5-34b-exl2-6.5 --local-dir-use-symlinks False
```

Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
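
## Loading the model with ExLlamaV2 (sketch)

The sketch below shows one way to load a downloaded branch from Python and run the prompt format described above. It is a minimal, unofficial example, not part of this card's tested instructions: it assumes the `exllamav2` package is installed, that the 6.5 bpw branch was downloaded to the `dolphin-2.9.1-yi-1.5-34b-exl2-6_5` folder from the git example, and that the class names match exllamav2 releases from the v0.0.21 era; newer releases may expose a different generator API. The system and user messages are purely illustrative.

```python
# Minimal inference sketch (unofficial). Assumes exllamav2 ~v0.0.21 and a
# local copy of the 6.5 bpw branch; adjust model_dir to your download path.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Point the config at the downloaded quantization.
config = ExLlamaV2Config()
config.model_dir = "dolphin-2.9.1-yi-1.5-34b-exl2-6_5"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache so autosplit can size it
model.load_autosplit(cache)               # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

# Build the ChatML-style prompt shown under "Prompt format" above.
prompt = (
    "<|im_start|>system\n"
    "You are Dolphin, a helpful AI assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Write a haiku about quantization.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

# encode_special_tokens=True so <|im_start|>/<|im_end|> tokenize as specials.
output = generator.generate_simple(
    prompt, settings, num_tokens=200, encode_special_tokens=True
)
print(output)
```

Note that `generate_simple` returns the prompt together with the completion, so you may want to strip the prompt prefix before displaying the result.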