---
license: apache-2.0
base_model: 01-ai/Yi-1.5-34B
tags:
- generated_from_trainer
- axolotl
datasets:
- cognitivecomputations/Dolphin-2.9
- teknium/OpenHermes-2.5
- m-a-p/CodeFeedback-Filtered-Instruction
- cognitivecomputations/dolphin-coder
- cognitivecomputations/samantha-data
- microsoft/orca-math-word-problems-200k
- Locutusque/function-calling-chatml
- internlm/Agent-FLAN
quantized_by: bartowski
pipeline_tag: text-generation
---
## Exllama v2 Quantizations of dolphin-2.9.1-yi-1.5-34b
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.21">turboderp's ExLlamaV2 v0.0.21</a> for quantization.
<b>The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)</b>
Each branch contains an individual bits per weight, with the main one containing only the meaurement.json for further conversions.
Original model: https://huggingface.co/cognitivecomputations/dolphin-2.9.1-yi-1.5-34b
## Prompt format
```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
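If you are building the prompt string yourself rather than relying on a chat template, a minimal helper like the sketch below assembles this format (the function name and example strings are illustrative):
```python
# Minimal sketch: assemble the ChatML-style prompt shown above.
# Only the special tokens come from the format; everything else is illustrative.
def build_prompt(system_prompt: str, prompt: str) -> str:
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

print(build_prompt("You are Dolphin, a helpful assistant.", "Hello!"))
```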
## Available sizes
| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
| ------ | ---- | ------------ | ---- | ---- | ---- | ----------- |
| [8_0](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/8_0) | 8.0 | 8.0 | 34.9 GB | 37.6 GB | 41.6 GB | Max quality producible by ExLlamaV2, generally unneeded but the highest fidelity available. |
| [6_5](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/6_5) | 6.5 | 8.0 | 28.9 GB | 31.6 GB | 35.6 GB | Near unquantized performance at vastly reduced size, **recommended**. |
| [5_0](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/5_0) | 5.0 | 8.0 | 22.6 GB | 25.3 GB | 29.3 GB | Very high quality, usable at 4k context on 24 GB cards. |
| [4_25](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/4_25) | 4.25 | 6.0 | 19.5 GB | 22.2 GB | 26.2 GB | GPTQ equivalent bits per weight, slightly higher quality. |
| [3_5](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/3_5) | 3.5 | 6.0 | 16.5 GB | 19.2 GB | 23.2 GB | Lower quality, only use if you have to. |
| [3_0](https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2/tree/3_0) | 3.0 | 6.0 | 14.3 GB | 17.0 GB | 21.0 GB | Very low quality, usable with 16 GB of VRAM. |
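Once you've downloaded a branch (instructions below), loading it with the exllamav2 Python package might look roughly like this sketch. The local directory name, context length, and sampler settings are assumptions; the examples in the exllamav2 repository are the authoritative reference.
```python
# Sketch: load an exl2 branch with the exllamav2 Python package (pip install exllamav2).
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "dolphin-2.9.1-yi-1.5-34b-exl2-6_5"  # assumed local download path
config.prepare()
config.max_seq_len = 4096  # pick the context length you budgeted VRAM for (see table)

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPU memory

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # illustrative sampling value

prompt = "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n"
print(generator.generate_simple(prompt, settings, 200))
```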
## Download instructions
With git:
```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/dolphin-2.9.1-yi-1.5-34b-exl2 dolphin-2.9.1-yi-1.5-34b-exl2-6_5
```
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
To download a specific branch, use the `--revision` parameter. For example, to download the 6.5 bpw branch:
Linux:
```shell
huggingface-cli download bartowski/dolphin-2.9.1-yi-1.5-34b-exl2 --revision 6_5 --local-dir dolphin-2.9.1-yi-1.5-34b-exl2-6_5 --local-dir-use-symlinks False
```
Windows (which sometimes doesn't handle `_` in folder names well):
```shell
huggingface-cli download bartowski/dolphin-2.9.1-yi-1.5-34b-exl2 --revision 6_5 --local-dir dolphin-2.9.1-yi-1.5-34b-exl2-6.5 --local-dir-use-symlinks False
```
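If you'd rather script the download, the equivalent call through the huggingface_hub Python API is short (the local directory name is just an assumption):
```python
# Sketch: the same branch download via the huggingface_hub Python API.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="bartowski/dolphin-2.9.1-yi-1.5-34b-exl2",
    revision="6_5",  # branch name == bits-per-weight variant from the table above
    local_dir="dolphin-2.9.1-yi-1.5-34b-exl2-6_5",  # assumed local path
)
```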
Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski