---
datasets:
- Norquinal/claude_multiround_chat_1k
license: cc-by-nc-4.0
---
## Exllama v2 Quantizations of Mistral-7B-claude-chat
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.6">turboderp's ExLlamaV2 v0.0.6</a> for quantization.
Each branch contains a different bits-per-weight (BPW) quantization.
Conversion was done using wikitext.parquet as the calibration dataset.
Original model: https://huggingface.co/Norquinal/Mistral-7B-claude-chat
* <a href="https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2/tree/8.0">8.0 bits per weight</a>
* <a href="https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2/tree/6.0">6.0 bits per weight</a>
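
For reference, a quantization along these lines can typically be reproduced with ExLlamaV2's convert.py script. The paths below are placeholders and the exact flags may vary between versions, so treat this as a sketch rather than the exact command used:
```shell
# Sketch only: paths are placeholders and flags may differ across ExLlamaV2
# versions; check `python convert.py --help` before running.
python convert.py \
    -i /path/to/Mistral-7B-claude-chat \
    -o /tmp/exl2-working-dir \
    -cf /path/to/Mistral-7B-claude-chat-6.0bpw-exl2 \
    -c wikitext.parquet \
    -b 6.0
```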
## Download instructions
With git:
```shell
git clone --single-branch --branch 4.0 https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2
```
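Note that the model weights are stored with Git LFS, so make sure it is installed and initialized before cloning:
```shell
git lfs install
```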
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
To download the `main` branch (only useful if you only care about measurement.json) to a folder called `Mistral-7B-claude-chat-exl2`:
```shell
mkdir Mistral-7B-claude-chat-exl2
huggingface-cli download bartowski/Mistral-7B-claude-chat-exl2 --local-dir Mistral-7B-claude-chat-exl2 --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir Mistral-7B-claude-chat-exl2
huggingface-cli download bartowski/Mistral-7B-claude-chat-exl2 --revision 4.0 --local-dir Mistral-7B-claude-chat-exl2 --local-dir-use-symlinks False
```
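After downloading a branch, a quick sanity check can be run with the test_inference.py script from the ExLlamaV2 repository (the script name and flags here are assumptions based on the repository's examples; check its README for current usage):
```shell
# Assumes the ExLlamaV2 repo is cloned and its requirements are installed;
# -m points at the folder containing the downloaded quantized model.
python test_inference.py -m ./Mistral-7B-claude-chat-exl2 -p "Once upon a time,"
```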