|
--- |
|
datasets: |
|
- Norquinal/claude_multiround_chat_1k |
|
license: cc-by-nc-4.0 |
|
--- |
|
|
|
## Exllama v2 Quantizations of Mistral-7B-claude-chat |
|
|
|
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.6">turboderp's ExLlamaV2 v0.0.6</a> for quantization. |
|
|
|
Each branch contains an individual bits per weight.
|
|
|
Conversion was done using wikitext.parquet as the calibration dataset.
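
For reference, a conversion like this is typically run with ExLlamaV2's `convert.py` script. The command below is only a sketch of that workflow: the paths are illustrative and the flags reflect the v0.0.x interface, not the exact invocation used for these quants.

```shell
# Sketch of an ExLlamaV2 quantization run (paths and flags are illustrative):
#   -i  : directory containing the original fp16 model
#   -o  : working directory for intermediate files
#   -cf : output directory for the compiled quantized model
#   -c  : calibration dataset (here, wikitext.parquet)
#   -b  : target bits per weight (e.g. 4.0, 6.0 or 8.0)
python convert.py \
  -i ./Mistral-7B-claude-chat \
  -o ./work \
  -cf ./Mistral-7B-claude-chat-exl2-4.0 \
  -c ./wikitext.parquet \
  -b 4.0
```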
|
|
|
Original model: https://huggingface.co/Norquinal/Mistral-7B-claude-chat |
|
|
|
<a href="https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2/tree/4.0">4.0 bits per weight</a> |
|
|
|
<a href="https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2/tree/8.0">8.0 bits per weight</a> |
|
|
|
<a href="https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2/tree/6.0">6.0 bits per weight</a> |
|
|
|
## To download, you can use one of the following commands:
|
|
|
With git: |
|
|
|
```shell |
|
git clone --single-branch --branch 4.0 https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2 |
|
``` |
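
The `--branch` value selects which quantization you get; for example, to clone the 6.0 bits per weight branch into its own folder instead:

```shell
# Clone only the 6.0 bpw branch into a separate directory
git clone --single-branch --branch 6.0 https://huggingface.co/bartowski/Mistral-7B-claude-chat-exl2 Mistral-7B-claude-chat-exl2-6.0
```

Note that the weights are stored via Git LFS, so make sure `git lfs install` has been run before cloning.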
|
|
|
With the huggingface-hub Python library (credit to TheBloke for the instructions):
|
|
|
```shell |
|
pip3 install huggingface-hub |
|
``` |
|
|
|
To download the `main` branch (only useful if you only care about measurement.json) to a folder called `Mistral-7B-claude-chat-exl2`:
|
|
|
```shell |
|
mkdir Mistral-7B-claude-chat-exl2 |
|
huggingface-cli download bartowski/Mistral-7B-claude-chat-exl2 --local-dir Mistral-7B-claude-chat-exl2 --local-dir-use-symlinks False |
|
``` |
|
|
|
To download from a different branch, add the `--revision` parameter: |
|
|
|
```shell |
|
mkdir Mistral-7B-claude-chat-exl2 |
|
huggingface-cli download bartowski/Mistral-7B-claude-chat-exl2 --revision 4.0 --local-dir Mistral-7B-claude-chat-exl2 --local-dir-use-symlinks False |
|
``` |
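
If you want to keep several quantizations side by side, point each `--revision` at its own local folder; for example, for the 8.0 bits per weight branch:

```shell
# Download the 8.0 bpw branch into a separate folder
mkdir Mistral-7B-claude-chat-8.0bpw-exl2
huggingface-cli download bartowski/Mistral-7B-claude-chat-exl2 --revision 8.0 --local-dir Mistral-7B-claude-chat-8.0bpw-exl2 --local-dir-use-symlinks False
```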
|
|