---
license: gpl-3.0
language:
- en
- zh
- ja
- de
datasets:
- JosephusCheung/GuanacoDataset
- meta-math/MetaMathQA
- jondurbin/airoboros-3.1
- WizardLM/WizardLM_evol_instruct_V2_196k
- RyokoAI/ShareGPT52K
- RyokoAI/Fandom23K
- milashkaarshif/MoeGirlPedia_wikitext_raw_archive
- wikipedia
- wiki_lingua
- garage-bAInd/Open-Platypus
- LDJnr/Puffin
- BAAI/COIG
- TigerResearch/tigerbot-zhihu-zh-10k
- liwu/MNBVC
- teknium/openhermes
- CausalLM/Refined-Anime-Text
- microsoft/orca-math-word-problems-200k
- m-a-p/CodeFeedback-Filtered-Instruction
quantized_by: bartowski
pipeline_tag: text-generation
---

## Exllama v2 Quantizations of 35b-beta-long

Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.20">turboderp's ExLlamaV2 v0.0.20</a> for quantization.

<b>The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)</b>

Each branch contains a quantization at a different bits-per-weight level; the `main` branch contains only the measurement.json used for further conversions.

Conversion was done using the default calibration dataset.

Default arguments were used, except when the bits per weight is above 6.0; in that case the lm_head layer is quantized at 8 bits per weight instead of the default 6.
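
For reference, a conversion along these lines can be reproduced with ExLlamaV2's convert.py. This is a rough sketch for the 6.5 bpw branch, not the exact command used here; the paths are placeholders, and `-hb 8` is the flag that raises the lm_head quantization to 8 bits:

```shell
# Hypothetical paths; run from a clone of https://github.com/turboderp/exllamav2
python convert.py \
  -i /path/to/35b-beta-long \
  -o /path/to/working-dir \
  -cf /path/to/35b-beta-long-exl2-6_5 \
  -b 6.5 \
  -hb 8
```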

Original model: https://huggingface.co/CausalLM/35b-beta-long


<a href="https://huggingface.co/bartowski/35b-beta-long-exl2/tree/8_0">8.0 bits per weight</a>

<a href="https://huggingface.co/bartowski/35b-beta-long-exl2/tree/6_5">6.5 bits per weight</a>

<a href="https://huggingface.co/bartowski/35b-beta-long-exl2/tree/5_0">5.0 bits per weight</a>

<a href="https://huggingface.co/bartowski/35b-beta-long-exl2/tree/4_25">4.25 bits per weight</a>

<a href="https://huggingface.co/bartowski/35b-beta-long-exl2/tree/3_5">3.5 bits per weight</a>

<a href="https://huggingface.co/bartowski/35b-beta-long-exl2/tree/3_0">3.0 bits per weight</a>


## Download instructions

With git:

```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/35b-beta-long-exl2
```
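The weights are stored via Git LFS, so if a plain clone only fetches small pointer files, setting up LFS support first should fix it:

```shell
# One-time setup so git clone pulls the actual model weights
git lfs install
```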

With the huggingface-hub CLI (credit to TheBloke for the instructions):

```shell
pip3 install huggingface-hub
```

To download the `main` branch (only useful if you just want the measurement.json) to a folder called `35b-beta-long-exl2`:

```shell
mkdir 35b-beta-long-exl2
huggingface-cli download bartowski/35b-beta-long-exl2 --local-dir 35b-beta-long-exl2 --local-dir-use-symlinks False
```

To download from a different branch, add the `--revision` parameter:

Linux:

```shell
mkdir 35b-beta-long-exl2-6_5
huggingface-cli download bartowski/35b-beta-long-exl2 --revision 6_5 --local-dir 35b-beta-long-exl2-6_5 --local-dir-use-symlinks False
```

Windows (which sometimes has trouble with `_` in folder names):

```shell
mkdir 35b-beta-long-exl2-6.5
huggingface-cli download bartowski/35b-beta-long-exl2 --revision 6_5 --local-dir 35b-beta-long-exl2-6.5 --local-dir-use-symlinks False
```
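
Once a branch is downloaded, any ExLlamaV2-compatible loader should work. As an untested sketch, the chat example bundled with the exllamav2 repo can point at the local folder; this assumes ChatML is the correct prompt format for this model, so adjust `-mode` if needed:

```shell
# Run from a clone of https://github.com/turboderp/exllamav2
# -m points at the downloaded quantization; -mode selects the prompt template
python examples/chat.py -m ./35b-beta-long-exl2-6_5 -mode chatml
```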