bartowski committed
Commit 5c86604
Parent: c187d82

Main branch

Files changed (2):
  1. README.md +54 -0
  2. measurement.json +0 -0
README.md ADDED
---
license: cc-by-nc-4.0
datasets:
- berkeley-nest/Nectar
language:
- en
library_name: transformers
tags:
- reward model
- RLHF
- RLAIF
quantized_by: bartowski
pipeline_tag: text-generation
---

## Exllama v2 Quantizations of Starling-LM-7B-alpha

Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.9">turboderp's ExLlamaV2 v0.0.9</a> for quantization.

Each branch contains a single bits-per-weight quantization, with the `main` branch containing only the measurement.json needed for further conversions.

Conversion was done using wikitext-103-raw-v1-test.parquet as the calibration dataset.

Default arguments were used, except when the bits per weight is above 6.0; in that case the lm_head layer is quantized at 8 bits per weight instead of the default 6.

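For reference, a conversion along these lines would look roughly like the sketch below. This is not the exact command used for this repo: the paths are placeholders, and the flag names (`-i`, `-o`, `-cf`, `-c`, `-b`, `-hb`, `-m`) follow ExLlamaV2's `convert.py` as I understand it and may differ between releases, so verify against `python convert.py -h` for your checkout.

```shell
# Rough sketch of an ExLlamaV2 quantization run; paths are placeholders and
# flags should be checked against your ExLlamaV2 version (python convert.py -h).
#   -i  : directory with the original fp16 model
#   -o  : working directory for intermediate files
#   -cf : output directory for the finished quantized model
#   -c  : calibration dataset (the wikitext parquet mentioned above)
#   -b  : target bits per weight
#   -hb : bits for the lm_head layer (8 here, since the target bpw is above 6.0)
#   -m  : reuse an existing measurement.json (what this repo's main branch provides)
python convert.py \
    -i /path/to/Starling-LM-7B-alpha \
    -o /path/to/exl2-workdir \
    -cf /path/to/Starling-LM-7B-alpha-exl2-6_5 \
    -c wikitext-103-raw-v1-test.parquet \
    -b 6.5 \
    -hb 8 \
    -m measurement.json
```
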
Original model: https://huggingface.co/berkeley-nest/Starling-LM-7B-alpha

## Download instructions

With git:

```shell
git clone --single-branch --branch 4_0 https://huggingface.co/bartowski/Starling-LM-7B-alpha-exl2
```

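The `4_0` branch above is just one of the available bits-per-weight levels; to see every branch the repo offers before cloning, one option is a plain `git ls-remote`:

```shell
# List the repo's branches (one per bits-per-weight level) without cloning anything.
git ls-remote --heads https://huggingface.co/bartowski/Starling-LM-7B-alpha-exl2
```
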
With the huggingface-hub CLI (credit to TheBloke for instructions):

```shell
pip3 install huggingface-hub
```

To download the `main` branch (only useful if you only care about measurement.json) to a folder called `Starling-LM-7B-alpha-exl2`:

```shell
mkdir Starling-LM-7B-alpha-exl2
huggingface-cli download bartowski/Starling-LM-7B-alpha-exl2 --local-dir Starling-LM-7B-alpha-exl2 --local-dir-use-symlinks False
```

To download from a different branch, add the `--revision` parameter:

```shell
mkdir Starling-LM-7B-alpha-exl2
huggingface-cli download bartowski/Starling-LM-7B-alpha-exl2 --revision 4_0 --local-dir Starling-LM-7B-alpha-exl2 --local-dir-use-symlinks False
```
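If you want to keep several bits-per-weight levels side by side, one pattern is to fold the revision into the folder name. The `6_5` below is only a hypothetical branch name used for illustration; substitute one that actually exists in this repo (see the branch list above).

```shell
# Download a hypothetical 6_5 branch into its own folder; swap in a real branch name.
mkdir Starling-LM-7B-alpha-exl2-6_5
huggingface-cli download bartowski/Starling-LM-7B-alpha-exl2 --revision 6_5 --local-dir Starling-LM-7B-alpha-exl2-6_5 --local-dir-use-symlinks False
```
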
measurement.json ADDED
The diff for this file is too large to render.