---
language:
- en
tags:
- sft
- StableLM
license:
- mit
datasets:
- LDJnr/LessWrong-Amplify-Instruct
- LDJnr/Pure-Dove
- LDJnr/Verified-Camel
quantized_by: bartowski
---

## Exllama v2 Quantizations of Nous-Capybara-7B-V1.9

Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.7">turboderp's ExLlamaV2 v0.0.7</a> for quantization.

Each branch contains an individual bits-per-weight quantization.

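As a rough, illustrative guide (a back-of-envelope assumption, not an official size table), the download size of a branch can be estimated from the parameter count and its bits per weight:

```python
# Back-of-envelope estimate of the quantized weight payload.
# Assumed formula: parameters * bits-per-weight / 8 bytes; it ignores
# file overhead and any tensors kept at higher precision, so actual
# branch sizes will differ somewhat.
def estimated_weight_gb(n_params: float, bpw: float) -> float:
    return n_params * bpw / 8 / 1e9

# A 7B model on the 4.0 bits-per-weight branch:
print(round(estimated_weight_gb(7e9, 4.0), 1))  # prints 3.5
```

Lower bits per weight means a smaller download and less VRAM, at some cost in quality.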
Conversion was done using wikitext.parquet as the calibration dataset.

Original model: https://huggingface.co/NousResearch/Nous-Capybara-7B-V1.9

## Download instructions

With git:

```shell
git clone --single-branch --branch 4.0 https://huggingface.co/bartowski/Nous-Capybara-7B-V1.9-exl2
```

With huggingface hub (credit to TheBloke for instructions):

```shell
pip3 install huggingface-hub
```

To download the `main` branch (only useful if you only care about measurement.json) to a folder called `Nous-Capybara-7B-V1.9-exl2`:

```shell
mkdir Nous-Capybara-7B-V1.9-exl2
huggingface-cli download bartowski/Nous-Capybara-7B-V1.9-exl2 --local-dir Nous-Capybara-7B-V1.9-exl2 --local-dir-use-symlinks False
```

To download from a different branch, add the `--revision` parameter:

```shell
mkdir Nous-Capybara-7B-V1.9-exl2
huggingface-cli download bartowski/Nous-Capybara-7B-V1.9-exl2 --revision 4.0 --local-dir Nous-Capybara-7B-V1.9-exl2 --local-dir-use-symlinks False
```
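For scripting across several branches, the invocation above can be assembled programmatically. This is a hypothetical helper (`download_command` is an illustrative name, not part of huggingface-hub) that mirrors the command shown:

```python
# Hypothetical helper (not part of huggingface-hub): builds the same
# huggingface-cli invocation shown above for a given branch/revision.
def download_command(repo: str, revision: str, local_dir: str) -> str:
    return (
        f"huggingface-cli download {repo} "
        f"--revision {revision} "
        f"--local-dir {local_dir} --local-dir-use-symlinks False"
    )

print(download_command(
    "bartowski/Nous-Capybara-7B-V1.9-exl2", "4.0",
    "Nous-Capybara-7B-V1.9-exl2",
))
```

Swap the revision string for whichever bits-per-weight branch you want, then run the printed command in a shell.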