---
license: other
license_name: yi-license
license_link: LICENSE
widget:
  - example_title: "Yi-34B-Chat"
    text: "hi"
    output:
      text: " Hello! How can I assist you today?"
  - example_title: "Yi-34B"
    text: "There's a place where time stands still. A place of breath taking wonder, but also"
    output:
      text: " an eerie sense that something is just not right…\nBetween the two worlds lies The Forgotten Kingdom - home to creatures long since thought extinct and ancient magic so strong it defies belief! Only here can you find what has been lost for centuries: An Elixir Of Life which will restore youth and vitality if only those who seek its power are brave enough to face up against all manner of dangers lurking in this mysterious land! But beware; some say there may even exist powerful entities beyond our comprehension whose intentions towards humanity remain unclear at best ---- they might want nothing more than destruction itself rather then anything else from their quest after immortality (and maybe someone should tell them about modern medicine)? In any event though – one thing remains true regardless : whether or not success comes easy depends entirely upon how much effort we put into conquering whatever challenges lie ahead along with having faith deep down inside ourselves too ;) So let’s get started now shall We?"
pipeline_tag: text-generation
quantized_by: bartowski
---

## Exllama v2 Quantizations of Yi-9B-200K

Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.15">turboderp's ExLlamaV2 v0.0.15</a> for quantization.

## The "main" branch only contains the measurement.json; download one of the other branches for the model (see below)

Each branch contains a different bits-per-weight quantization; the `main` branch holds only the measurement.json used for further conversions.

Conversion was done using the default calibration dataset.

Default arguments were used, except when the target bits per weight is above 6.0; in that case the lm_head layer is quantized at 8 bits per weight instead of the default 6.
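
The lm_head rule above can be sketched as a small helper that assembles an ExLlamaV2 `convert.py` command line. This is an illustrative sketch, not the exact command used for this repo: the `-i`/`-o`/`-cf`/`-m`/`-b`/`-hb` flags come from the exllamav2 repo, while the directory names and the helper itself are placeholders.

```python
# Hypothetical helper: build an exllamav2 convert.py invocation for a given
# bits-per-weight target. Per the rule above, lm_head is quantized at 8 bpw
# when the target exceeds 6.0, otherwise the exllamav2 default of 6.
def convert_command(bpw, model_dir="Yi-9B-200K", out_dir="Yi-9B-200K-exl2",
                    measurement="measurement.json"):
    head_bits = 8 if bpw > 6.0 else 6
    return [
        "python", "convert.py",
        "-i", model_dir,      # original fp16 model directory
        "-o", "work",         # scratch/working directory
        "-cf", out_dir,       # where the quantized model is written
        "-m", measurement,    # reuse the measurement.json from `main`
        "-b", str(bpw),       # target bits per weight
        "-hb", str(head_bits),  # lm_head bits per weight
    ]

print(" ".join(convert_command(6.5)))
print(" ".join(convert_command(5.0)))
```

Reusing `-m measurement.json` skips the measurement pass, which is why the `main` branch ships only that file.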

Original model: https://huggingface.co/01-ai/Yi-9B-200K


<a href="https://huggingface.co/bartowski/Yi-9B-200K-exl2/tree/8_0">8.0 bits per weight</a>

<a href="https://huggingface.co/bartowski/Yi-9B-200K-exl2/tree/6_5">6.5 bits per weight</a>

<a href="https://huggingface.co/bartowski/Yi-9B-200K-exl2/tree/5_0">5.0 bits per weight</a>

<a href="https://huggingface.co/bartowski/Yi-9B-200K-exl2/tree/4_25">4.25 bits per weight</a>

<a href="https://huggingface.co/bartowski/Yi-9B-200K-exl2/tree/3_5">3.5 bits per weight</a>

## Download instructions

With git:

```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Yi-9B-200K-exl2
```

With huggingface hub (credit to TheBloke for instructions):

```shell
pip3 install huggingface-hub
```

To download the `main` branch (only useful if you only care about measurement.json) to a folder called `Yi-9B-200K-exl2`:

```shell
mkdir Yi-9B-200K-exl2
huggingface-cli download bartowski/Yi-9B-200K-exl2 --local-dir Yi-9B-200K-exl2 --local-dir-use-symlinks False
```

To download from a different branch, add the `--revision` parameter:

Linux:

```shell
mkdir Yi-9B-200K-exl2-6_5
huggingface-cli download bartowski/Yi-9B-200K-exl2 --revision 6_5 --local-dir Yi-9B-200K-exl2-6_5 --local-dir-use-symlinks False
```

Windows (which sometimes doesn't like `_` in folder names):

```shell
mkdir Yi-9B-200K-exl2-6.5
huggingface-cli download bartowski/Yi-9B-200K-exl2 --revision 6_5 --local-dir Yi-9B-200K-exl2-6.5 --local-dir-use-symlinks False
```
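
The branch names follow a simple convention: the bits-per-weight value with the decimal point replaced by an underscore (e.g. `4.25` maps to `4_25`). A minimal sketch of that mapping, handy for scripting downloads over several quantizations; `branch_name` and `download_command` are illustrative helpers, not part of any tool:

```python
# Sketch (assumption): branch (revision) names in this repo are the
# bits-per-weight value with "." replaced by "_", e.g. 4.25 -> "4_25".
def branch_name(bpw: float) -> str:
    return str(bpw).replace(".", "_")

# Build the huggingface-cli command shown above for any listed bpw.
def download_command(bpw: float) -> str:
    rev = branch_name(bpw)
    return (f"huggingface-cli download bartowski/Yi-9B-200K-exl2 "
            f"--revision {rev} --local-dir Yi-9B-200K-exl2-{rev} "
            f"--local-dir-use-symlinks False")

for bpw in (8.0, 6.5, 5.0, 4.25, 3.5):
    print(download_command(bpw))
```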