---
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- llama
- llama 2
---
## About

weighted/imatrix quants of https://huggingface.co/Doctor-Shotgun/lzlv-limarpv3-l2-70b

<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
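
For the multi-part quants in the table below (the Q6_K quant ships as `.part1of2`/`.part2of2` files), here is a minimal concatenation sketch in Python; the part files are assumed to already be downloaded into the current directory:

```python
# Minimal sketch: join the two Q6_K part files into a single GGUF.
# File names are taken from the table below; adjust paths as needed.
from pathlib import Path

parts = [
    Path("lzlv-limarpv3-l2-70b.i1-Q6_K.gguf.part1of2"),
    Path("lzlv-limarpv3-l2-70b.i1-Q6_K.gguf.part2of2"),
]
out = Path("lzlv-limarpv3-l2-70b.i1-Q6_K.gguf")

with out.open("wb") as dst:
    for part in parts:
        with part.open("rb") as src:
            # Stream in 16 MiB chunks so the ~57 GB file never sits in RAM.
            while chunk := src.read(16 * 1024 * 1024):
                dst.write(chunk)
```

The resulting `.gguf` then loads like any single-file quant.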

## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ1_S.gguf) | i1-IQ1_S | 15.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 18.7 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ2_XS.gguf) | i1-IQ2_XS | 20.8 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ2_M.gguf) | i1-IQ2_M | 23.7 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q2_K.gguf) | i1-Q2_K | 25.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 27.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ3_XS.gguf) | i1-IQ3_XS | 28.6 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q3_K_XS.gguf) | i1-Q3_K_XS | 28.7 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q3_K_S.gguf) | i1-Q3_K_S | 30.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q3_K_M.gguf) | i1-Q3_K_M | 33.7 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q3_K_L.gguf) | i1-Q3_K_L | 36.6 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-IQ4_XS.gguf) | i1-IQ4_XS | 37.2 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q4_K_S.gguf) | i1-Q4_K_S | 39.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q4_K_M.gguf) | i1-Q4_K_M | 41.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q5_K_S.gguf) | i1-Q5_K_S | 47.9 |  |
| [GGUF](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q5_K_M.gguf) | i1-Q5_K_M | 49.2 |  |
| [PART 1](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF/resolve/main/lzlv-limarpv3-l2-70b.i1-Q6_K.gguf.part2of2) | i1-Q6_K | 57.0 | practically like static Q6_K |
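
As a minimal sketch, a single quant can be fetched with the `huggingface_hub` Python package (the Q4_K_M file name is taken from the table above; `pip install huggingface_hub` and roughly 42 GB of free disk space are assumed):

```python
# Minimal sketch: download the Q4_K_M quant into the local Hugging Face cache.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="mradermacher/lzlv-limarpv3-l2-70b-i1-GGUF",
    filename="lzlv-limarpv3-l2-70b.i1-Q4_K_M.gguf",
)
print(path)  # local path to the downloaded GGUF
```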


Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.

<!-- end -->