---
base_model:
- cognitivecomputations/dolphin-2.2-70b
- WizardLM/WizardMath-70B-V1.0
- migtissera/SynthIA-70B-v1.2b
- epfl-llm/meditron-70b
language:
- en
library_name: transformers
license: llama2
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About

weighted/imatrix quants of https://huggingface.co/abacusai/TheProfessor-155b

<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/TheProfessor-155b-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
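In short, multi-part files must be joined back into a single GGUF file before loading. A minimal sketch of the idea (the filenames below are placeholders; in practice you would use the actual `.part1of2`/`.part2of2` or `.split-aa`/`.split-ab` names from the table):

```shell
# Placeholder "downloads" standing in for the real multi-gigabyte parts.
printf 'part-one-' > model.gguf.part1of2
printf 'part-two'  > model.gguf.part2of2

# Join the parts in order into one GGUF file; the result is simply the
# byte-wise concatenation, so its size is the sum of the part sizes.
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

wc -c model.gguf
```

The same `cat` invocation works for three-part files; just list all parts in order.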

## Provided Quants

(sorted by size, not necessarily by quality; IQ-quants are often preferable to similar-sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ1_S.gguf) | i1-IQ1_S | 32.8 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 41.3 |  |
| [GGUF](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ2_XS.gguf) | i1-IQ2_XS | 45.9 |  |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ2_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ2_M.gguf.part2of2) | i1-IQ2_M | 52.3 |  |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q2_K.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q2_K.gguf.split-ab) | i1-Q2_K | 57.1 | IQ3_XXS probably better |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ3_XXS.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ3_XXS.gguf.split-ab) | i1-IQ3_XXS | 60.6 | fast, lower quality |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_XS.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_XS.gguf.split-ab) | i1-Q3_K_XS | 63.2 |  |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ3_XS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ3_XS.gguf.part2of2) | i1-IQ3_XS | 63.3 |  |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_S.gguf.split-ab) | i1-Q3_K_S | 66.9 | IQ3_XS probably better |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ3_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-IQ3_S.gguf.part2of2) | i1-IQ3_S | 67.1 | fast, beats Q3_K* |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_M.gguf.split-ab) | i1-Q3_K_M | 74.7 | IQ3_S probably better |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_L.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q3_K_L.gguf.split-ab) | i1-Q3_K_L | 81.3 | IQ3_M probably better |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q4_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q4_K_S.gguf.split-ab) | i1-Q4_K_S | 88.1 | optimal size/speed/quality |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q4_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q4_K_M.gguf.split-ab) | i1-Q4_K_M | 93.1 | fast, medium quality |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q5_K_S.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q5_K_S.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q5_K_S.gguf.part3of3) | i1-Q5_K_S | 106.7 |  |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q5_K_M.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q5_K_M.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q5_K_M.gguf.part3of3) | i1-Q5_K_M | 109.6 |  |
| [PART 1](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q6_K.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q6_K.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/TheProfessor-155b-i1-GGUF/resolve/main/TheProfessor-155b.i1-Q6_K.gguf.part3of3) | i1-Q6_K | 127.2 | practically like static Q6_K |


Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.

<!-- end -->