Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

Tippy-Toppy-7b - GGUF
- Model creator: https://huggingface.co/Azazelle/
- Original model: https://huggingface.co/Azazelle/Tippy-Toppy-7b/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Tippy-Toppy-7b.Q2_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q2_K.gguf) | Q2_K | 2.53GB |
| [Tippy-Toppy-7b.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [Tippy-Toppy-7b.IQ3_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [Tippy-Toppy-7b.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [Tippy-Toppy-7b.IQ3_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [Tippy-Toppy-7b.Q3_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K.gguf) | Q3_K | 3.28GB |
| [Tippy-Toppy-7b.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [Tippy-Toppy-7b.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [Tippy-Toppy-7b.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [Tippy-Toppy-7b.Q4_0.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_0.gguf) | Q4_0 | 3.83GB |
| [Tippy-Toppy-7b.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [Tippy-Toppy-7b.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [Tippy-Toppy-7b.Q4_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_K.gguf) | Q4_K | 4.07GB |
| [Tippy-Toppy-7b.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [Tippy-Toppy-7b.Q4_1.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q4_1.gguf) | Q4_1 | 4.24GB |
| [Tippy-Toppy-7b.Q5_0.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_0.gguf) | Q5_0 | 4.65GB |
| [Tippy-Toppy-7b.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [Tippy-Toppy-7b.Q5_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_K.gguf) | Q5_K | 4.78GB |
| [Tippy-Toppy-7b.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [Tippy-Toppy-7b.Q5_1.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q5_1.gguf) | Q5_1 | 5.07GB |
| [Tippy-Toppy-7b.Q6_K.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q6_K.gguf) | Q6_K | 5.53GB |
| [Tippy-Toppy-7b.Q8_0.gguf](https://huggingface.co/RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf/blob/main/Tippy-Toppy-7b.Q8_0.gguf) | Q8_0 | 7.17GB |

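A minimal sketch of downloading one of these quants and running it locally, assuming the `huggingface_hub` and `llama-cpp-python` packages are installed; the quant choice, context size, and prompt below are only examples:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one quant from this repo (Q4_K_M is a common size/quality trade-off).
model_path = hf_hub_download(
    repo_id="RichardErkhov/Azazelle_-_Tippy-Toppy-7b-gguf",
    filename="Tippy-Toppy-7b.Q4_K_M.gguf",
)

# Load the GGUF file and generate a short completion.
llm = Llama(model_path=model_path, n_ctx=2048)
out = llm("Write a short story about a clever fox.", max_tokens=64)
print(out["choices"][0]["text"])
```
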
Original model description:
---
pipeline_tag: text-generation
tags:
- mistral
- merge
license: cc-by-4.0
---
# Model Card for Tippy-Toppy-7b
<!-- Provide a quick summary of what the model is/does. -->
DARE merge intended to be built on Toppy-M-7b.

The .yaml file used with mergekit:
```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: Undi95/Toppy-M-7B #175
    parameters:
      weight: 0.54
      density: 0.81
  - model: PistachioAlt/Noromaid-Bagel-7B-Slerp #75
    parameters:
      weight: 0.23
      density: 0.61
  - model: OpenPipe/mistral-ft-optimized-1227 #100
    parameters:
      weight: 0.31
      density: 0.68
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```
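To reproduce the merge itself, a config like this is normally passed to mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml config.yaml ./merged-model` with mergekit installed; the config and output paths here are placeholders.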