Commit 68180e8 by ycros (parent: e059b62): Create README.md
---
base_model:
- mistralai/Mixtral-8x7B-v0.1
- mistralai/Mixtral-8x7B-Instruct-v0.1
- jondurbin/bagel-dpo-8x7b-v0.2
- Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora
- Sao10K/Sensualize-Mixtral-bf16
tags:
- mergekit
- merge
license: cc-by-nc-4.0
---
# BagelMIsteryTour-8x7B-GGUF

These are GGUF quantized versions of [BagelMIsteryTour-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-8x7B).

Bagel, Mixtral Instruct, with extra spices. Give it a taste. It works with Alpaca prompt formats, though the Mistral format should also work.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) as the base.
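At a high level, DARE randomly drops a fraction (1 − density) of each model's task-vector entries and rescales the survivors by 1/density, while TIES elects a sign per parameter and discards contributions that conflict with it. Here is a minimal toy sketch of that combination on flat numpy arrays; it is not mergekit's actual implementation (which works tensor-by-tensor over full checkpoints, with magnitude-weighted sign election), just an illustration of what the `density` and `weight` parameters control:

```python
import numpy as np

def dare_ties_merge(base, models, densities, weights, seed=0):
    """Toy DARE-TIES merge over flat parameter vectors.

    DARE: keep each task-vector entry with probability `density`,
    rescaling survivors by 1/density. TIES: elect a per-parameter sign
    from the summed weighted deltas and keep only agreeing terms.
    """
    rng = np.random.default_rng(seed)
    deltas = []
    for model, density, weight in zip(models, densities, weights):
        task_vector = model - base                       # delta from the base model
        mask = rng.random(task_vector.shape) < density   # random drop (DARE)
        deltas.append(weight * task_vector * mask / density)  # rescale + weight
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))        # majority sign per parameter
    agree = np.sign(stacked) == elected           # terms matching the elected sign
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta
```

With `density: 1.0` nothing is dropped and a single model's delta passes through unchanged; when two deltas disagree in sign with equal magnitude, the elected sign is zero and both contributions are discarded, which is the conflict-resolution behavior TIES is after.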
### Models Merged

The following models were included in the merge:
* [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1)
* [jondurbin/bagel-dpo-8x7b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-8x7b-v0.2)
* [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) + [Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora](https://huggingface.co/Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora)
* [Sao10K/Sensualize-Mixtral-bf16](https://huggingface.co/Sao10K/Sensualize-Mixtral-bf16)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: mistralai/Mixtral-8x7B-v0.1
models:
  - model: mistralai/Mixtral-8x7B-v0.1+Doctor-Shotgun/limarp-zloss-mixtral-8x7b-qlora
    parameters:
      density: 0.5
      weight: 0.2
  - model: Sao10K/Sensualize-Mixtral-bf16
    parameters:
      density: 0.5
      weight: 0.2
  - model: mistralai/Mixtral-8x7B-Instruct-v0.1
    parameters:
      density: 0.6
      weight: 1.0
  - model: jondurbin/bagel-dpo-8x7b-v0.2
    parameters:
      density: 0.6
      weight: 0.5
merge_method: dare_ties
dtype: bfloat16
```
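To reproduce the merge itself (before quantizing to GGUF), saving the YAML above to a file and running mergekit's CLI should work. This is a sketch, assuming a current `mergekit` install; the filename is arbitrary and exact flags may vary by version:

```shell
pip install mergekit

# Save the YAML configuration above as bagelmisterytour.yml, then:
mergekit-yaml bagelmisterytour.yml ./BagelMIsteryTour-8x7B
```

The `model+qlora` entry in the config tells mergekit to apply the LoRA adapter to the base model before merging, so no manual adapter-merging step is needed.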