ABX-AI committed
Commit 12d3fb0
1 Parent(s): 89694cf

Create README.md

Files changed (1):
1. README.md (+62, -0)
README.md ADDED
---
base_model:
- ABX-AI/Cerebral-Infinity-7B
- ABX-AI/Spicy-Laymonade-7B
library_name: transformers
tags:
- mergekit
- merge

---
# GGUF / IQ / Imatrix for [Cosmic-Citrus-9B](https://huggingface.com/ABX-AI/Cosmic-Citrus-9B)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65d936ad52eca001fdcd3245/mm8eJBytwElxWw_V1voDW.png)

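If you just want to run one of the quants from this repo, here is a minimal sketch using llama-cpp-python. The filename below is an assumption; substitute whichever .gguf you actually download.

```python
# Minimal sketch: load an imatrix IQ quant with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="Cosmic-Citrus-9B-IQ4_XS-imat.gguf",  # hypothetical filename; use the quant you downloaded
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers to GPU if your build supports it
)

out = llm("Describe this model card in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```
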
**Why Importance Matrix?**

**Importance Matrix**, at least based on my testing, appears to improve the output and performance of "IQ"-type quantizations, where the compression becomes quite heavy.
The **Imatrix** performs a calibration using a provided dataset. Testing has shown that semi-randomized data can help preserve more important segments as the compression is applied.

Related discussions on GitHub:
[[1]](https://github.com/ggerganov/llama.cpp/discussions/5006) [[2]](https://github.com/ggerganov/llama.cpp/discussions/5263#discussioncomment-8395384)

The imatrix.txt file that I used contains general, semi-random data, with some custom kink.

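For reference, a rough sketch of how an imatrix is typically generated and then applied during quantization with llama.cpp. File names, paths, and tool names below are assumptions (older builds ship ./imatrix and ./quantize, newer ones llama-imatrix and llama-quantize), not the exact commands used for this repo.

```python
# Sketch of the imatrix -> IQ quantization flow, assuming a local llama.cpp build
# and an existing fp16 GGUF plus calibration text file.
import subprocess

FP16_GGUF = "Cosmic-Citrus-9B-F16.gguf"  # hypothetical filename
CALIBRATION = "imatrix.txt"              # semi-random calibration data
IMATRIX_OUT = "imatrix.dat"

# 1) Calibrate: run the model over the dataset and record activation importance.
subprocess.run(
    ["./imatrix", "-m", FP16_GGUF, "-f", CALIBRATION, "-o", IMATRIX_OUT],
    check=True,
)

# 2) Quantize, letting the importance matrix guide the heavy IQ compression.
subprocess.run(
    ["./quantize", "--imatrix", IMATRIX_OUT,
     FP16_GGUF, "Cosmic-Citrus-9B-IQ3_XXS-imat.gguf", "IQ3_XXS"],
    check=True,
)
```
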
# Cosmic-Citrus-9B

Another attempt at merging Cerebrum, InfinityRP, LemonadeRP, and Laymonade (all already combined in my previous merges), this time into a 9B that also includes TheSpice.

So far in my tests, it seems to follow my cards in an intriguing way, using refined language and showing more consideration for what the prompt is saying.

## Merge Details

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

### Merge Method

This model was merged using the passthrough merge method, which simply stacks the selected layer ranges from the source models; stacking 20 layers from each 7B model is what brings the result up to roughly 9B parameters.

### Models Merged

The following models were included in the merge:
* [ABX-AI/Cerebral-Infinity-7B](https://huggingface.co/ABX-AI/Cerebral-Infinity-7B)
* [ABX-AI/Spicy-Laymonade-7B](https://huggingface.co/ABX-AI/Spicy-Laymonade-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - model: ABX-AI/Cerebral-Infinity-7B
    layer_range: [0, 20]
- sources:
  - model: ABX-AI/Spicy-Laymonade-7B
    layer_range: [12, 32]
merge_method: passthrough
dtype: float16
```
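
To reproduce a merge like this, here is a minimal sketch assuming mergekit is installed and the YAML above is saved as config.yaml; this is an illustration, not the exact command used for this model.

```python
# Run mergekit on the configuration above via its CLI.
# Assumes `pip install mergekit`; the output directory name is arbitrary.
import subprocess

subprocess.run(
    ["mergekit-yaml", "config.yaml", "./Cosmic-Citrus-9B-merge"],
    check=True,
)
```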