FluffyKaeloky committed on
Commit 63650ce
1 Parent(s): f6a73b5

Update README.md

Files changed (1)
  1. README.md +7 -27
README.md CHANGED
@@ -6,36 +6,16 @@ tags:
  - merge

  ---
- # merge

- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

- ## Merge Details
- ### Merge Method

- This model was merged using the passthrough merge method.

- ### Models Merged

- The following models were included in the merge:
- * ./models/sophosympatheiaMidnight-Miqu-70B-v1.5

- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
- - sources:
-   - model: ./models/sophosympatheiaMidnight-Miqu-70B-v1.5
-     layer_range: [0, 40] # 40
- - sources:
-   - model: ./models/sophosympatheiaMidnight-Miqu-70B-v1.5
-     layer_range: [20, 60] # 40
- - sources:
-   - model: ./models/sophosympatheiaMidnight-Miqu-70B-v1.5
-     layer_range: [40, 80] # 40
- merge_method: passthrough
- dtype: float16
-
- ```
 
+ <div style="width: auto; margin-left: auto; margin-right: auto">
+ <img src="https://i.imgur.com/Tn9MBg6.png" alt="MidnightMiqu" style="width: 100%; min-width: 400px; display: block; margin: auto;">
+ </div>

+ # Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal

+ This is a 4.0bpw EXL2 quant of [FluffyKaeloky/Midnight-Miqu-103B-v1.5](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5).

+ The PIPPA file used for calibration is optimised for roleplay. The measurement file is included in this repository's files if you want to do your own quants.

+ Details about the model and the merge configuration can be found at the fp16 model link above.
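
For running the quant, below is a minimal sketch of loading it with the exllamav2 Python API; the local model path and sampler settings are illustrative assumptions, not values shipped with this repository.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal"  # hypothetical local download path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache so load_autosplit can place layers as it loads
model.load_autosplit(cache)               # spread layers across all visible GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8                # example sampler values only
settings.top_p = 0.9

print(generator.generate_simple("Write a short scene set at midnight.", settings, num_tokens=200))
```

The lazy cache plus `load_autosplit` lets exllamav2 distribute the layers over multiple GPUs, which a model of this size typically requires.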