GGUF
Not-For-All-Audiences
nsfw
Undi95 committed
Commit 1e8c324
1 Parent(s): ff904cb

Update README.md

Files changed (1)
  1. README.md +98 -2
README.md CHANGED
@@ -5,6 +5,102 @@ tags:
   - nsfw
   ---
 
- *slap this shit*
-
- it have +60 layers so bear with me, testing, don't download if you don't know what you do
+ First:
+ ```yaml
+ layer_slices:
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 0
+     end: 16
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 8
+     end: 20
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 17
+     end: 32
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 21
+     end: 40
+ ```
+
+ Inverted:
+ ```yaml
+ layer_slices:
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 0
+     end: 16
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 8
+     end: 20
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 17
+     end: 32
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 21
+     end: 40
+ ```
+
+ Precise:
+ ```yaml
+ layer_slices:
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 0
+     end: 8
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 4
+     end: 12
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 9
+     end: 16
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 13
+     end: 22
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 17
+     end: 24
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 23
+     end: 32
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 25
+     end: 32
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 33
+     end: 40
+ ```
+
+ PreciseInverted:
+ ```yaml
+ layer_slices:
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 0
+     end: 8
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 4
+     end: 12
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 9
+     end: 16
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 13
+     end: 22
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 17
+     end: 24
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 23
+     end: 32
+   - model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
+     start: 25
+     end: 32
+   - model: Undi95/MLewd-L2-Chat-13B
+     start: 33
+     end: 40
+ ```
+
+ Part1 = ReMM v2.1 merged with MLewd at a low weight to keep consistency. I call this "dilution": the result stays consistent and coherent, without repetition or looping, aside from the small amount of duplicated data. (A rough sketch of this kind of low-weight blend follows the diff.)
+
+ The goal is to find the best way to interlace the layers so that the result hits a sweet spot between 13B and 30B+.
+
+ Normal (the first config) and Inverted interleave chunks of 16 layers; Precise and PreciseInverted interleave chunks of 8 layers. (A short layer-count sketch follows the diff.)
+
+ All the resulting models are made of 64(+1) layers. Still needs testing.
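
As a quick illustration of the interleaving above (an editor's aside, not part of the commit): the short Python sketch below tallies how many layers each slice of the First/Normal config contributes and the resulting stack depth. It assumes `start`/`end` are end-exclusive; if the merge tool treats `end` as inclusive, every count shifts by one.

```python
# Minimal sketch (not the author's merge tooling): tally the layers contributed
# by each slice of the "First"/Normal interleave and the total stacked depth.
# Assumption: `end` is exclusive. An inclusive `end` would add 1 per slice.
normal_slices = [
    ("Undi95/MLewd-L2-Chat-13B",             0, 16),
    ("Undi95/MLewd-ReMM-L2-Chat-20B-Part1",  8, 20),
    ("Undi95/MLewd-L2-Chat-13B",            17, 32),
    ("Undi95/MLewd-ReMM-L2-Chat-20B-Part1", 21, 40),
]

total = 0
for model, start, end in normal_slices:
    count = end - start
    total += count
    print(f"{model}: layers [{start}, {end}) -> {count} layers")
print(f"stacked depth: {total} layers")
```

The same arithmetic applies to Inverted (the two source models swapped) and, with 8-layer chunks, to Precise/PreciseInverted.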
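
The "dilution" mentioned in the new README (ReMM v2.1 merged with MLewd at a low weight) is, as far as the card states, a low-weight blend of two checkpoints. Below is a rough, hypothetical sketch of such a blend as a per-tensor linear interpolation; the `dilute` name, the 0.1 weight, and the raw state-dict interface are illustrative assumptions, not the author's actual recipe or tooling.

```python
import torch

def dilute(base_state: dict, flavor_state: dict, weight: float = 0.1) -> dict:
    """Blend a small fraction of `flavor_state` into `base_state`, tensor by tensor.

    The 0.1 default is a placeholder, not the weight used for Part1. Tensors
    missing from `flavor_state` or with mismatched shapes are left untouched.
    """
    merged = {}
    for name, base in base_state.items():
        flavor = flavor_state.get(name)
        if flavor is not None and flavor.shape == base.shape:
            merged[name] = (1.0 - weight) * base + weight * flavor
        else:
            merged[name] = base.clone()
    return merged

if __name__ == "__main__":
    # Tiny smoke test with random stand-ins for real checkpoint weights.
    a = {"w": torch.ones(2, 2)}
    b = {"w": torch.zeros(2, 2)}
    print(dilute(a, b, weight=0.1)["w"])  # tensor of 0.9s
```

In practice the two models' state dicts would be loaded, blended this way, and saved as the Part1 base used in the layer_slices configs above.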