Text Generation
Transformers
GGUF
English
mergekit
Mixture of Experts
mixture of experts
Merge
4x8B
Llama3 MOE
creative
creative writing
fiction writing
plot generation
sub-plot generation
story generation
scene continue
storytelling
fiction story
science fiction
romance
all genres
story
writing
vivid prosing
vivid writing
fiction
roleplaying
bfloat16
swearing
rp
horror
Inference Endpoints
conversational
Update README.md
README.md CHANGED

@@ -38,9 +38,11 @@ tags:
 pipeline_tag: text-generation
 ---
 
+(examples to be added)
+
 <B><font color="red">WARNING:</font> NSFW. Vivid prose. INTENSE. Visceral Details. Violence. HORROR. GORE. Swearing. UNCENSORED... humor, romance, fun. </B>
 
-<h2>L3-
+<h2>L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B</h2>
 
 <img src="dark-p-infinite.jpg" style="float:right; width:300px; height:300px; padding:10px;">
 
@@ -96,14 +98,6 @@ Example outputs below.
 
 This model is comprised of the following 4 models ("the experts") (in full):
 
-[ https://huggingface.co/DavidAU/L3-Dark-Planet-8B-GGUF ]
-
-[ https://huggingface.co/DavidAU/L3-Dark-Planet-8B-V2-Eight-Orbs-Of-Power-GGUF ]
-
-[ https://huggingface.co/DavidAU/L3-Dark-Planet-Ring-World-8B-F32-GGUF ]
-
-[ https://huggingface.co/DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B-GGUF ]
-
 The mixture of experts is set at 2 experts, but you can use 3 or 4 too.
 
 This "team" has a Captain (first listed model), and then all the team members contribute to the to "token"
@@ -148,6 +142,14 @@ Special credit goes to MERGEKIT, without you this project / model would not have
 
 [ https://github.com/arcee-ai/mergekit ]
 
+Special thanks to Team "Mradermacher":
+
+They saved me a tonne of time uploading the quants and created IMATRIX quants too.
+
+IMATRIX GGUFS:
+
+[ https://huggingface.co/mradermacher/L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B-i1-GGUF ]
+
 <B>Special Operations Notes for this MOE model:</B>
 
 Because of how this "MOE" model is configured, even though the default is 2 experts, the "selected" 2 will vary during generation.
@@ -208,6 +210,13 @@ This repo contains regular quants and 3 "ARM" quants (format "...Q4_x_x_x.gguf")
 
 For more information on quants, quants choices, and LLM/AI apps to "run" quants see the section below: "Highest Quality Settings..."
 
+Special thanks to Team "Mradermacher":
+
+They saved me a tonne of time uploading the quants and created IMATRIX quants too.
+
+IMATRIX GGUFS:
+
+[ https://huggingface.co/mradermacher/L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B-i1-GGUF ]
+
 
 <B>Template:</B>
 
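The card's note that "the mixture of experts is set at 2 experts, but you can use 3 or 4 too" refers to top-k expert routing: for every generated token, a router scores all 4 expert models and only the k highest-scoring ones contribute, which is why the "selected 2 will vary during generation". A minimal, generic sketch of that selection step (this is not DavidAU's or mergekit's actual code; the function name and logit values are illustrative):

```python
import numpy as np

def route_token(router_logits: np.ndarray, experts_used: int = 2) -> list[int]:
    """Return the indices of the top-k experts chosen for one token.

    Generic top-k MoE routing sketch; the 4 logits stand in for the
    4 "expert" models listed in the card, with k = 2 by default
    (3 or 4 also work, per the card).
    """
    order = np.argsort(router_logits)[::-1]       # expert indices, best first
    return sorted(order[:experts_used].tolist())  # the "team" for this token

# One token's hypothetical router scores for the 4 experts:
logits = np.array([1.2, -0.3, 2.5, 0.7])
print(route_token(logits))     # -> [0, 2]
print(route_token(logits, 3))  # -> [0, 2, 3]
```

Because the expert count is a runtime routing parameter rather than baked into the weights, llama.cpp-based apps can usually change it at load time without re-quantizing (for example via a metadata override on the GGUF's `expert_used_count` key; exact key name and flag depend on the app and model architecture).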