Update README.md
README.md CHANGED
@@ -9,11 +9,23 @@ tags:
 - gptq
 ---
 
-13B-Chimera
-[] = applied as LoRA
-() = combined as composite model
+### 13B-Chimera
 
-
+
+[] = applied as LoRA to a composite model | () = combined as composite models
+
+((MantiCore3E+VicunaCocktail)+[SuperCOT+[StorytellingV2+(SuperHOTProtoType-8192ctx+Metharme)]])
+
+This model is the result of an experimental use of LoRAs on language models and model merges that are not the base HuggingFace-format LLaMA model they were intended for.
+
+This experiment is to determine:
+
+The outcomes of applying Low-Rank Adapters (LoRAs) in unconventional ways.
+Whether applying and stacking LoRAs onto merged models bypasses the zero-sum result of weight-sum model merging.
+
+The desired result is to additively apply desired features without paradoxically watering down a model's effective behavior.
+
+Credits for the language models and LoRAs used:
 
 manticore-13b [Epoch3] by openaccess-ai-collective
 
@@ -39,6 +51,8 @@ Metharme 13b by PygmalionAI
 
 https://huggingface.co/PygmalionAI/metharme-13b
 
+Also thanks to Meta for LLaMA.
+
 Each model and LoRA was hand picked and considered for what it could contribute to this ensemble.
 Thanks to each and every one of you for your incredible work developing some of the best things
-
+to come out of this community.
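
The "zero-sum" concern named in the README is a property of weight-sum merging itself: the interpolation coefficients must sum to 1, so every point of weight given to one parent model is taken away from the others. A minimal sketch of such a merge (PyTorch; the two toy checkpoints and their single `w` tensor are illustrative, not the actual models credited above):

```python
import torch

def weight_sum_merge(state_dicts, alphas):
    """Weighted-sum merge: merged = sum(alpha_i * theta_i).

    Because the alphas sum to 1, boosting one parent's contribution
    necessarily dilutes every other parent's -- the zero-sum trade-off
    the README hopes stacked LoRAs can sidestep.
    """
    assert abs(sum(alphas) - 1.0) < 1e-6, "interpolation weights must sum to 1"
    return {
        key: sum(a * sd[key] for a, sd in zip(alphas, state_dicts))
        for key in state_dicts[0]
    }

# Two toy "models", each with one distinctive feature.
model_a = {"w": torch.tensor([1.0, 0.0])}
model_b = {"w": torch.tensor([0.0, 1.0])}

merged = weight_sum_merge([model_a, model_b], [0.5, 0.5])
print(merged["w"])  # tensor([0.5000, 0.5000]) -- each feature at half strength
```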
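A LoRA, by contrast, adds a low-rank delta (W' = W + BA) on top of whatever weights are already present rather than interpolating toward another checkpoint, which is why the recipe applies adapters after merging. A sketch of one `()`-then-`[]` step using the Hugging Face PEFT library; the local paths are placeholders standing in for a merged composite and a LoRA, not the exact artifacts listed in the recipe:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# "()" step already done: load a composite model produced by a weight-sum merge.
base = AutoModelForCausalLM.from_pretrained("path/to/merged-composite")

# "[]" step: apply a LoRA on top of the merged weights. The adapter's
# low-rank update is added to W rather than averaged into it.
model = PeftModel.from_pretrained(base, "path/to/storytelling-lora")

# Fold the adapter into the weights so the next LoRA in the stack
# can be applied the same way.
model = model.merge_and_unload()
```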