---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B/blob/main/LICENSE
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

**NousResearch/Nous-Capybara-34B**, **migtissera/Tess-M-v1.3** and **bhenrym14/airoboros-3_1-yi-34b-200k** merged with a new, experimental implementation of "dare ties" via mergekit. See:

> Language Models are Super Mario: Absorbing Abilities from Homologous Models as a Free Lunch

https://github.com/yule-BUAA/MergeLM

https://github.com/cg123/mergekit/tree/dare
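
For intuition: DARE randomly drops each fine-tune's delta parameters and rescales the survivors so the expected delta is unchanged, and the ties step then resolves sign conflicts between models. A minimal PyTorch sketch of that idea, not mergekit's actual implementation (function names and the sign-election details are illustrative):

```
import torch

def dare_delta(finetuned: torch.Tensor, base: torch.Tensor,
               density: float) -> torch.Tensor:
    """Drop-And-REscale: keep each delta entry with probability `density`
    (drop probability p = 1 - density), rescaling survivors by 1/density
    so the expected value of the delta is preserved."""
    delta = finetuned - base                               # task vector
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

def dare_ties(base, finetuned, weights, densities):
    """Illustrative dare_ties combination: DARE each delta, elect a
    per-parameter majority sign, and sum the weighted deltas that agree."""
    deltas = [w * dare_delta(m, base, d)
              for m, w, d in zip(finetuned, weights, densities)]
    stacked = torch.stack(deltas)
    sign = stacked.sum(dim=0).sign()                       # sign election
    agree = stacked.sign() == sign                         # keep agreeing deltas
    return base + (stacked * agree).sum(dim=0)
```

With `density: 0.50`, roughly half of each model's deltas are dropped before the merge; that is what the per-model density values in the config below control.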

***

Merged with the following config, and the tokenizer from Yi Llamafied:

```
models:
  - model: /home/alpha/Storage/Models/Raw/chargoddard_Yi-34B-200K-Llama
    # no parameters necessary for base model
  - model: /home/alpha/Storage/Models/Raw/migtissera_Tess-M-v1.3
    parameters:
      weight: 0.41
      density: 0.50
  - model: /home/alpha/Storage/Models/Raw/bhenrym14_airoboros-3_1-yi-34b-200k
    parameters:
      weight: 0.18
      density: 0.46
  - model: /home/alpha/Storage/Models/Raw/Nous-Capybara-34B
    parameters:
      weight: 0.41
      density: 0.50
merge_method: dare_ties
base_model: /home/alpha/Storage/Models/Raw/chargoddard_Yi-34B-200K-Llama
parameters:
  int8_mask: true
dtype: bfloat16
```

In testing, dare_ties produces better perplexity than a regular ties merge with the same merge configuration. Model weights that sum to one (here 0.41 + 0.18 + 0.41 = 1.00) also seem optimal. And subjectively, results seem a bit better than the previous dare merge with Tess 1.2, though that is hard to pin down.
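
For reference, this is roughly how such a perplexity comparison can be run with transformers; the model id and evaluation file are placeholders, and chunked mean-of-means perplexity is an approximation, not the exact evaluation used here:

```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # placeholder: the merge under test
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

text = open("eval.txt").read()  # placeholder evaluation text
ids = tokenizer(text, return_tensors="pt").input_ids.to(model.device)

max_len = 4096
nlls = []
for begin in range(0, ids.size(1) - 1, max_len):
    chunk = ids[:, begin : begin + max_len]
    if chunk.size(1) < 2:
        break
    with torch.no_grad():
        # labels == input_ids gives the mean next-token NLL for the chunk
        nlls.append(model(chunk, labels=chunk).loss)

print("perplexity:", torch.exp(torch.stack(nlls).mean()).item())
```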

I chose not to include other finetunes, such as Dolphin, because they aren't trained on the 200K base. If any other 200K finetunes pop up, let me know.

***

## Prompt template: Orca-Vicuna

```
SYSTEM: {system_message}
USER: {prompt}
ASSISTANT:
```

As with other Yi models, try disabling the BOS token and/or running a lower temperature with MinP if the output doesn't seem right.
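
A minimal generation sketch following that advice; the model path is a placeholder, `add_special_tokens=False` is one way to drop the BOS token, and `min_p` needs a transformers version that supports it:

```
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # placeholder for this merge
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = (
    "SYSTEM: You are a helpful assistant.\n"
    "USER: Summarize the DARE merging method in two sentences.\n"
    "ASSISTANT:"
)

# add_special_tokens=False drops the BOS token, per the note above
inputs = tokenizer(prompt, return_tensors="pt",
                   add_special_tokens=False).to(model.device)

out = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,  # lower temperature, as suggested above
    min_p=0.05,       # requires a transformers version with min_p support
)
print(tokenizer.decode(out[0][inputs.input_ids.size(1):],
                       skip_special_tokens=True))
```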

Sometimes the model "spells out" the stop token as `</s>` like Capybara does, so you may need to add `</s>` as an additional stopping condition. It may also respond to the llama-2 chat format.
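
If your frontend doesn't support plain string stops, one possible workaround in transformers, reusing `model`, `tokenizer`, and `inputs` from the sketch above (the class here is illustrative):

```
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnText(StoppingCriteria):
    """Stop generation once the newly generated text contains a literal string."""

    def __init__(self, tokenizer, stop_text: str, prompt_len: int):
        self.tokenizer = tokenizer
        self.stop_text = stop_text
        self.prompt_len = prompt_len

    def __call__(self, input_ids, scores, **kwargs) -> bool:
        new_text = self.tokenizer.decode(input_ids[0][self.prompt_len:])
        return self.stop_text in new_text

stops = StoppingCriteriaList(
    [StopOnText(tokenizer, "</s>", inputs.input_ids.size(1))]
)
out = model.generate(**inputs, max_new_tokens=256, stopping_criteria=stops)
```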

***

Credits:

https://github.com/cg123/mergekit/tree/dare

https://huggingface.co/NousResearch/Nous-Capybara-34B/

https://huggingface.co/bhenrym14/airoboros-3_1-yi-34b-200k

https://huggingface.co/migtissera/Tess-M-v1.3

https://huggingface.co/larryvrh/Yi-34B-200K-Llamafied

https://huggingface.co/01-ai/Yi-34B-200K
|