brucethemoose committed
Commit 6af9d60
1 Parent(s): 886bb74

Update README.md

Files changed (1)
  1. README.md +39 -0
README.md CHANGED
@@ -3,3 +3,42 @@ license: other
  license_name: yi-license
  license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
  ---
+ Just a test of a very high-density DARE TIES merge, for benchmarking on the Open LLM Leaderboard. Config:
+
+ ```
+ models:
+   - model: /home/alpha/Storage/Models/Raw/chargoddard_Yi-34B-200K-Llama
+     # no parameters necessary for base model
+   - model: /home/alpha/Storage/Models/Raw/migtissera_Tess-34B-v1.4
+     parameters:
+       weight: 0.19
+       density: 0.83
+   - model: /home/alpha//Storage/Models/Raw/bhenrym14_airoboros-3_1-yi-34b-200k
+     parameters:
+       weight: 0.14
+       density: 0.6
+   - model: /home/alpha/Storage/Models/Raw/Nous-Capybara-34B
+     parameters:
+       weight: 0.19
+       density: 0.83
+   - model: /home/alpha/Storage/Models/Raw/kyujinpy_PlatYi-34B-200K-Q
+     parameters:
+       weight: 0.14
+       density: 0.6
+   - model: /home/alpha/FastModels/ehartford_dolphin-2.2-yi-34b-200k
+     parameters:
+       weight: 0.19
+       density: 0.83
+   - model: /home/alpha/FastModels/fblgit_una-xaberius-34b-v1beta
+     parameters:
+       weight: 0.15
+       density: 0.08
+ merge_method: dare_ties
+ base_model: /home/alpha/Storage/Models/Raw/chargoddard_Yi-34B-200K-Llama
+ parameters:
+
+   int8_mask: true
+ dtype: bfloat16
+ ```
+
+ See the main model card: https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
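
The block above is a mergekit YAML config. As a rough illustration only, here is a minimal sketch of how such a config could be run, assuming mergekit's documented Python API (`MergeConfiguration`, `MergeOptions`, `run_merge`); the file name `config.yml` and the output directory are placeholders, not the paths used for this merge:

```
# Minimal sketch (not the exact commands used for this model): run the merge
# config above with mergekit. Assumes the YAML is saved as "config.yml" and
# that mergekit's Python API (MergeConfiguration, MergeOptions, run_merge)
# is available in the installed version; the output path is a placeholder.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged-yi-34b-200k",       # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),    # use a GPU for the merge if one is present
        copy_tokenizer=True,               # copy the base model's tokenizer into the output
        lazy_unpickle=True,                # lower peak memory while reading shards
    ),
)
```

The equivalent command-line invocation would be along the lines of `mergekit-yaml config.yml ./merged-yi-34b-200k --cuda`.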