bcse committed
Commit dce86fa
1 Parent(s): 239418b

Upload folder using huggingface_hub

Files changed (1): README.md (+73, -0)
README.md ADDED
---
base_model:
- ShinojiResearch/Senku-70B-Full
- Sao10K/Euryale-1.3-L2-70B
library_name: transformers
tags:
- mergekit
- merge
---

# Bernstein-120b

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method, which builds each output tensor as a weighted average of the corresponding tensors from the input models (sketched in the example below).

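For intuition, here is a minimal sketch of what a linear merge computes per tensor. The `linear_merge` helper and toy tensors are illustrative assumptions, not mergekit's actual internals:

```python
# Toy illustration of linear (weighted-average) merging.
# mergekit applies the same idea tensor-by-tensor across full checkpoints.
import torch

def linear_merge(tensors, weights):
    """Weighted average of same-shaped parameter tensors."""
    return sum(w * t for w, t in zip(weights, tensors)) / sum(weights)

layer_a = torch.randn(4, 4)  # stand-in for one weight matrix from model A
layer_b = torch.randn(4, 4)  # the corresponding matrix from model B

# With weights [1.0, 0.0], model B contributes nothing -- the same trick the
# configuration below uses to pin the first and last layers to a single model.
merged = linear_merge([layer_a, layer_b], [1.0, 0.0])
assert torch.allclose(merged, layer_a)
```
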
### Models Merged

The following models were included in the merge:
* [ShinojiResearch/Senku-70B-Full](https://huggingface.co/ShinojiResearch/Senku-70B-Full)
* [Sao10K/Euryale-1.3-L2-70B](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B)

### Configuration

The following YAML configuration was used to produce this model. Its interleaved slices stack 140 decoder layers drawn from the two 80-layer 70B bases, which is what puts the result at roughly 120B parameters:

```yaml
merge_method: linear
parameters:
  weight: 1.0
slices:
- sources:
  - model: ShinojiResearch/Senku-70B-Full
    layer_range: [0, 1]
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [0, 1]
    parameters:
      weight: 0
- sources:
  - model: ShinojiResearch/Senku-70B-Full
    layer_range: [1, 20]
- sources:
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [10, 30]
- sources:
  - model: ShinojiResearch/Senku-70B-Full
    layer_range: [20, 40]
- sources:
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [30, 50]
- sources:
  - model: ShinojiResearch/Senku-70B-Full
    layer_range: [40, 60]
- sources:
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [50, 70]
- sources:
  - model: ShinojiResearch/Senku-70B-Full
    layer_range: [60, 79]
- sources:
  - model: ShinojiResearch/Senku-70B-Full
    layer_range: [79, 80]
  - model: Sao10K/Euryale-1.3-L2-70B
    layer_range: [79, 80]
    parameters:
      weight: 0
dtype: float16
tokenizer_source: model:ShinojiResearch/Senku-70B-Full
```
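
To reproduce the merge, the YAML above can be saved (e.g. as `bernstein.yml`, a file name chosen here for illustration) and passed to mergekit's `mergekit-yaml bernstein.yml ./output-dir` entry point. Loading the result then looks like any other causal LM. The sketch below assumes the repo id `bcse/Bernstein-120b` (inferred from the card title, not confirmed) and enough memory for roughly 240 GB of float16 weights:

```python
# Minimal loading sketch. The repo id is an assumption inferred from the card
# title; substitute the model's actual Hugging Face path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bcse/Bernstein-120b"  # assumed, not confirmed by this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's dtype: float16
    device_map="auto",          # shard the ~140-layer stack across available GPUs
)

inputs = tokenizer("The linear merge method works by", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```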