Sumail committed
Commit aaf69de
1 parent: 5dc70d2

Upload folder using huggingface_hub

Files changed (3):
  1. README.md (+15 -15)
  2. mergekit_config.yml (+14 -13)
  3. model-00001-of-00001.safetensors (+1 -1)
README.md CHANGED
@@ -15,12 +15,11 @@ This is a merge of pre-trained language models created using [mergekit](https://
 ## Merge Details
 ### Merge Method
 
-This model was merged using the SLERP merge method.
+This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method using [abc88767/2c75](https://huggingface.co/abc88767/2c75) as a base.
 
 ### Models Merged
 
 The following models were included in the merge:
-* [abc88767/2c75](https://huggingface.co/abc88767/2c75)
 * [rwh/s1_1712781243](https://huggingface.co/rwh/s1_1712781243)
 
 ### Configuration
@@ -29,21 +28,22 @@ The following YAML configuration was used to produce this model:
 
 ```yaml
-slices:
-- sources:
-  - model: rwh/s1_1712781243
-    layer_range: [0, 24]
-  - model: abc88767/2c75
-    layer_range: [0, 24]
-merge_method: slerp
+models:
+  - model: abc88767/2c75
+    # no parameters necessary for base model
+  - model: abc88767/2c75
+    parameters:
+      density: 0.5
+      weight: 0.5
+  - model: rwh/s1_1712781243
+    parameters:
+      density: 0.5
+      weight: 0.3
+merge_method: ties
 base_model: abc88767/2c75
 parameters:
-  t:
-    - filter: self_attn
-      value: [0, 0.5, 0.3, 0.7, 1]
-    - filter: mlp
-      value: [1, 0.5, 0.7, 0.3, 0]
-    - value: 0.5
+  normalize: true
 dtype: bfloat16
+
 ```
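The new configuration switches the merge from SLERP to TIES, with per-model `density` and `weight` parameters and `normalize: true`. The core TIES steps that these parameters control (trim each task vector to its largest-magnitude fraction `density`, elect a majority sign per entry, then take a sign-consistent weighted average) can be sketched as a toy, pure-Python illustration. This is not mergekit's actual implementation — real merges operate on full model tensors — and the example vectors below are made up:

```python
# Toy sketch of the TIES merge steps (Yadav et al., 2023) as parameterized
# by the config above: trim by `density`, elect a per-entry sign, then
# average only sign-agreeing contributions. Illustrative values only.

def trim(delta, density):
    """Zero out all but the largest-magnitude fraction `density` of entries."""
    k = max(1, round(len(delta) * density))
    keep = sorted(range(len(delta)), key=lambda i: abs(delta[i]), reverse=True)[:k]
    return [d if i in keep else 0.0 for i, d in enumerate(delta)]

def ties_merge(base, deltas, weights, density, normalize=True):
    """Merge task vectors (finetuned model minus base) TIES-style onto `base`."""
    trimmed = [trim(d, density) for d in deltas]
    merged = []
    for i in range(len(base)):
        # Elect the dominant sign from the weighted contributions at index i.
        total = sum(w * d[i] for w, d in zip(weights, trimmed))
        sign = 1.0 if total >= 0 else -1.0
        # Average only the contributions that agree with the elected sign.
        agree = [(w, d[i]) for w, d in zip(weights, trimmed) if d[i] * sign > 0]
        if agree:
            num = sum(w * v for w, v in agree)
            den = sum(w for w, _ in agree) if normalize else 1.0
            merged.append(base[i] + num / den)
        else:
            merged.append(base[i])
    return merged

# Two hypothetical task vectors with the weights from the config (0.5, 0.3).
result = ties_merge(
    base=[0.0, 0.0],
    deltas=[[1.0, -2.0], [3.0, 0.5]],
    weights=[0.5, 0.3],
    density=1.0,
)
```

With `normalize: true`, sign-agreeing contributions are renormalized by their summed weights, so a lone surviving contribution keeps its full magnitude rather than being scaled down by the absent, disagreeing models.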
mergekit_config.yml CHANGED
@@ -1,17 +1,18 @@
-slices:
-- sources:
-  - model: rwh/s1_1712781243
-    layer_range: [0, 24]
-  - model: abc88767/2c75
-    layer_range: [0, 24]
-merge_method: slerp
+models:
+  - model: abc88767/2c75
+    # no parameters necessary for base model
+  - model: abc88767/2c75
+    parameters:
+      density: 0.5
+      weight: 0.5
+  - model: rwh/s1_1712781243
+    parameters:
+      density: 0.5
+      weight: 0.3
+merge_method: ties
 base_model: abc88767/2c75
 parameters:
-  t:
-    - filter: self_attn
-      value: [0, 0.5, 0.3, 0.7, 1]
-    - filter: mlp
-      value: [1, 0.5, 0.7, 0.3, 0]
-    - value: 0.5
+  normalize: true
 dtype: bfloat16
+
model-00001-of-00001.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:339df45ee76ecd981d85a14b75e1f3e037d2c55f180322cb4ebb17aba535880c
+oid sha256:dc06da6673c61351a2a8895c6d9220531c36275c9e455d2dd25478168d0d2090
 size 3289069520
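The safetensors file is stored via Git LFS, so the diff above changes only the pointer file: the `oid sha256:` line is the SHA-256 digest of the actual weights blob. A download can be checked against the pointer with a short stdlib-only sketch (file paths here are hypothetical):

```python
# Sketch: verify a downloaded file against the sha256 oid recorded in a
# git-lfs pointer file, like the safetensors pointer in the diff above.
import hashlib

def lfs_oid(pointer_text: str) -> str:
    """Extract the sha256 oid from the contents of a git-lfs pointer file."""
    for line in pointer_text.splitlines():
        if line.startswith("oid sha256:"):
            return line.split("sha256:", 1)[1].strip()
    raise ValueError("no sha256 oid found in pointer")

def file_sha256(path: str, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()
```

Comparing `file_sha256("model-00001-of-00001.safetensors")` to `lfs_oid(...)` of the pointer confirms the blob matches the commit's recorded digest.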