shyamieee committed
Commit caf70bd
1 Parent(s): d8212a0

Update README.md

Files changed (1)
  1. README.md +3 -23
README.md CHANGED
@@ -4,7 +4,7 @@ library_name: transformers
 tags:
 - mergekit
 - merge
-
+license: apache-2.0
 ---
 # padma_v7_folder

@@ -13,26 +13,6 @@ This is a merge of pre-trained language models created using [mergekit](https://
 ## Merge Details
 ### Merge Method

-This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method using models/NeuralCeptrix-7B-slerp as a base.
-
-### Models Merged
-
-The following models were included in the merge:
-* models/NeuralOmniWestBeaglake-7B
-
-### Configuration
-
-The following YAML configuration was used to produce this model:
+This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method.

-```yaml
-models:
-  - model: models/NeuralCeptrix-7B-slerp
-    parameters:
-      weight: 0.4
-  - model: models/NeuralOmniWestBeaglake-7B
-    parameters:
-      weight: 0.6
-base_model: models/NeuralCeptrix-7B-slerp
-merge_method: task_arithmetic
-dtype: bfloat16
-```
+
+### Models Merged
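
For context, the configuration removed above is a standard mergekit task-arithmetic setup: each non-base model contributes a weighted task vector (its parameter delta from the base), so the merged weights are roughly `base + sum_i w_i * (model_i - base)`. Below is a minimal sketch of how such a config is typically applied, assuming the `mergekit-yaml` CLI entry point from the mergekit package; the paths and flags shown are illustrative and not part of this commit.

```sh
# Install mergekit and save the YAML removed above as config.yml.
pip install mergekit

# Run the merge; the second argument is the output directory for the merged model.
# --copy-tokenizer copies the base model's tokenizer into the output directory.
mergekit-yaml config.yml ./padma_v7_folder --copy-tokenizer
```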