cookinai committed
Commit 3432d47
1 Parent(s): 649ff4a

Update README.md

Files changed (1): README.md (+13, -12)
README.md CHANGED
@@ -6,20 +6,21 @@ tags:
 - merge

 ---
-# merge-output
+# OrcaHermes-Mistral-70B

-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
-## Merge Details
-### Merge Method
-
-This model was merged using the SLERP merge method.
+This model was created by SLERP-merging two Miqu models trained on two high-performing datasets.
+Just an experiment; there have not been many Miqu SLERP merges yet.

 ### Models Merged

 The following models were included in the merge:
-* D:\\ai\\merges\\transformers\\models\\miquhermes\\miqu-openhermes-full
-* D:\\ai\\merges\\transformers\\models\\senku\\Senku-70B-Full
+
+[Miqu-Openhermes](https://huggingface.co/alicecomfy/miqu-openhermes-full)
+- Base Miqu trained on [OpenHermes](https://huggingface.co/datasets/teknium/OpenHermes-2.5)
+
+[ShinojiResearch/Senku-70B-Full](https://huggingface.co/ShinojiResearch/Senku-70B-Full)
+- Base Miqu trained on [SlimOrca](https://huggingface.co/datasets/Open-Orca/SlimOrca)
+

 ### Configuration

@@ -28,12 +29,12 @@ The following YAML configuration was used to produce this model:
 ```yaml
 slices:
 - sources:
-  - model: D:\\ai\\merges\\transformers\\models\\senku\\Senku-70B-Full
+  - model: local//path//to//Senku-70B-Full
     layer_range: [0, 80]
-  - model: D:\\ai\\merges\\transformers\\models\\miquhermes\\miqu-openhermes-full
+  - model: local//path//to//miqu-openhermes-full
     layer_range: [0, 80]
 merge_method: slerp
-base_model: D:\\ai\\merges\\transformers\\models\\senku\\Senku-70B-Full
+base_model: local//path//to//Senku-70B-Full
 parameters:
   t:
   - filter: self_attn
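For context on the hunk above: in a mergekit SLERP config, `t` is the interpolation factor between the two models (0 keeps the base model, 1 the other), and `filter` entries such as `self_attn` assign different `t` values to matching groups of tensors. The diff truncates the YAML at the filter list; the rest of the file is not shown here. A config like this is normally passed to mergekit's `mergekit-yaml` command, e.g. `mergekit-yaml config.yml ./output-model-directory`; the exact invocation used for this commit is an assumption, since only the config appears in the diff.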
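For readers new to the method named in the README: SLERP (spherical linear interpolation) blends two weight tensors along the great-circle arc between them rather than along a straight line, which preserves their norms better than plain averaging. Below is a minimal illustrative sketch, not mergekit's actual implementation; the function name, the colinearity fallback, and the example tensors are all assumptions for demonstration.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the
    great-circle arc between the two flattened, normalized tensors.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_norm = a_flat / (a_flat.norm() + eps)
    b_norm = b_flat / (b_flat.norm() + eps)
    # Angle between the two tensors, clamped for numerical safety.
    omega = torch.acos((a_norm * b_norm).sum().clamp(-1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation.
        return (1.0 - t) * a + t * b
    coef_a = torch.sin((1.0 - t) * omega) / so
    coef_b = torch.sin(t * omega) / so
    return (coef_a * a_flat + coef_b * b_flat).reshape(a.shape).to(a.dtype)

# Hypothetical usage: blend one layer's attention weights at t = 0.5.
merged = slerp(0.5, torch.randn(128, 128), torch.randn(128, 128))
```

The per-filter `t` lists in the YAML (such as the `self_attn` filter shown in the hunk) would correspond to applying an interpolation like this with a different `t` for each matching group of tensors.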