automerger committed
Commit efcae9a
1 parent: 7a77c24

Upload folder using huggingface_hub

Files changed (1):
1. README.md (+8, -23)
README.md CHANGED
@@ -5,38 +5,23 @@ tags:
 - mergekit
 - lazymergekit
 - automerger
-base_model:
-- mahiatlinux/ShadowM7EXP-7B
-- MiniMoog/Mergerix-7b-v0.3
 ---
 
 # Shadowm7expMergerix-7B
 
 Shadowm7expMergerix-7B is an automated merge created by [Maxime Labonne](https://huggingface.co/mlabonne) using the following configuration.
-* [mahiatlinux/ShadowM7EXP-7B](https://huggingface.co/mahiatlinux/ShadowM7EXP-7B)
-* [MiniMoog/Mergerix-7b-v0.3](https://huggingface.co/MiniMoog/Mergerix-7b-v0.3)
 
 ## 🧩 Configuration
 
 ```yaml
-slices:
-  - sources:
-      - model: mahiatlinux/ShadowM7EXP-7B
-        layer_range: [0, 32]
-      - model: MiniMoog/Mergerix-7b-v0.3
-        layer_range: [0, 32]
-merge_method: slerp
-base_model: mahiatlinux/ShadowM7EXP-7B
-parameters:
-  t:
-    - filter: self_attn
-      value: [0, 0.5, 0.3, 0.7, 1]
-    - filter: mlp
-      value: [1, 0.5, 0.7, 0.3, 0]
-    - value: 0.5
-dtype: bfloat16
-random_seed: 0
-```
+models:
+  - model: mistralai/Mistral-7B-v0.1
+  - model: mahiatlinux/ShadowM7EXP-7B
+  - model: MiniMoog/Mergerix-7b-v0.3
+merge_method: model_stock
+base_model: mistralai/Mistral-7B-v0.1
+dtype: bfloat16
+```
 
 ## 💻 Usage
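
The change replaces the earlier slerp interpolation of the two fine-tunes with a model_stock merge that averages them around a mistralai/Mistral-7B-v0.1 base. As a minimal sketch of how the updated configuration could be executed (not part of the commit; it assumes mergekit is installed and picks an arbitrary output directory):

```python
# Sketch: write the new model_stock config to disk and run it with
# mergekit's CLI. Assumes `pip install mergekit`; the output path is arbitrary.
import subprocess

CONFIG = """\
models:
  - model: mistralai/Mistral-7B-v0.1
  - model: mahiatlinux/ShadowM7EXP-7B
  - model: MiniMoog/Mergerix-7b-v0.3
merge_method: model_stock
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
"""

with open("config.yaml", "w") as f:
    f.write(CONFIG)

# mergekit-yaml <config> <output-dir>; --copy-tokenizer copies the base
# model's tokenizer into the merged checkpoint.
subprocess.run(
    ["mergekit-yaml", "config.yaml", "merge", "--copy-tokenizer"],
    check=True,
)
```

Unlike slerp, model_stock needs no hand-tuned per-layer interpolation schedule, which is why the new configuration is so much shorter than the one it replaces.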
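
The diff view truncates at the start of the Usage section. For completeness, here is a minimal generation sketch with transformers, assuming the merged model is published under the hypothetical repo id automerger/Shadowm7expMergerix-7B:

```python
# Sketch: load and query the merged model. The repo id below is an
# assumption; requires `transformers`, `torch`, and `accelerate`.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "automerger/Shadowm7expMergerix-7B"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",           # needs accelerate installed
)

inputs = tokenizer("What is model merging?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```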