automerger committed

Commit fe79c94
Parent: e91de17

Upload folder using huggingface_hub

Files changed (1): README.md (+8 -23)

README.md CHANGED
@@ -5,38 +5,23 @@ tags:
 - mergekit
 - lazymergekit
 - automerger
-base_model:
-- automerger/YamshadowExperiment28-7B
-- nlpguy/T3QM7XP
 ---
 
 # Yamshadowexperiment28T3qm7xp-7B
 
 Yamshadowexperiment28T3qm7xp-7B is an automated merge created by [Maxime Labonne](https://huggingface.co/mlabonne) using the following configuration.
-* [automerger/YamshadowExperiment28-7B](https://huggingface.co/automerger/YamshadowExperiment28-7B)
-* [nlpguy/T3QM7XP](https://huggingface.co/nlpguy/T3QM7XP)
 
 ## 🧩 Configuration
 
 ```yaml
-slices:
-  - sources:
-      - model: automerger/YamshadowExperiment28-7B
-        layer_range: [0, 32]
-      - model: nlpguy/T3QM7XP
-        layer_range: [0, 32]
-merge_method: slerp
-base_model: automerger/YamshadowExperiment28-7B
-parameters:
-  t:
-    - filter: self_attn
-      value: [0, 0.5, 0.3, 0.7, 1]
-    - filter: mlp
-      value: [1, 0.5, 0.7, 0.3, 0]
-    - value: 0.5
-dtype: bfloat16
-random_seed: 0
-```
+models:
+  - model: mistralai/Mistral-7B-v0.1
+  - model: automerger/YamshadowExperiment28-7B
+  - model: nlpguy/T3QM7XP
+merge_method: model_stock
+base_model: mistralai/Mistral-7B-v0.1
+dtype: bfloat16
+```
 
 ## 💻 Usage
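For reference, the removed configuration merged the two models with slerp, using different `t` schedules for the `self_attn` and `mlp` filters. A minimal sketch of spherical linear interpolation over flattened weight tensors, purely illustrative and not mergekit's actual implementation:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t = 0 returns v0, t = 1 returns v1; intermediate values follow the
    great-circle arc between the two directions.
    """
    n0 = np.linalg.norm(v0) + eps
    n1 = np.linalg.norm(v1) + eps
    # angle between the two tensors, clipped for numerical safety
    dot = np.clip(np.dot(v0 / n0, v1 / n1), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:  # nearly parallel: fall back to plain lerp
        return (1 - t) * v0 + t * v1
    return (np.sin((1 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)
```

In the removed config, the schedule `value: [0, 0.5, 0.3, 0.7, 1]` varies `t` across layer depth, so early attention layers lean toward the first model and late ones toward the second.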
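The new configuration switches to `model_stock` anchored at mistralai/Mistral-7B-v0.1. Roughly, Model Stock averages the fine-tuned weights and interpolates that average toward the base model by a ratio derived from the angle between the fine-tuned task vectors. A toy per-tensor sketch; the ratio formula t = N·cosθ / (1 + (N−1)·cosθ) follows the Model Stock paper, and this is an assumption about (not a copy of) mergekit's internals:

```python
import numpy as np

def model_stock(base, finetuned, eps=1e-8):
    """Toy Model Stock-style merge for a single weight tensor.

    `base` is the pretrained weight, `finetuned` a list of fine-tuned
    weights. Task vectors (finetuned - base) are compared by cosine
    similarity; the closer they agree, the further the merge moves
    from the base toward their average.
    """
    deltas = [np.ravel(w - base) for w in finetuned]
    n = len(deltas)
    # average pairwise cosine similarity between task vectors
    cos = np.mean([
        np.dot(deltas[i], deltas[j])
        / (np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j]) + eps)
        for i in range(n) for j in range(i + 1, n)
    ])
    t = n * cos / (1 + (n - 1) * cos)  # interpolation ratio from the paper
    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * base
```

Two sanity checks fall out of the formula: identical fine-tunes give cosθ = 1, so t = 1 and the merge is just the fine-tuned weights; orthogonal task vectors give cosθ = 0, so t = 0 and the merge collapses back to the base.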