LeroyDyer committed
Commit
695aa20
1 Parent(s): b216166

Update README.md

Files changed (1)
  1. README.md +1 -39
README.md CHANGED
@@ -1,11 +1,6 @@
  ---
- base_model:
- - mistralai/Mistral-7B-Instruct-v0.1
-
  library_name: transformers
- tags:
- - mergekit
- - merge
+
  license: mit
  language:
  - en
@@ -16,39 +11,6 @@ metrics:
  ---
  # Mixtral_Chat_7b

- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
-
- Locutusque/Hercules-3.1-Mistral-7B:
- -Contains erronious
- mistralai/Mistral-7B-Instruct-v0.2:
- -contains the eronous model !
- NousResearch/Hermes-2-Pro-Mistral-7B:
- -Good
-
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml

- models:
-   - model: LeroyDyer/Mixtral_Base_Chat_7b
-     parameters:
-       weight: 0.7
-   - model: LeroyDyer/Mixtral_Base_Chat_7b_2.0
-     parameters:
-       weight: 0.3
- merge_method: linear
- dtype: float16

  ```
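For context: the removed section's `linear` merge method is a weighted average of corresponding model parameters (weights 0.7 and 0.3 in the removed YAML). A minimal Python sketch of that averaging, assuming plain dict "state dicts" with scalar values in place of real tensors, and not mergekit's actual implementation:

```python
# Minimal sketch of a linear (weighted-average) model merge.
# Assumption: each model is a dict mapping parameter names to values;
# mergekit's real code operates on PyTorch tensors and shards.
def linear_merge(state_dicts, weights):
    """Return a new state dict whose entries are the weight-normalized
    average of the matching entries in `state_dicts`."""
    assert len(state_dicts) == len(weights)
    total = sum(weights)
    return {
        key: sum(w * sd[key] for sd, w in zip(state_dicts, weights)) / total
        for key in state_dicts[0]
    }

# Toy usage mirroring the removed config's 0.7 / 0.3 split:
a = {"layer.weight": 1.0}
b = {"layer.weight": 3.0}
merged = linear_merge([a, b], [0.7, 0.3])  # 0.7*1.0 + 0.3*3.0 = 1.6
```

With normalized weights that sum to 1.0, the division by `total` is a no-op; it keeps the average well-defined when the configured weights do not sum to one.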
 