---
tags:
- merge
---

## Evolutionary model merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

The merge recipe was found through an evolutionary search (104 evaluations).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method.

### Models Merged

The following models were included in the merge:

* Mistral-7B-v0.1-flashback-v2
* Mistral-7B-Merge-14-v0.2
* Starling-LM-7B-beta_581094980

Base model: mlabonne/NeuralBeagle14-7B

### Configuration
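The exact configuration used for this merge is not reproduced here. As a rough illustration only, a DARE-TIES mergekit config for the models listed above could look like the following sketch; the `density` and `weight` values are placeholders, not the tuned values produced by the evolutionary search:

```yaml
# Hypothetical sketch, not the actual configuration of this model.
# density/weight values are placeholders for illustration.
merge_method: dare_ties
base_model: mlabonne/NeuralBeagle14-7B
models:
  - model: mlabonne/NeuralBeagle14-7B
    # the base model itself takes no merge parameters
  - model: Mistral-7B-v0.1-flashback-v2
    parameters:
      density: 0.5
      weight: 0.4
  - model: Mistral-7B-Merge-14-v0.2
    parameters:
      density: 0.5
      weight: 0.3
  - model: Starling-LM-7B-beta_581094980
    parameters:
      density: 0.5
      weight: 0.3
dtype: bfloat16
```

In a `dare_ties` merge, `density` controls what fraction of each model's delta weights is kept after random pruning, and `weight` scales each model's contribution before the TIES sign-consensus step.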