---
license: cc-by-nc-4.0
tags:
- merge
---

## Description

This repo contains the bf16 weights of Lonepino-11B, a merge of the four Mistral-7B models listed below. Just a normal model.
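
A minimal loading sketch, assuming a standard `transformers` + `torch` setup; the repo id below is a placeholder for wherever these weights actually live:

```python
# Minimal loading sketch (assumptions: transformers + accelerate installed,
# and model_id points at the actual repo id or a local copy of this model).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Lonepino-11B"  # placeholder: replace with the real repo id or a local path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the repo ships bf16 weights
    device_map="auto",
)

prompt = "### Instruction:\nWrite a haiku about model merging.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```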

## Models used

- [Intel/neural-chat-7b-v3-3-Slerp](https://huggingface.co/Intel/neural-chat-7b-v3-3-Slerp)
- [NeverSleep/Noromaid-7b-v0.2](https://huggingface.co/NeverSleep/Noromaid-7b-v0.2)
- [chargoddard/loyal-piano-m7-cdpo](https://huggingface.co/chargoddard/loyal-piano-m7-cdpo)
- [maywell/PiVoT-0.1-Starling-LM-RP](https://huggingface.co/maywell/PiVoT-0.1-Starling-LM-RP)

## The secret sauce

Each intermediate 11B model is a passthrough stack of the first 24 layers of one 7B model followed by the last 24 layers of another (48 layers total); the two stacks are then SLERP-merged (t = 0.4) into Lonepino-11B.

neural-maid-11B:
```
slices:
  - sources:
      - model: Intel/neural-chat-7b-v3-3-Slerp
        layer_range: [0, 24]
  - sources:
      - model: NeverSleep/Noromaid-7b-v0.2
        layer_range: [8, 32]

merge_method: passthrough
dtype: bfloat16
```

loyal-PiVoT-11B:
```
slices:
  - sources:
      - model: chargoddard/loyal-piano-m7-cdpo
        layer_range: [0, 24]
  - sources:
      - model: maywell/PiVoT-0.1-Starling-LM-RP
        layer_range: [8, 32]

merge_method: passthrough
dtype: bfloat16
```

Lonepino-11B:
```
slices:
  - sources:
      - model: "./neural-maid-11B"
        layer_range: [0, 48]
      - model: "./loyal-PiVoT-11B"
        layer_range: [0, 48]
merge_method: slerp
base_model: "./neural-maid-11B"
parameters:
  t:
    - value: 0.4
dtype: bfloat16
```

## Prompt template

Alpaca, or any format you like.
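
For reference, a typical Alpaca-style prompt can be built like this (the exact preamble wording is an assumption and varies between fine-tunes):

```python
# Typical Alpaca-style prompt format (assumed wording; tweak to taste).
def alpaca_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(alpaca_prompt("Summarize the merge recipe above in one sentence."))
```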

=w=

I used [mergekit](https://github.com/cg123/mergekit) for all the merging described here.
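
A rough reproduction sketch: after `pip install mergekit`, each config above can be fed to the `mergekit-yaml` entry point (the config file names below are placeholders, and available options vary between mergekit versions):

```python
# Sketch: run the three configs above through mergekit's CLI in order.
# Assumptions: mergekit is installed, and each config was saved under the
# placeholder file name used here.
import subprocess

for config, out_dir in [
    ("neural-maid-11B.yml", "./neural-maid-11B"),
    ("loyal-PiVoT-11B.yml", "./loyal-PiVoT-11B"),
    ("Lonepino-11B.yml", "./Lonepino-11B"),
]:
    subprocess.run(["mergekit-yaml", config, out_dir], check=True)
```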

Thanks to [Undi95](https://huggingface.co/Undi95) for the original [11B Mistral merge](https://huggingface.co/Undi95/Mistral-11B-OmniMix) recipe.