---
tags:
- not-for-all-audiences
- nsfw
---

# IceCaffeLatteRP-7b-4.2bpw-exl2

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:

* IceLatteRP-7b
* IceMochaccinoRP-7b

### Configuration

```yaml
slices:
  - sources:
      - model: IceMochaccinoRP-7b
        layer_range: [0, 32]
      - model: IceLatteRP-7b
        layer_range: [0, 32]
merge_method: slerp
base_model: IceLatteRP-7b
parameters:
  t:
    - filter: self_attn
```
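For intuition, the `slerp` merge method is named for spherical linear interpolation: instead of averaging two weight tensors along a straight line, it interpolates along the arc between them, preserving their magnitude characteristics. The sketch below shows the underlying formula on plain vectors; it is illustrative only and is not mergekit's actual implementation, which operates on full model tensors with the per-layer `t` schedule from the config above.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two vectors.

    Illustrative sketch of the SLERP formula; not mergekit's code.
    t=0 returns v0, t=1 returns v1.
    """
    # Angle between the vectors, from the normalized dot product.
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))  # clamp for numerical safety
    theta = math.acos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

At `t = 0.5` the result lies halfway along the arc between the two inputs, which is why intermediate `t` values blend the two source models rather than favoring either one.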