Lewdiculous committed on
Commit af884ce
1 Parent(s): 578a000

Update README.md

Files changed (1)
  1. README.md +6 -15
README.md CHANGED
@@ -40,22 +40,13 @@ Base⇢ GGUF(F16)⇢ Imatrix-Data(F16)⇢ GGUF(Imatrix-Quants)
 
 ## Original model information:
 
-# Kool-Aid
-
-![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/Bpz3HVIKPaEn_Sz6OD86Z.jpeg)
-
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
-## Merge Details
-### Merge Method
-
 This model was merged using the SLERP merge method.
 
 ### Models Merged
 
 The following models were included in the merge:
-* ErosEris
-* CookieNexus
+* [Endevor/InfinityRP-v1-7B](https://huggingface.co/Endevor/InfinityRP-v1-7B)
+* [l3utterfly/mistral-7b-v0.1-layla-v4](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v4)
 
 ### Configuration
 
@@ -64,12 +55,12 @@ The following YAML configuration was used to produce this model:
 ```yaml
 slices:
   - sources:
-      - model: CookieNexus
+      - model: Endevor/InfinityRP-v1-7B
         layer_range: [0, 32]
-      - model: ErosEris
+      - model: l3utterfly/mistral-7b-v0.1-layla-v4
         layer_range: [0, 32]
 merge_method: slerp
-base_model: CookieNexus
+base_model: Endevor/InfinityRP-v1-7B
 parameters:
   t:
     - filter: self_attn
@@ -78,4 +69,4 @@ parameters:
       value: [1, 0.5, 0.7, 0.3, 0]
     - value: 0.5
 dtype: bfloat16
-```
+```
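
The card above only names the method. As background: SLERP (spherical linear interpolation) blends two weight tensors along the arc between their directions rather than the straight chord, which preserves magnitude better than plain averaging. A minimal pure-Python sketch of the idea (illustrative only — not mergekit's actual implementation; the `slerp` helper and list-based "tensors" are assumptions for this example):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    arc between the two directions instead of the straight line.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Angle between the two vectors, clamped for numerical safety.
    cos_omega = max(-1.0, min(1.0, dot / (n0 * n1 + eps)))
    omega = math.acos(cos_omega)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1.0 - t) * a + t * b for a, b in zip(v0, v1)]
    so = math.sin(omega)
    w0 = math.sin((1.0 - t) * omega) / so
    w1 = math.sin(t * omega) / so
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]
```

In the mergekit config above, `t` plays this interpolation-weight role: the `filter: self_attn` and `filter: mlp` entries give graded per-layer schedules for those tensor groups, and the bare `value: 0.5` acts as the default for all remaining tensors.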