ParasiticRogue committed
Commit 8c4b29f
1 Parent(s): 66d3d3a

Update README.md

Files changed (1): README.md (+7 -7)
README.md CHANGED
````diff
@@ -103,10 +103,10 @@ Yes, this is just ChatML mixed with Vicuna, but without the im_start tokens, and
 
 The following models were included in the merge:
 
-https://huggingface.co/migtissera/Tess-34B-v1.5b
-
 https://huggingface.co/NousResearch/Nous-Capybara-34B
 
+https://huggingface.co/migtissera/Tess-34B-v1.5b
+
 https://huggingface.co/jondurbin/nontoxic-bagel-34b-v0.2
 
 https://huggingface.co/maywell/PiVoT-SUS-RP
@@ -123,24 +123,24 @@ The following YAML configuration was used to produce this model:
 
 ```yaml
 models:
-  - model: /home/oem/Desktop/merge/Nyakura-CausalLM-RP-34B
+  - model: Nyakura-CausalLM-RP-34B
     parameters:
       weight: 0.16
       density: 0.42
-  - model: /home/oem/Desktop/merge/Nontoxic-PiVoT-Bagel-RP-34b
+  - model: Nontoxic-PiVoT-Bagel-RP-34b
     parameters:
       weight: 0.22
       density: 0.54
-  - model: /home/oem/Desktop/merge/Tess-34B-v1.5b
+  - model: Tess-34B-v1.5b
     parameters:
       weight: 0.28
       density: 0.66
-  - model: /home/oem/Desktop/merge/Nous-Capybara-34B-V1.9
+  - model: Nous-Capybara-34B-V1.9
     parameters:
       weight: 0.34
       density: 0.78
 merge_method: dare_ties
-base_model: /home/oem/Desktop/merge/Yi-34B-200K-Llama
+base_model: Yi-34B-200K-Llama
 parameters:
   int8_mask: true
 dtype: bfloat16
````
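For context on what the config's knobs do: with `merge_method: dare_ties`, each model's difference from the base (its task vector) is randomly sparsified, keeping roughly a `density` fraction of its entries and rescaling the survivors, and the sparsified deltas are then sign-elected and combined according to each model's `weight`. The sketch below is a rough NumPy illustration of that idea only; it is not mergekit's actual implementation, and every name in it is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_ties_merge(base, tuned, weights, densities):
    """Illustrative DARE-TIES merge of one parameter tensor (not mergekit's code)."""
    deltas = []
    for model, density in zip(tuned, densities):
        delta = model - base                          # task vector vs. the base model
        keep = rng.random(delta.shape) < density      # DARE: keep ~density of entries
        deltas.append(np.where(keep, delta / density, 0.0))  # rescale survivors

    # TIES-style sign election: the dominant sign per parameter wins.
    weighted = [w * d for w, d in zip(weights, deltas)]
    elected = np.sign(sum(weighted))

    # Combine only the deltas that agree with the elected sign.
    numer = np.zeros_like(base)
    denom = np.zeros_like(base)
    for w, d in zip(weights, deltas):
        agrees = np.sign(d) == elected
        numer += np.where(agrees, w * d, 0.0)
        denom += np.where(agrees, w, 0.0)

    return base + np.where(denom > 0, numer / np.maximum(denom, 1e-12), 0.0)

# Toy usage with the weights/densities from the config above.
base = rng.normal(size=(8, 8))
tuned = [base + rng.normal(scale=0.05, size=(8, 8)) for _ in range(4)]
merged = dare_ties_merge(base, tuned,
                         weights=[0.16, 0.22, 0.28, 0.34],
                         densities=[0.42, 0.54, 0.66, 0.78])
print(merged.shape)
```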
 
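Once a config of this shape has been run through mergekit (its `mergekit-yaml` entry point consumes YAML like the above) and the merged weights written out, the result loads like any other Hugging Face checkpoint. A minimal sketch; the `./merged-model` path is a placeholder, not something from this commit:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder: wherever mergekit wrote the output

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the config
    device_map="auto",
)

# In practice, use the card's prompt format (the ChatML/Vicuna hybrid noted above);
# a bare prompt is enough to smoke-test the merge.
inputs = tokenizer("The merged model says:", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```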