sophosympatheia committed
Commit eb8a093
1 Parent(s): 3d79ec5

Update README.md


Added a few merge notes

Files changed (1):
  1. README.md +5 -0
README.md CHANGED
@@ -203,3 +203,8 @@ dtype: float16
 tokenizer_source: model:/home/llm/mergequant/models/BASE/152334H_miqu-1-70b-sf
 
 ```
+
+ Just a note on the configuration above: I tried several variations of the t parameter for this merge. I liked the results from the configuration above the best, but these other t arrays also produced good results:
+ * [0, 0, 0.1, 0.2, 0.4, 0.8, 0.4, 0.2, 0.1, 0, 0] -- This one definitely brought out more of Midnight Rose, but the result was a little too similar to it for my liking.
+ * [0, 0, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0, 0] -- It worked, but I would say this one was the runt of the litter.
+ * [0, 0, 0.1, 0.2, 0.3, 0.35, 0.3, 0.2, 0.1, 0, 0] -- This was my second-favorite merge after the one I released, which suggests that favoring Miqu over the secondary model is the way to go.
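For readers unfamiliar with where these arrays plug in: in a mergekit SLERP config, `t` is the per-layer interpolation weight toward the secondary model (0 keeps the base model's weights, 1 takes the secondary model's). A minimal sketch of such a config is below — the secondary model path and layer ranges are illustrative placeholders, not the exact configuration from this repo:

```yaml
# Hypothetical mergekit SLERP config sketch.
# The t array interpolates layer groups between the two models:
# 0 = base model (Miqu here), 1 = secondary model.
slices:
  - sources:
      - model: /home/llm/mergequant/models/BASE/152334H_miqu-1-70b-sf
        layer_range: [0, 80]  # placeholder layer range
      - model: /path/to/secondary-model  # placeholder path
        layer_range: [0, 80]
merge_method: slerp
base_model: /home/llm/mergequant/models/BASE/152334H_miqu-1-70b-sf
parameters:
  t:
    # Gradient peaking mid-stack; ends at 0 keep the base model's
    # first and last layers intact.
    - value: [0, 0, 0.1, 0.2, 0.3, 0.35, 0.3, 0.2, 0.1, 0, 0]
dtype: float16
tokenizer_source: model:/home/llm/mergequant/models/BASE/152334H_miqu-1-70b-sf
```

Keeping the endpoints of `t` at 0 is a common choice in SLERP merges so the embedding-adjacent and output-adjacent layers stay aligned with the base model's tokenizer and head.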