Qwen merges
Collection · 4 items
Since QwQ is really good even without reasoning, and is still based on Qwen 2.5, I decided to try merging some cool Qwen 2.5 models with a QwQ RP tune.
My goal was to create a QwQ-based model that is good for roleplay (ERP too) and can also be used in Russian.
The model is creative, smart, stable, and very good at instruction following.
Russian RP is possible, and the model handles partially translated character cards fine.
The model was tested on 100 answers. Reasoning is possible, but because it is slow on my hardware, I didn't test reasoning beyond a proof of possibility.
Tested on the Q3_K_L quant with the ChatML template, temperature 1.01, and XTC 0.1/0.1 (for Russian: XTC 0.1/0.01).
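As a minimal sketch, the settings above could be reproduced with llama.cpp's `llama-cli`, which exposes the XTC sampler via `--xtc-threshold` and `--xtc-probability`. The GGUF filename below is hypothetical; substitute the actual Q3_K_L file you downloaded, and note that your frontend may list the two XTC values in a different order.

```shell
# Sketch only: model filename is a placeholder, not the real release name.
# --xtc-threshold / --xtc-probability correspond to "XTC 0.1/0.1" above.
llama-cli -m ./qwq-rp-merge-Q3_K_L.gguf \
  --chat-template chatml \
  --temp 1.01 \
  --xtc-threshold 0.1 \
  --xtc-probability 0.1
```

For the Russian preset, only the XTC values change; everything else (quant, template, temperature) stays the same.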