Since QwQ is really good even without reasoning, and is still based on Qwen 2.5, I decided to try merging some cool Qwen 2.5 models with a QwQ RP tune.

My goal was to create a QwQ-based model that is good for roleplay (ERP too) and can also be used in Russian.

The model is creative, smart, stable, and very good at instruction following.

Russian RP is possible, and the model handles partially translated character cards fine.

The model was tested on 100 answers. Reasoning is possible, but because it is slow on my hardware, I only tested reasoning as a proof of possibility.

Tested on Q3_K_L with ChatML, temperature 1.01, XTC 0.1 / 0.1; for Russian, XTC 0.1 / 0.1 as well.
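
If you want to try the model outside a chat frontend, here is a minimal sketch using the transformers library with a ChatML-style prompt and the tested temperature of 1.01. This is an illustration under assumptions, not the exact test setup: the tests above were run on the Q3_K_L GGUF quant in a llama.cpp-style backend, XTC sampling is not available in transformers, and the prompt text is just a placeholder.

```python
# Minimal sketch: load the merge with transformers and sample at the tested
# temperature. Assumes enough VRAM/RAM for a 32.8B FP16 model; for the
# Q3_K_L GGUF quant mentioned above, use a llama.cpp-based backend instead.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OddTheGreat/Port-Arthur_QwQ_32B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# ChatML-style conversation; the tokenizer's chat template builds the
# <|im_start|>/<|im_end|> structure used by Qwen-family models.
# The card/persona content here is a placeholder.
messages = [
    {"role": "system", "content": "You are a roleplay character named Arthur."},
    {"role": "user", "content": "Привет! Опиши порт на рассвете."},  # Russian input works too
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Temperature 1.01 as in the tests above. XTC (0.1 / 0.1) is a sampler from
# llama.cpp/koboldcpp-style frontends and is not built into transformers,
# so it is omitted here.
output = model.generate(
    inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=1.01,
)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For the settings actually used in testing (Q3_K_L, XTC 0.1 / 0.1), run the GGUF in your usual llama.cpp-based frontend and set ChatML as the instruct template.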

Model size: 32.8B parameters (FP16, Safetensors)
