Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.

a coder model finetuned for RP. because why not?

trained on a mixture of synthetic and natural RP data, plus storywriting/novel data from various sources (sugarquill, a subset of Recursal's SCP dataset, and miscellaneous novels), for roughly 17 hours on 2x 3090s rented from runpod.

original: https://huggingface.co/Hasnonname/Qwen2.5-Monte-7B-v0.0

it's either overcooked or undercooked, and I can't tell which. regardless, thanks for giving it a shot.

use if you want:

  • lack of anatomical and spatial awareness
  • crazy mood swings
  • mean characters actually being mean (sometimes)
  • (occasionally) human-like prose
GGUF quants:

  • model size: 7.62B params
  • architecture: qwen2
  • available quants: 4-bit, 5-bit, 6-bit, 8-bit
  • base model: Qwen/Qwen2.5-7B
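
if you want to poke at it locally, here's a minimal sketch using llama-cpp-python. the GGUF filename pattern below is a guess on my part, so check this repo's file list and point it at whichever quant you actually downloaded:

```python
# minimal sketch, assuming llama-cpp-python and huggingface_hub are installed
# (pip install llama-cpp-python huggingface_hub).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="Hasnonname/Qwen2.5-Monte-7B-v0.0-GGUF",
    filename="*Q4_K_M.gguf",  # hypothetical pattern; swap in your 4/5/6/8-bit file
    n_ctx=8192,               # context length; lower it if you're tight on memory
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "you are an RP partner. stay in character."},
        {"role": "user", "content": "the tavern door creaks open and you look up from your drink."},
    ],
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["message"]["content"])
```

the same files should work with plain llama.cpp or any other GGUF-aware frontend; the chat template should be baked into the GGUF metadata, so most tools will pick it up automatically.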