Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.
a coder finetuned for RP. because why not?
trained on a mixture of synthetic and natural RP data, plus storywriting/novel data from various sources (sugarquill, a subset of Recursal's SCP dataset, and miscellaneous novels), for roughly 17 hours on 2x 3090s from runpod
original: https://huggingface.co/Hasnonname/Qwen2.5-Monte-7B-v0.0
it's either overcooked or undercooked, and I can't tell which. regardless, thanks for giving it a shot.
use if you want:
- lack of anatomical and spatial awareness
- crazy mood swings
- mean characters actually being mean (sometimes)
- (occasionally) human-like prose
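
if you want a quick way to try it, here's a minimal sketch using llama-cpp-python (just one option; any GGUF runner works). the quant filename pattern below is a guess, so point it at whichever .gguf you actually grab from the files tab:

```python
# minimal sketch: pull a quant from this repo and chat with it via llama-cpp-python.
# the filename pattern is an assumption -- swap in the quant you actually want.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="Hasnonname/Qwen2.5-Monte-7B-v0.0-GGUF",
    filename="*Q4_K_M.gguf",  # assumed quant name; check the repo's file list
    n_ctx=8192,
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are Monte, a roleplay partner."},
        {"role": "user", "content": "The tavern door slams open."},
    ],
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["message"]["content"])
```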
Model tree for Hasnonname/Qwen2.5-Monte-7B-v0.0-GGUF

- Base model: Qwen/Qwen2.5-7B
- Finetuned: Qwen/Qwen2.5-Coder-7B
- Finetuned: Qwen/Qwen2.5-Coder-7B-Instruct
- Finetuned: Hasnonname/Qwen2.5-Monte-7B-v0.0