🙃 TypoRP: Accessible Inference—At your fingertips

We publish TypoRP, a 0.5B-parameter model based on the Qwen2.5 architecture and designed for roleplaying (trained using a modified version of our training script). We used Sapling Dream, another of our models, as the starting point for further training.
Using a reasoning model as the base for a roleplaying model might seem unreasonable at first, but here are some points to consider:

  • Sapling Dream doesn't reason inside tags unless instructed to: this means the model can be used both as a reasoning model and in the traditional way
  • Reasoning requires strong in-context learning: this is something that can heavily enhance the roleplaying experience
  • Coupled with the dataset used, the model delivers solid performance for its size and beyond

Example Input & Output

Input: Mr. Edward Rochester: *leaning forward, eyes locked on the auctioneer* I trust you know the importance of that painting to my family.
Output: Mary Leavenworth: *glancing sideways, coolly* Importance? It holds a significant value to my family as well, Mr. Rochester. What makes your claim stronger than mine?

Quick Information

  • Context Length: 32 * 1024 tokens (32,768)
  • Recommended Max Generation Length: 4 * 1024 tokens (4,096)
  • Recommended Temperature: 0 - 0.3 (the sketch below this list shows how these settings map to generation arguments)
  • Strengths: Reasoning (only if instructed to), some RAG, and mainly, ROLEPLAYING
  • Weakness: The small model size limits performance
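
For use without a frontend, here is a minimal sketch of how these settings might map onto Hugging Face Transformers generation arguments. The repo ID and BF16 dtype are taken from this page; the generate() helper and the exact argument choices are illustrative assumptions, not an official recipe.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "XeTute/TypoRP-0.5B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

def generate(prompt: str) -> str:
    """Generate a continuation using the settings recommended above."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=4 * 1024,  # recommended max generation length
        do_sample=True,
        temperature=0.3,          # upper end of the recommended 0 - 0.3 range
    )
    # Strip the prompt tokens so only the newly generated text is returned.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```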

Usage

We recommend prompting the model with the following format:

## Characters:
{Character 1 name}
{Character 1 persona}

{Character 2 name}
{Character 2 persona}

## Scenario
{Describe a scenario / what you'd consider the "first" message in Character.AI or similar}

## Roleplay
{Character 1 name}: {Character 1 message}
{Character 2 name}: {Character 2 message}
{Character 1 name}: {Character 1 message}
{repeat...}

The "Chat" mode of the KoboldAI / KoboldCPP WebUI is a perfect fit for this format.
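
If you build the prompt yourself instead, here is a minimal sketch of assembling the format above into a single string. The character names and first message come from the example earlier on this page; the persona and scenario strings are invented placeholders, and the build_prompt helper is purely illustrative. The result can be passed to the generate() helper from the earlier sketch.

```python
def build_prompt(characters, scenario, turns, next_speaker):
    """Assemble the '## Characters / ## Scenario / ## Roleplay' prompt format."""
    parts = ["## Characters:"]
    for name, persona in characters.items():
        parts.append(f"{name}\n{persona}\n")
    parts.append("## Scenario")
    parts.append(f"{scenario}\n")
    parts.append("## Roleplay")
    for speaker, message in turns:
        parts.append(f"{speaker}: {message}")
    # End with the next speaker's name so the model continues that character's turn.
    parts.append(f"{next_speaker}:")
    return "\n".join(parts)

prompt = build_prompt(
    characters={
        "Mr. Edward Rochester": "A proud gentleman set on reclaiming a painting tied to his family.",
        "Mary Leavenworth": "A composed rival bidder with her own claim to the painting.",
    },
    scenario="An auction house; a disputed painting is about to go under the hammer.",
    turns=[(
        "Mr. Edward Rochester",
        "*leaning forward, eyes locked on the auctioneer* I trust you know the "
        "importance of that painting to my family.",
    )],
    next_speaker="Mary Leavenworth",
)
# print(generate(prompt))  # using the generate() helper from the earlier sketch
```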


Our Apps & Socials

Chat with our Assistant | Support us Financially | Visit our GitHub

Long live the Islamic Republic of Pakistan; Glory to the Islamic Republic of Pakistan 🇵🇰
