A smart model made using the cleaned Orca data.

{System Prompt}

Username: {Input}
BotName: {Response}
Username: {Input}
BotName: {Response}
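
The template above can be assembled programmatically. Below is a minimal Python sketch, assuming the literal "Username"/"BotName" placeholders shown in the template; the helper function, argument names, and example system prompt are illustrative, not part of the model card.

```python
# Minimal sketch of building a prompt in the template above.
# The names "Username" and "BotName" mirror the placeholders in the card;
# swap in your own character names as needed.
def build_prompt(system_prompt, turns, user_name="Username", bot_name="BotName"):
    """Format a multi-turn conversation for this model.

    `turns` is a list of (user_input, bot_response) pairs; pass None as the
    final response to leave the prompt open for the model to continue.
    """
    lines = [system_prompt, ""]
    for user_input, bot_response in turns:
        lines.append(f"{user_name}: {user_input}")
        if bot_response is not None:
            lines.append(f"{bot_name}: {bot_response}")
        else:
            lines.append(f"{bot_name}:")  # generation continues from here
    return "\n".join(lines)


if __name__ == "__main__":
    prompt = build_prompt(
        "You are BotName, a helpful roleplay partner.",
        [("Hello there!", "Hi! How can I help?"), ("Tell me a story.", None)],
    )
    print(prompt)
```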

Seriously, I have to add more due to HF Leaderboard requirements. So, basically: this model uses a cleaned version of Orca along with my typical RP data package. It was intended as a test to see whether the model's RP evals would be affected by an overwhelming amount of instruct tokens.
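
For completeness, here is a hedged sketch of loading the model for generation. It assumes the Hugging Face `transformers` library and the repo id from this model page; the sampling settings and example prompt are illustrative only.

```python
# A minimal sketch, assuming transformers + torch are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cgato/TheSpice-7b-FT-ExperimentalOrca"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are published in BF16
    device_map="auto",
)

prompt = (
    "You are BotName, a helpful roleplay partner.\n\n"
    "Username: Hello there!\n"
    "BotName:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```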

