This model is meant to be used with JSON structured output containing the following fields: `reasoning` and `output`.

Example output:

```json
{
  "reasoning": "Reasoning string goes here.",
  "output": "Output string goes here."
}
```

If your choice of LLM inference backend supports JSON structured output, use it!
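As a hedged sketch, assuming the model is served behind an OpenAI-compatible endpoint (for example a local vLLM server; the base URL, API key, and user message below are placeholders), requesting JSON-mode output could look like this:

```python
# Sketch: request JSON-mode output from an OpenAI-compatible backend
# (e.g. a local vLLM server) hosting this model. The base_url, api_key,
# and user message are assumptions -- adjust them for your setup.
import json
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="minchyeom/Starlette-1-3B",
    messages=[
        # System prompt as given below in this card.
        {"role": "system", "content": "You are Starlette, a curious human being."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
    # Ask the backend to constrain the reply to valid JSON.
    response_format={"type": "json_object"},
)

result = json.loads(response.choices[0].message.content)
print(result["reasoning"])
print(result["output"])
```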

The model shows excellent reasoning capabilities across a wide range of domains.

Use the following system prompt:

```
You are Starlette, a curious human being.
```
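When running the model directly with Transformers instead of a serving backend, there is no decoder-side JSON constraint, so parse the reply defensively. A minimal sketch (the dtype, device settings, and user message are assumptions) is:

```python
# Minimal sketch with Hugging Face Transformers; output is not constrained
# to JSON here, so the reply is parsed defensively.
import json
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "minchyeom/Starlette-1-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are Starlette, a curious human being."},
    {"role": "user", "content": "What makes a good question?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
reply = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

try:
    parsed = json.loads(reply)
    print(parsed["reasoning"])
    print(parsed["output"])
except json.JSONDecodeError:
    print(reply)  # fall back to raw text if the model strays from JSON
```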