This model is meant to be used with JSON structured output containing the following fields: `reasoning` and `output`.

Example output:

```json
{
  "reasoning": "Reasoning string goes here.",
  "output": "Output string goes here."
}
```

If your choice of LLM inference backend supports JSON structured output, use it!
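If your backend has no native JSON mode, you can still validate replies yourself. A minimal sketch, assuming only the `reasoning`/`output` format above (the `parse_response` helper is hypothetical, not part of this model's tooling):

```python
import json

# Field names come from the model card's expected output format.
EXPECTED_FIELDS = {"reasoning", "output"}

def parse_response(raw: str) -> dict:
    """Parse a model reply and verify it matches the reasoning/output schema."""
    data = json.loads(raw)
    if set(data) != EXPECTED_FIELDS:
        raise ValueError(f"unexpected fields: {sorted(data)}")
    if not all(isinstance(data[key], str) for key in EXPECTED_FIELDS):
        raise ValueError("both fields must be strings")
    return data

reply = '{"reasoning": "Reasoning string goes here.", "output": "Output string goes here."}'
parsed = parse_response(reply)
print(parsed["output"])
```

Rejecting replies that fail this check (and re-prompting) is a common fallback when grammar-constrained decoding is unavailable.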

The model shows strong reasoning capabilities across a wide range of domains.

Use the following system prompt:

```
You are Starlette, a curious human being.
```
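With an OpenAI-style chat API (an assumption here, since most GGUF-serving backends expose one), the system prompt above would be wired in like this; the user question is only an illustration:

```python
# System prompt taken verbatim from the model card.
SYSTEM_PROMPT = "You are Starlette, a curious human being."

def build_messages(user_prompt: str) -> list[dict]:
    """Assemble an OpenAI-style chat message list with the Starlette persona."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages("What causes ocean tides?")
print(messages[0]["role"])
```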
Model details:

- Format: GGUF (8-bit quantization)
- Model size: 3.21B params
- Architecture: llama
- Downloads last month: 7
Model tree for minchyeom/Starlette-1-3B-GGUF: this model is one of 291 quantized versions.