# Model Card for TikZero+ (10b)

TikZero+ (10b) is a multimodal language model that automatically synthesizes scientific figures as editable, semantics-preserving TikZ graphics programs conditioned on text captions. It is based on DeTikZify v2 (8b) and LLaMA 3.2 (1b), integrates TikZero, and adds end-to-end fine-tuning. Check out the DeTikZify project for more information and tips on how best to run the model.

## Usage

```python
from detikzify.model import load
from detikzify.infer import DetikzifyPipeline

caption = "A multi-layer perceptron with two hidden layers."
pipeline = DetikzifyPipeline(*load(
    model_name_or_path="nllg/tikzero-plus-10b",
    device_map="auto",
    torch_dtype="bfloat16",
))

# generate a single TikZ program
fig = pipeline.sample(text=caption)

# if it compiles, rasterize it and show it
if fig.is_rasterizable:
    fig.rasterize().show()
```
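Since a sampled TikZ program is not guaranteed to compile, it can be useful to sample repeatedly until a rasterizable figure is produced. Below is a minimal, model-agnostic sketch of such a retry loop; the `sample_until` helper is illustrative and not part of the DeTikZify API:

```python
def sample_until(sample_fn, is_valid, max_tries=5):
    """Call sample_fn repeatedly until is_valid(result) is true.

    Returns the first valid result, or None if max_tries is exhausted.
    """
    for _ in range(max_tries):
        result = sample_fn()
        if is_valid(result):
            return result
    return None

# With the pipeline above, one might use it like this (hypothetical usage):
# fig = sample_until(lambda: pipeline.sample(text=caption),
#                    lambda f: f.is_rasterizable)
# if fig is not None:
#     fig.rasterize().show()
```

Keeping the helper generic (a sampling callable plus a validity predicate) makes it easy to swap in other stopping criteria, such as a quality score threshold.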

## Acknowledgments

This model was trained using computational resources provided by the bwForCluster Helix, as part of the bwHPC-S5 project. The authors acknowledge support from the state of Baden-Württemberg through the bwHPC initiative and the German Research Foundation (DFG) under grant INST 35/1597-1 FUGG.
