---
library_name: transformers
tags: []
---

# Model Card for Ti*k*Zero+ (10b)
Ti*k*Zero+ (10b) is a multimodal language model that automatically synthesizes
scientific figures as editable, semantics-preserving
[Ti*k*Z](https://github.com/pgf-tikz/pgf) graphics programs conditioned on text
captions. It is based on [DeTi*k*Zify<sub>v2</sub>
(8b)](https://huggingface.co/nllg/detikzify-v2-8b) and [LLaMA<sub>3.2</sub>
(1b)](https://huggingface.co/meta-llama/Llama-3.2-1B), and integrates
the [Ti*k*Zero](https://huggingface.co/nllg/tikzero-adapter) adapter with
additional end-to-end fine-tuning. Check out the
[DeTi*k*Zify](https://github.com/potamides/DeTikZify) project for more
information and tips on how best to run the model.

## Usage
```python
from detikzify.model import load
from detikzify.infer import DetikzifyPipeline

caption = "A multi-layer perceptron with two hidden layers."

# load the model and wrap it in an inference pipeline
pipeline = DetikzifyPipeline(*load(
    model_name_or_path="nllg/tikzero-plus-10b",
    device_map="auto",
    torch_dtype="bfloat16",
))

# generate a single TikZ program conditioned on the caption
fig = pipeline.sample(text=caption)

# if it compiles, rasterize it and show it
if fig.is_rasterizable:
    fig.rasterize().show()
```
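Generated programs do not always compile, so a simple retry loop can help in
practice. The sketch below only reuses the `pipeline.sample` call from above;
it additionally assumes the returned figure object exposes its Ti*k*Z source
via a `code` attribute (as in the DeTi*k*Zify codebase), so the program can be
saved for manual editing:

```python
# sample candidates until one compiles, up to a fixed budget
for _ in range(10):
    fig = pipeline.sample(text=caption)
    if fig.is_rasterizable:
        # assumption: `fig.code` holds the generated TikZ source
        with open("figure.tex", "w") as f:
            f.write(fig.code)
        fig.rasterize().show()
        break
```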

## Acknowledgments
This model was trained using computational resources provided by the
bwForCluster Helix, as part of the bwHPC-S5 project. The authors acknowledge
support from the state of Baden-Württemberg through the bwHPC initiative and
the German Research Foundation (DFG) under grant INST 35/1597-1 FUGG.