---
license: mit
---
|
|
|
### llmstep: [L]LM proofstep suggestions in Lean |
|
https://github.com/wellecks/llmstep |
|
|
|
This model is a Pythia-2.8b-deduped language model fine-tuned on [LeanDojo Benchmark 4](https://zenodo.org/record/8040110). |
|
|
|
The model is fine-tuned on sequences of the form: |
|
```
[GOAL]tactic-state[PROOFSTEP]next-tactic<|endoftext|>
```
|
This format corresponds to the proofstep objective from [Han et al., ICLR 2022](https://arxiv.org/abs/2102.06203).
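As a rough sketch of how this format can be used at inference time with the Hugging Face `transformers` library (the model id, example tactic state, and generation settings below are illustrative assumptions, not the repository's serving code):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: replace with this model's actual Hub id or a local path.
model_id = "wellecks/llmstep-mathlib4-pythia2.8b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a prompt in the training format: [GOAL]tactic-state[PROOFSTEP]
tactic_state = "n : ℕ\n⊢ n + 0 = n"  # example Lean tactic state
prompt = f"[GOAL]{tactic_state}[PROOFSTEP]"

inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)

# The suggested next tactic is whatever the model generates after [PROOFSTEP].
suggestion = tokenizer.decode(
    output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(suggestion)  # e.g. "simp" (illustrative output)
```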
|
The [python/train](python/train) directory in the repository shows how the model was fine-tuned. |
|
|
|
Please see the repository for more details. |
|
|
|
```bibtex
@misc{llmstep,
  author = {Sean Welleck},
  title = {llmstep: LLM proofstep suggestions in Lean},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/wellecks/llmstep}},
}
```