
ProofGPT-v0.1

Model Description

ProofGPT-v0.1 is a 6.7B-parameter language model based on the GPT-NeoX architecture and trained on the proof-pile (v1.1). We initialized training with pythia-6.7b weights, a precursor to the pythia-6.9b model that has roughly equivalent performance.

Detailed evaluations coming soon :)
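The checkpoint can be loaded like any other GPT-NeoX causal language model via the Hugging Face transformers library. Below is a minimal usage sketch; the prompt and generation settings are illustrative only, not recommended defaults.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained("hoskinson-center/proofGPT-v0.1-6.7B")
model = AutoModelForCausalLM.from_pretrained("hoskinson-center/proofGPT-v0.1-6.7B")

# Example prompt; any mathematical text works as input.
prompt = "Theorem. There are infinitely many prime numbers."
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation (settings chosen for illustration).
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```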
