---
language:
  - en
tags:
  - text generation
  - pytorch
  - causal-lm
  - gpt_neox
license: mit
datasets:
  - hoskinson-center/proof-pile
---

# ProofGPT-v0.1

## Model Description

ProofGPT-v0.1 is a 1.3B parameter language model based on the GPT-NeoX architecture and trained on the proof-pile (v1.1). We initialized training from the pythia-1.3b weights, a precursor to the pythia-1.4b model with roughly equivalent performance.

Detailed evaluations coming soon :)
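
As a minimal usage sketch, assuming the model is hosted under the same namespace as the dataset above (i.e. `hoskinson-center/proofGPT-v0.1`), the checkpoint should load through the standard `transformers` causal-LM API:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Repository id assumed to mirror the dataset namespace above; adjust if it differs.
model_id = "hoskinson-center/proofGPT-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Sample a short LaTeX-flavored continuation from the model.
prompt = r"\begin{theorem}"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```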

Note: Commit 3bcdc4e replaced the weights with a model trained on proof-pile v1.1; earlier commits contained weights trained on v1.0. Commit 9695b51 updated the tokenizer to include bos, eos, and unk tokens.
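
To verify that your local copy includes the tokenizer update from commit 9695b51, one quick check (again assuming the repository id above) is to inspect the special tokens directly:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hoskinson-center/proofGPT-v0.1")
# After commit 9695b51 these should print actual tokens rather than None.
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)
```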