---
language:
  - en
tags:
  - text generation
  - pytorch
  - causal-lm
  - gpt_neox
license: mit
datasets:
  - hoskinson-center/proof-pile
---

# ProofGPT-v0.1

## Model Description

ProofGPT-v0.1 is a 1.3B-parameter language model based on the GPT-NeoX architecture and trained on the [proof-pile](https://huggingface.co/datasets/hoskinson-center/proof-pile) dataset. The model was initialized with pythia-1.3b weights. ProofGPT-v0.1's Weights & Biases training log is viewable here.
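
Since the model uses the GPT-NeoX architecture, it can be loaded with the standard Hugging Face Transformers causal-LM classes. Below is a minimal usage sketch; the Hub id `hoskinson-center/proofGPT-v0.1` is an assumption (this card does not state it), so substitute the actual repository id if it differs.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub id; replace with the actual repository id if different.
model_id = "hoskinson-center/proofGPT-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt with LaTeX-style mathematical text, matching the proof-pile domain.
prompt = r"\begin{theorem} There are infinitely many primes."
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```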

Detailed evaluations coming soon :)

Note: Commit `9695b51` updated the tokenizer to have `bos`, `eos`, and `unk` tokens.
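
If you are unsure which tokenizer revision you have, a quick check (using the same assumed Hub id as above) is to inspect the special tokens directly:

```python
from transformers import AutoTokenizer

# Assumed Hub id; replace with the actual repository id if different.
tokenizer = AutoTokenizer.from_pretrained("hoskinson-center/proofGPT-v0.1")

# After commit 9695b51 these should all be defined rather than None.
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)
```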