---
language:
  - en
  - code
  - multilingual
license: mpl-2.0
---

# PythonGPT

A GPT-2-style neural network with 50 million parameters, trained from scratch on 16 gigabytes of Python scripts.

Made as a toy.
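
A minimal usage sketch, assuming the checkpoint is published in the standard Hugging Face `transformers` format for causal language models; the repo id below is a placeholder, not something stated in this card.

```python
# Sketch: generate Python code with a GPT-2-style causal LM via transformers.
# The repo id is a placeholder; substitute the actual model repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/pyGPT-50M"  # placeholder, not confirmed by this card

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Prompt the model with the start of a Python function and sample a completion.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```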