pyGPT-50M / README.md
---
language:
  - en
  - code
  - multilingual
license: mpl-2.0
---

# PythonGPT

A GPT-2-style neural network with 50 million parameters, trained from scratch on 16 gigabytes of Python scripts.

Made as a toy.
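
If the checkpoint is published in the standard GPT-2 / `transformers` format, generation should work through the usual causal-LM API. A minimal sketch, assuming a Hub repo ID of `<user>/pyGPT-50M` (a placeholder, not the confirmed ID) and a tokenizer bundled with the model:

```python
# Minimal code-completion sketch using the transformers causal-LM API.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<user>/pyGPT-50M"  # placeholder; substitute the actual Hub repo ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Prompt with the start of a Python function and let the model continue it.
prompt = "def fibonacci(n):\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```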