# gpt2wikipedia5
---
library_name: transformers
tags: []
---

## Model Details

  • Developed by: Rafael Espinosa Mena
  • Model type: GPT-2 (124M parameters)
  • Language(s) (NLP): English

## Uses

Pre-trained from scratch on 5 Wikipedia articles and intended as a base for fine-tuning.
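A minimal sketch of loading this checkpoint for fine-tuning with the `transformers` library. The repository id `raf6423/gpt2wikipedia5` is an assumption based on the hosting account and model name; substitute the actual path if it differs:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; adjust if the checkpoint lives elsewhere.
repo_id = "raf6423/gpt2wikipedia5"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# GPT-2 has no pad token by default; reusing EOS is a common choice
# when batching examples during fine-tuning.
tokenizer.pad_token = tokenizer.eos_token

# The model and tokenizer can now be passed to transformers.Trainer
# (or a custom training loop) with a tokenized fine-tuning dataset.
```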

## Training Details

Trained for 200 epochs on 5 Wikipedia articles.
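The 124M architecture referenced above corresponds to the default `GPT2Config` in `transformers`. This sketch only instantiates a randomly initialized model of that size, as one would before pretraining from scratch; the remaining training hyperparameters (batch size, learning rate, etc.) are not documented in this card:

```python
from transformers import GPT2Config, GPT2LMHeadModel

# The default GPT2Config matches the 124M-parameter variant:
# 12 layers, 12 attention heads, 768-dim embeddings, 1024-token context.
config = GPT2Config()
model = GPT2LMHeadModel(config)  # random weights, as at the start of pretraining

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # roughly 124M
```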

## Training Data

https://huggingface.co/datasets/wikipedia

## Hardware

Trained on a single NVIDIA V100 GPU.

## Model Card Authors

Rafael Espinosa Mena (rafaelespinosamena@gmail.com)