## Model Details
- Developed by: Rafael Espinosa Mena
- Model type: GPT-2 (124M parameters)
- Language(s) (NLP): English
## Uses
Pretrained from scratch on 5 Wikipedia articles; intended as a base model for further fine-tuning.
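Since the card states the model was pretrained from scratch, the architecture can be reproduced locally from the default GPT-2 small configuration. A minimal sketch (no weights are downloaded here; the repository id for the released checkpoint is not given in this card, so only the config is shown):

```python
# Sketch: instantiating the GPT-2 124M architecture from scratch,
# matching how this model was pretrained (config only, no pretrained weights).
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config()  # defaults are GPT-2 small: 12 layers, 12 heads, 768 hidden
model = GPT2LMHeadModel(config)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")  # → "124M parameters"
```

With tied input/output embeddings, this comes to roughly 124.4M parameters, which is where the "124M" in the model type comes from.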
## Training Details
Trained for 200 epochs on 5 Wikipedia articles.
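The card does not specify the optimizer, learning rate, or batch size, so the following is only a hedged sketch of a causal-LM training step of the kind described. A deliberately tiny config and random token ids stand in for the real 124M model and the tokenized Wikipedia articles:

```python
# Sketch of one causal-LM training step. The optimizer, learning rate,
# and batch shape are assumptions: the card only states 200 epochs on
# 5 Wikipedia articles.
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny config for a quick illustration; the real run used the 124M defaults.
config = GPT2Config(n_layer=2, n_head=2, n_embd=128, n_positions=128)
model = GPT2LMHeadModel(config)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

batch = torch.randint(0, config.vocab_size, (2, 64))  # stand-in for tokenized articles
outputs = model(input_ids=batch, labels=batch)  # labels are shifted internally for next-token loss
outputs.loss.backward()
optimizer.step()
print(f"loss: {outputs.loss.item():.2f}")
```

At random initialization the loss is close to ln(50257) ≈ 10.8, the entropy of a uniform guess over the GPT-2 vocabulary; fine-tuning this checkpoint would follow the same loop with real tokenized text.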
### Training Data
https://huggingface.co/datasets/wikipedia
### Hardware
Trained on a single NVIDIA V100 GPU.
## Model Card Authors
Rafael Espinosa Mena (rafaelespinosamena@gmail.com)