---
library_name: transformers
tags: []
---



## Model Details

- **Developed by:** Rafael Espinosa Mena
- **Model type:** GPT-2 (124M parameters)
- **Language(s) (NLP):** English
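The 124M-parameter GPT-2 architecture can be instantiated from scratch with the `transformers` library. The sketch below assumes the default `GPT2Config` hyperparameters (12 layers, 12 heads, 768 hidden size), which correspond to the 124M variant; the exact configuration used for this checkpoint is not published here.

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Default GPT2Config matches the 124M GPT-2 variant
# (12 layers, 12 attention heads, 768-dim embeddings).
config = GPT2Config()
model = GPT2LMHeadModel(config)

# Count trainable parameters (embeddings are tied with the LM head).
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```

This builds a randomly initialized model; to use this checkpoint's weights, load it from the Hub with `GPT2LMHeadModel.from_pretrained(...)` instead.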

## Uses

Pre-trained from scratch on 200 Wikipedia articles and intended for fine-tuning.


## Training Details

Trained for 200 epochs on 200 Wikipedia articles with a learning rate of 3e-5, using a sliding-window approach with 1024 tokens per chunk and a 200-token window.
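The sliding-window chunking described above can be sketched as follows. This is an illustrative reconstruction, not the original training script: it assumes the 200-token window means the overlap between consecutive 1024-token chunks.

```python
def sliding_window_chunks(token_ids, chunk_size=1024, window=200):
    """Split a token sequence into overlapping chunks.

    Each chunk holds up to `chunk_size` tokens, and consecutive
    chunks overlap by `window` tokens (stride = chunk_size - window).
    Parameter names are illustrative; the original training code
    is not published here.
    """
    stride = chunk_size - window
    chunks = []
    for start in range(0, len(token_ids), stride):
        chunks.append(token_ids[start:start + chunk_size])
        if start + chunk_size >= len(token_ids):
            break
    return chunks


# Example: a 3000-token document yields chunks starting at 0, 824, 1648, ...
chunks = sliding_window_chunks(list(range(3000)))
```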

### Training Data

[wikipedia](https://huggingface.co/datasets/wikipedia) dataset on the Hugging Face Hub.

#### Hardware

Trained on a single NVIDIA V100 GPU.

## Model Card Authors

Rafael Espinosa Mena (rafaelespinosamena@gmail.com)