---
datasets:
- wikitext
language:
- en
metrics:
- perplexity
---

## Model Details
GPT-2 pretrained on WikiText-103 (180M sentences) on a 32GB V100 GPU for approximately 110,000 (1.10 lakh) iterations.

Validation loss vs. training loss:
![Loss curve](https://huggingface.co/himanshubeniwal/gpt2-wikitext103/resolve/main/sample.png)
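
The model can be loaded like any standard GPT-2 checkpoint. A minimal usage sketch, assuming the weights live under the repo ID `himanshubeniwal/gpt2-wikitext103` (taken from the loss-curve URL above):

```python
# Minimal usage sketch; assumes himanshubeniwal/gpt2-wikitext103
# (the repo ID from the loss-curve link) hosts standard GPT-2 weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "himanshubeniwal/gpt2-wikitext103"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The history of natural language processing", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```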


### Model Description

Perplexity: 22.87
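
Perplexity here is the exponential of the mean token-level cross-entropy. A sketch of how a comparable number could be reproduced on the WikiText-103 validation split; the chunking and averaging choices below are assumptions, not necessarily the evaluation behind the 22.87 figure:

```python
# Perplexity sketch; the window size and equal per-chunk weighting are
# assumptions, so the result may differ slightly from the reported 22.87.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "himanshubeniwal/gpt2-wikitext103"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).eval()

val = load_dataset("wikitext", "wikitext-103-raw-v1", split="validation")
text = "\n\n".join(val["text"])
ids = tokenizer(text, return_tensors="pt").input_ids  # long-sequence warning is expected

max_len, losses = model.config.n_positions, []
with torch.no_grad():
    for start in range(0, ids.size(1) - 1, max_len):
        chunk = ids[:, start : start + max_len]
        out = model(chunk, labels=chunk)  # labels=input_ids gives shifted cross-entropy
        losses.append(out.loss)

# Approximate: chunks are weighted equally rather than by token count.
print("perplexity:", torch.exp(torch.stack(losses).mean()).item())
```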


### Out-of-Scope Use

This is only a test model; please do not expect strong results.