julien-c HF staff committed on
Commit
9851938
1 Parent(s): 58ada35

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/distilgpt2-README.md

Files changed (1)
  1. README.md +21 -0
README.md ADDED
@@ -0,0 +1,21 @@
+ ---
+ language: en
+ tags:
+ - exbert
+
+ license: apache-2.0
+ datasets:
+ - openwebtext
+ ---
+
+ # DistilGPT2
+
+ DistilGPT2 is an English-language model pretrained with the supervision of [GPT2](https://huggingface.co/gpt2) (the smallest version of GPT2) on [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), a reproduction of OpenAI's WebText dataset. The model has 6 layers, a hidden dimension of 768, and 12 heads, totaling 82M parameters (compared to 124M for GPT2). On average, DistilGPT2 is twice as fast as GPT2.
+
+ On the [WikiText-103](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/) benchmark, GPT2 reaches a test-set perplexity of 16.3, compared to 21.1 for DistilGPT2 (after fine-tuning on the train set).
+
+ We encourage you to check out [GPT2](https://huggingface.co/gpt2) to learn more about usage, limitations and potential biases.
+
+ <a href="https://huggingface.co/exbert/?model=distilgpt2">
+ <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
+ </a>
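
The card quotes the distilled architecture (6 layers, hidden dimension 768, 12 heads) and defers to the GPT2 card for usage. As a minimal sketch, assuming `transformers` and a PyTorch backend are installed, those figures can be read off the model config and text generated with the standard `text-generation` pipeline; the prompt and generation parameters here are illustrative, not from the card:

```python
from transformers import AutoConfig, pipeline

# Check the architecture numbers quoted in the card:
# 6 layers, hidden dimension 768, 12 attention heads.
config = AutoConfig.from_pretrained("distilgpt2")
print(config.n_layer, config.n_embd, config.n_head)  # 6 768 12

# Generate text with the distilled model.
generator = pipeline("text-generation", model="distilgpt2")
print(generator("Hello, I'm a language model,", max_length=30, num_return_sequences=1))
```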
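The WikiText-103 figures above are test-set perplexities, i.e. the exponential of the model's average cross-entropy loss. A hedged sketch of that computation on a toy string (the benchmark result additionally involves fine-tuning on the train set and evaluating over the full test set, which this sketch does not do):

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")
model.eval()

text = "The quick brown fox jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels equal to input_ids, the model returns the mean
    # cross-entropy loss over the predicted tokens.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"perplexity: {math.exp(loss.item()):.1f}")
```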