Sussybaka committed
Commit 26744b4
1 Parent(s): 989a819

Update README.md

Files changed (1):
  1. README.md (+1 -1)
README.md CHANGED
@@ -26,7 +26,7 @@ co2_eq_emissions: 149200 g
 
 # DistilGPT2
 
-DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained with the supervision of the smallest version of Generative Pre-trained Transformer 2 (GPT-2). Like GPT-2, DistilGPT2 can be used to generate text. Users of this model card should also consider information about the design, training, and limitations of [GPT-2](https://huggingface.co/gpt2).
+DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained with the supervision of the smallest version of Generative Pre-trained Transformer 2 (GPT-2). Like GPT-2, DistilGPT2 can be used to generate text. Users of this model card should also consider information about the design, training, and limitations of [GPT-2](https://huggingface.co/gpt2). This is a Wilkins-ified version.
 
 ## Model Details
 
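The card states that DistilGPT2, like GPT-2, can be used to generate text. A minimal sketch of what that looks like with the Hugging Face `transformers` pipeline (assuming the Hub model id `distilgpt2`; the prompt, seed, and generation settings below are illustrative only):

```python
# Minimal sketch: generating text with DistilGPT2 via the transformers pipeline.
# Assumes `pip install transformers` plus a framework backend (e.g. PyTorch).
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="distilgpt2")
set_seed(42)  # illustrative seed, for reproducible sampling

# Sample three continuations of an example prompt.
outputs = generator("Hello, I'm a language model,", max_length=30, num_return_sequences=3)
for out in outputs:
    print(out["generated_text"])
```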