Dongsung committed on
Commit c6b7fad
1 Parent(s): adfad62

Update model description

Files changed (1)
  1. README.md +3 -2
README.md CHANGED
@@ -1,5 +1,4 @@
 ---
-license:
 tags:
 - generated_from_trainer
 datasets:
@@ -20,7 +19,9 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-More information needed
+GPT2 is a large transformer-based language model built from transformer decoder blocks (BERT, by contrast, uses transformer encoder blocks). It moves layer normalisation to the input of each sub-block, similar to pre-activation residual networks, and applies one additional layer normalisation after the final block.
+
+Paper link: [Language Models are Unsupervised Multitask Learners](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)
 
 ## Intended uses & limitations
 
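The pre-LN ordering that the new model description refers to can be sketched in a few lines. This is a minimal, illustrative sketch only, not code from any GPT-2 implementation: `layer_norm`, `pre_ln_sublock`, and the toy `double` sub-layer are all hypothetical names, and the stand-in sub-layer replaces real self-attention/feed-forward blocks.

```python
# Sketch of pre-LN residual ordering: layer normalisation is applied to
# the *input* of each sub-block, and the sub-block output is added back
# through a residual connection. Names here are illustrative only.

def layer_norm(x, eps=1e-5):
    """Normalise a vector to zero mean and (near) unit variance."""
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / (var + eps) ** 0.5 for v in x]

def pre_ln_sublock(x, sublayer):
    """Pre-LN residual sub-block: x + sublayer(layer_norm(x))."""
    normed = layer_norm(x)
    return [a + b for a, b in zip(x, sublayer(normed))]

# Toy "sub-layer" standing in for self-attention or the MLP.
double = lambda h: [2.0 * v for v in h]

out = pre_ln_sublock([1.0, 2.0, 3.0], double)
print(out)
```

The post-LN (original Transformer) ordering would instead be `layer_norm(x + sublayer(x))`; moving the normalisation inside the residual branch, as above, is what makes the residual path an identity map, mirroring pre-activation residual networks.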