Update model description
README.md
CHANGED
@@ -19,7 +19,9 @@ It achieves the following results on the evaluation set:

## Model description

-
+GPT2 is a large transformer-based language model built from transformer decoder blocks, whereas BERT uses transformer encoder blocks. GPT2 moves layer normalisation to the input of each sub-block, similar to a pre-activation residual network, and adds an additional layer normalisation after the final self-attention block.
+
+Paper link: [Language Models are Unsupervised Multitask Learners](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)

## Intended uses & limitations

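Below is a minimal sketch, not GPT2's actual implementation, of the pre-layer-norm decoder block the added description refers to. It assumes PyTorch; the class name, default dimensions (matching the smallest GPT2 checkpoint), and mask handling are illustrative only.

```python
import torch
import torch.nn as nn

class PreLNDecoderBlock(nn.Module):
    """Sketch of a GPT2-style decoder block: layer norm is applied to the
    *input* of each sub-block (pre-activation), not after it."""

    def __init__(self, d_model: int = 768, n_heads: int = 12):
        super().__init__()
        self.ln_1 = nn.LayerNorm(d_model)   # norm before self-attention
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln_2 = nn.LayerNorm(d_model)   # norm before the feed-forward
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor, causal_mask: torch.Tensor) -> torch.Tensor:
        # Pre-activation residual: normalise, transform, then add back.
        h = self.ln_1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=causal_mask, need_weights=False)
        x = x + attn_out
        x = x + self.mlp(self.ln_2(x))
        return x

if __name__ == "__main__":
    block = PreLNDecoderBlock()
    x = torch.randn(1, 16, 768)
    # Boolean mask where True marks positions a token may NOT attend to (future tokens).
    causal = torch.triu(torch.ones(16, 16, dtype=torch.bool), diagonal=1)
    print(block(x, causal).shape)  # torch.Size([1, 16, 768])
```

Placing the normalisation before each sub-block, as in pre-activation residual networks, tends to stabilise training of deep decoder stacks compared with the post-layer-norm layout of the original Transformer.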