Update README.md

README.md

# What is this?

A GPT-2 model (medium version, ~354M parameters) for Danish text generation. The model was not pre-trained from scratch but adapted from the English version using [CLP-Transfer](https://arxiv.org/abs/2301.09626).

# How to use
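
The usage example itself falls outside the changed hunks, but the diff context shows the model being loaded with `AutoModelForCausalLM`. A minimal generation sketch along those lines, assuming the standard `transformers` API (the Danish prompt and sampling settings are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the Danish GPT-2 medium model and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("KennethTM/gpt2-medium-danish")
model = AutoModelForCausalLM.from_pretrained("KennethTM/gpt2-medium-danish")

# Generate a continuation for a Danish prompt ("Once upon a time").
inputs = tokenizer("Der var engang", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```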

# Model training

The model was trained on the Danish part of the [oscar dataset](https://huggingface.co/datasets/oscar) ('unshuffled_deduplicated_da') with a context length of 1024 tokens.
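
A sketch of how samples with that context length could be prepared, using the standard `datasets`/`transformers` APIs; the concatenate-and-chunk scheme is an assumption, not the author's training script:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Danish part of the oscar dataset, as named above.
dataset = load_dataset("oscar", "unshuffled_deduplicated_da", split="train")
tokenizer = AutoTokenizer.from_pretrained("KennethTM/gpt2-medium-danish")

block_size = 1024  # context length stated above

def tokenize_and_chunk(batch):
    # Tokenize each document, join them with an EOS separator,
    # then split the stream into fixed 1024-token blocks.
    ids = tokenizer(batch["text"])["input_ids"]
    flat = [tok for doc in ids for tok in doc + [tokenizer.eos_token_id]]
    n = (len(flat) // block_size) * block_size
    return {"input_ids": [flat[i : i + block_size] for i in range(0, n, block_size)]}

train_data = dataset.map(
    tokenize_and_chunk, batched=True, remove_columns=dataset.column_names
)
```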

The model weights are initialized from the English [GPT-2 medium model](https://huggingface.co/gpt2-medium) ('source model'), with new word token embeddings created from the Danish [GPT-2 small model](https://huggingface.co/KennethTM/gpt2-small-danish) ('helper model') using the [CLP-Transfer method](https://github.com/malteos/clp-transfer).
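
In outline, CLP-Transfer copies the source embedding for every token the two vocabularies share, and builds each remaining Danish token's embedding as a similarity-weighted combination of those shared embeddings, with similarities measured in the helper model's embedding space. A simplified NumPy sketch of that idea (the weighting details here are assumptions; the linked repository has the reference implementation):

```python
import numpy as np

def clp_transfer_embeddings(source_emb, source_vocab, helper_emb, target_vocab):
    """Sketch of CLP-Transfer embedding initialization.

    source_emb   : (|V_source|, d) embeddings of the English source model
    source_vocab : token -> row index into source_emb
    helper_emb   : (|V_target|, d_h) embeddings of the Danish helper model
    target_vocab : token -> row index into helper_emb (target tokenizer)
    """
    target_emb = np.zeros((len(target_vocab), source_emb.shape[1]),
                          dtype=source_emb.dtype)

    # Tokens present in both vocabularies keep their source embedding.
    shared = [t for t in target_vocab if t in source_vocab]
    src_rows = np.array([source_vocab[t] for t in shared])
    tgt_rows = np.array([target_vocab[t] for t in shared])
    target_emb[tgt_rows] = source_emb[src_rows]

    # Unit-normalize helper embeddings for cosine similarity.
    helper_unit = helper_emb / np.linalg.norm(helper_emb, axis=1, keepdims=True)
    shared_helper = helper_unit[tgt_rows]

    for token, row in target_vocab.items():
        if token in source_vocab:
            continue
        # Similarity of the new token to every shared token, in helper space.
        sims = shared_helper @ helper_unit[row]
        weights = np.maximum(sims, 0.0)
        if weights.sum() == 0.0:
            weights = np.ones_like(weights)  # fallback: uniform mix
        weights /= weights.sum()
        # New embedding: similarity-weighted mix of source embeddings.
        target_emb[row] = weights @ source_emb[src_rows]
    return target_emb
```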

The model was trained on ~1,000,000 samples.

For reference, the model achieves a perplexity of 24.7 on 5,000 random validation samples.
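
Perplexity here is the exponentiated mean token-level cross-entropy. A sketch of how such a figure could be computed, assuming the evaluation simply averages the causal LM loss over held-out texts (the exact protocol is not stated in the README):

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("KennethTM/gpt2-medium-danish")
model = AutoModelForCausalLM.from_pretrained("KennethTM/gpt2-medium-danish")
model.eval()

@torch.no_grad()
def perplexity(texts, max_length=1024):
    # Token-weighted mean negative log-likelihood, exponentiated.
    total_nll, total_tokens = 0.0, 0
    for text in texts:
        ids = tokenizer(text, return_tensors="pt",
                        truncation=True, max_length=max_length).input_ids
        if ids.size(1) < 2:
            continue  # need at least one predicted token
        loss = model(ids, labels=ids).loss  # mean NLL per predicted token
        total_nll += loss.item() * (ids.size(1) - 1)
        total_tokens += ids.size(1) - 1
    return math.exp(total_nll / total_tokens)
```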