jannikskytt committed
Commit 2a281e6 • 1 Parent(s): ea8dea5
Update README.md
README.md
CHANGED
@@ -12,7 +12,7 @@ MeDa-We was trained on a Danish medical corpus of 123M tokens. The word embeddin
 
 The embeddings were trained for 10 epochs using a window size of 5 and 10 negative samples.
 
-The development of the corpus and word embeddings is described further in our [paper](https://
+The development of the corpus and word embeddings is described further in our [paper](https://aclanthology.org/2023.nodalida-1.31/).
 
 We also trained a transformer model on the developed corpus which can be found [here](https://huggingface.co/jannikskytt/MeDa-Bert).
 
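
For context, the hyperparameters mentioned in the changed README section (10 epochs, window size 5, 10 negative samples) describe a word2vec-style negative-sampling setup. Below is a minimal sketch of such a run using gensim; the diff does not state which toolkit was actually used, so gensim itself, the skip-gram choice, the corpus path, the vector size, and the minimum-count cutoff are all assumptions for illustration only.

```python
# Hypothetical sketch of the training configuration described in the README.
# Only window=5, negative=10, and epochs=10 come from the source; everything
# else (toolkit, sg=1, corpus path, vector_size, min_count) is assumed.
from gensim.models import Word2Vec
from gensim.models.word2vec import LineSentence

# One pre-tokenized sentence per line; placeholder path, not the real corpus.
sentences = LineSentence("danish_medical_corpus.txt")

model = Word2Vec(
    sentences=sentences,
    sg=1,             # skip-gram (assumed; CBOW is also possible)
    window=5,         # window size stated in the README
    negative=10,      # negative samples stated in the README
    epochs=10,        # training epochs stated in the README
    vector_size=300,  # assumed dimensionality, not given in this diff
    min_count=5,      # assumed frequency cutoff
    workers=4,
)

# Save in the common word2vec text format for downstream use.
model.wv.save_word2vec_format("meda_we.vec")
```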