MikkoLipsanen committed · Commit 44adac7 · Parent(s): 0a5858c

Update README.md

README.md CHANGED
@@ -89,7 +89,7 @@ in order to avoid the tokenized chunks exceeding the maximum length of 512. Toke
 using the tokenizer for the [bert-base-finnish-cased-v1](https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1)
 model.
 
-The training code with instructions will be available soon [here](https://github.com/DALAI-
+The training code with instructions will be available soon [here](https://github.com/DALAI-project/Train_BERT_NER).
 
 ## Evaluation results
 
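The diff's context line mentions splitting input so that tokenized chunks do not exceed the model's maximum length of 512. A minimal sketch of that kind of chunking in plain Python (the `chunk_tokens` helper and the optional `overlap` parameter are hypothetical illustrations, not the repository's actual training code):

```python
def chunk_tokens(token_ids, max_len=512, overlap=0):
    """Split a list of token ids into chunks of at most max_len tokens.

    An optional overlap lets consecutive chunks share context tokens,
    which some NER pipelines use to avoid cutting entities at chunk
    boundaries.
    """
    step = max_len - overlap
    return [token_ids[i:i + max_len] for i in range(0, len(token_ids), step)]

# Example: 1100 token ids split into chunks of at most 512.
ids = list(range(1100))
chunks = chunk_tokens(ids, max_len=512)
assert all(len(c) <= 512 for c in chunks)  # no chunk exceeds the limit
```

In practice the token ids would come from the BERT tokenizer for the model named above, and each chunk would be padded and fed to the model separately.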