Commit c152461
Parent(s): 379a11e
Update README.md

README.md CHANGED
@@ -89,7 +89,7 @@ in order to avoid the tokenized chunks exceeding the maximum length of 512. Toke
 using the tokenizer for the [bert-base-finnish-cased-v1](https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1)
 model.
 
-The training code with instructions
+The training code with instructions is available in [GitHub](https://github.com/DALAI-project/Train_BERT_NER).
 
 ## Evaluation results
 
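The hunk's context line refers to splitting the input so that tokenized chunks stay under the 512-token limit of BERT models. A minimal sketch of such greedy chunking, using a stand-in one-token-per-word tokenizer as an assumption in place of the real bert-base-finnish-cased-v1 tokenizer (which would come from Hugging Face `transformers` and produce subword tokens):

```python
def chunk_text(text, tokenize, max_tokens=512):
    """Greedily pack whitespace-separated words into chunks whose
    total token count (per `tokenize`) stays <= max_tokens."""
    chunks, current, current_len = [], [], 0
    for word in text.split():
        n = len(tokenize(word))
        if current and current_len + n > max_tokens:
            chunks.append(" ".join(current))
            current, current_len = [], 0
        current.append(word)
        current_len += n
    if current:
        chunks.append(" ".join(current))
    return chunks

# Stand-in tokenizer: one token per word. A real BERT tokenizer
# (e.g. transformers.AutoTokenizer.from_pretrained(...)) would
# return subword tokens, so counts per word would be >= 1.
toy_tokenize = lambda w: [w]

text = " ".join(f"word{i}" for i in range(1200))
chunks = chunk_text(text, toy_tokenize, max_tokens=512)
print(len(chunks))  # 1200 words at <=512 per chunk -> 3 chunks
```

With a real subword tokenizer the per-word token counts grow, so the same loop simply produces more, shorter chunks; the limit check itself is unchanged.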