---
language: nl
license: mit
tags:
- flair
- token-classification
- sequence-tagger-model
base_model: dbmdz/bert-base-historic-multilingual-cased
widget:
- text: Professoren der Geneeskun dige Faculteit te Groningen alsook van de HH , Doctoren
en Chirurgijns van Groningen , Friesland , Noordholland , Overijssel , Gelderland
, Drenthe , in welke Provinciën dit Elixir als Medicament voor Mond en Tanden
reeds jaren bakend is .
---
# Fine-tuned Flair Model on Dutch ICDAR-Europeana NER Dataset
This Flair model was fine-tuned on the
[Dutch ICDAR-Europeana](https://github.com/stefan-it/historic-domain-adaptation-icdar)
NER Dataset using hmBERT as the backbone language model.
The ICDAR-Europeana NER Dataset is a preprocessed variant of the
[Europeana NER Corpora](https://github.com/EuropeanaNewspapers/ner-corpora) for Dutch and French.
The following named entities are annotated: `PER`, `LOC` and `ORG`.
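The model can be used directly with the [Flair](https://github.com/flairNLP/flair) library. A minimal sketch, assuming you load one of the fine-tuned per-run models linked in the Results section below (the repository id shown is just one example run, not a canonical "best" model):

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load one of the fine-tuned models from the model hub.
# This id is one of the per-run models linked below; pick the
# configuration/run you prefer.
tagger = SequenceTagger.load(
    "hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
)

# Tag a (historic) Dutch sentence.
sentence = Sentence("Professoren der Geneeskundige Faculteit te Groningen .")
tagger.predict(sentence)

# Print the recognized PER/LOC/ORG spans.
for entity in sentence.get_spans("ner"):
    print(entity)
```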
# Results
We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration:
* Batch Sizes: `[8, 4]`
* Learning Rates: `[3e-05, 5e-05]`
We report the micro F1-score on the development set; the Avg. column gives the mean ± standard deviation over the five runs (see the sketch after the result links):
| Configuration | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Avg. |
|-----------------|--------------|--------------|--------------|--------------|--------------|--------------|
| bs8-e10-lr5e-05 | [0.8191][1]  | [0.8086][2]  | [0.8237][3]  | [0.8318][4]  | [0.8235][5]  | 0.8213 ± 0.0076 |
| bs8-e10-lr3e-05 | [0.8056][6]  | [0.8183][7]  | [0.8241][8]  | [0.8431][9]  | [0.8155][10] | 0.8213 ± 0.0124 |
| bs4-e10-lr5e-05 | [0.8055][11] | [0.8220][12] | [0.8243][13] | [0.8093][14] | [0.8144][15] | 0.8151 ± 0.0072 |
| bs4-e10-lr3e-05 | [0.8039][16] | [0.8122][17] | [0.8073][18] | [0.8246][19] | [0.8132][20] | 0.8122 ± 0.0070 |
[1]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[2]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[3]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[4]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[5]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[6]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[7]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[8]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[9]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[10]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[11]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[12]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[13]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[14]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[15]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[16]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[17]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[18]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[19]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[20]: https://hf.co/hmbench/hmbench-icdar-nl-hmbert-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
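For reference, the Avg. column can be recomputed from the per-run scores. A minimal sketch, assuming the population standard deviation (NumPy's default, `ddof=0`); small rounding differences against the table are possible:

```python
import numpy as np

# Development F1-scores of the five runs for the bs8-e10-lr5e-05
# configuration, taken from the table above.
scores = np.array([0.8191, 0.8086, 0.8237, 0.8318, 0.8235])

mean = scores.mean()
std = scores.std()  # population standard deviation (ddof=0)

print(f"{mean:.4f} ± {std:.4f}")  # -> 0.8213 ± 0.0076
```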
The [training log](training.log) and TensorBoard logs (available only for hmByT5- and hmTEAMS-based models) are also uploaded to the model hub.
More information about the fine-tuning procedure can be found in the [hmBench repository](https://github.com/stefan-it/hmBench).
# Acknowledgements
We thank [Luisa März](https://github.com/LuisaMaerz), [Katharina Schmid](https://github.com/schmika) and
[Erion Çano](https://github.com/erionc) for their fruitful discussions about Historic Language Models.
Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️