stefan-it committed
Commit 9ae3ae0
1 Parent(s): 442b398

readme: add initial version of model card (#1)


- readme: add initial version of model card (801039d55a69684c50f3a957f26b38688664aa50)

Files changed (1)
  1. README.md +74 -0
README.md ADDED
---
language: de
license: mit
tags:
- flair
- token-classification
- sequence-tagger-model
base_model: dbmdz/bert-base-historic-multilingual-64k-td-cased
widget:
- text: Es war am 25sten , als Lord Corn wollis Dublin mit seinem Gefolge und mehrern
    Truppen verließ , um in einer Central - Lage bey Sligo die Operationen der Armee
    persönlich zu dirigiren . Der Feind dürfte bald in die Enge kommen , da Gen .
    Lacke mit 6000 Mann ihm entgegen marschirt .
---

# Fine-tuned Flair Model on German HIPE-2020 Dataset (HIPE-2022)

This Flair model was fine-tuned on the
[German HIPE-2020](https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-hipe2020.md)
NER dataset using hmBERT 64k as the backbone LM.

The HIPE-2020 dataset comprises newspapers from the mid-19th to the mid-20th century. Further information can be found
[here](https://dl.acm.org/doi/abs/10.1007/978-3-030-58219-7_21).

The following named entities (NEs) were annotated: `loc`, `org`, `pers`, `prod`, `time` and `comp`.
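
The snippet below is a minimal usage sketch with Flair. It loads one of the seed repositories linked in the results table below; the `"ner"` label type is an assumption based on Flair's sequence-tagger conventions, not something stated in this card.

```python
# Minimal usage sketch. Assumptions: the repository id is one of the seed
# models linked in the results table, and the label type is "ner" as is
# conventional for Flair sequence taggers.
from flair.data import Sentence
from flair.models import SequenceTagger

# Load the fine-tuned tagger from the Hugging Face Hub.
tagger = SequenceTagger.load(
    "stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
)

# Tag a sentence taken from the widget example above.
sentence = Sentence(
    "Es war am 25sten , als Lord Corn wollis Dublin mit seinem Gefolge "
    "und mehrern Truppen verließ ."
)
tagger.predict(sentence)

# Print all predicted entity spans.
for span in sentence.get_spans("ner"):
    print(span)
```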

# Results

We performed a hyper-parameter search over the following parameters, with 5 different seeds per configuration:

* Batch Sizes: `[4, 8]`
* Learning Rates: `[3e-05, 5e-05]`

We report the micro F1-score on the development set:
| Configuration     | Seed 1       | Seed 2       | Seed 3           | Seed 4       | Seed 5       | Average         |
|-------------------|--------------|--------------|------------------|--------------|--------------|-----------------|
| `bs8-e10-lr3e-05` | [0.7869][1]  | [0.7909][2]  | [0.7897][3]      | [0.7868][4]  | [0.7836][5]  | 0.7876 ± 0.0028 |
| `bs4-e10-lr3e-05` | [0.7814][6]  | [0.7767][7]  | [0.7783][8]      | [0.7747][9]  | [0.7826][10] | 0.7787 ± 0.0033 |
| `bs8-e10-lr5e-05` | [0.7761][11] | [0.768][12]  | [0.791][13]      | [0.7758][14] | [0.7806][15] | 0.7783 ± 0.0084 |
| `bs4-e10-lr5e-05` | [0.7714][16] | [0.7733][17] | [**0.7723**][18] | [0.7739][19] | [0.7746][20] | 0.7731 ± 0.0013 |

[1]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[2]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[3]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[4]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[5]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[6]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1
[7]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2
[8]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3
[9]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4
[10]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5
[11]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[12]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[13]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[14]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[15]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
[16]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1
[17]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2
[18]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3
[19]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4
[20]: https://hf.co/stefan-it/hmbench-hipe2020-de-hmbert_64k-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5
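
The "Average" column appears consistent with the sample standard deviation (`ddof=1`); the exact aggregation is my assumption, but it reproduces the reported numbers for the first row:

```python
# Quick check of the "Average" column for the first row. The use of the
# sample standard deviation (ddof=1) is an assumption; it matches the
# reported 0.7876 ± 0.0028.
import numpy as np

scores = [0.7869, 0.7909, 0.7897, 0.7868, 0.7836]
print(f"{np.mean(scores):.4f} ± {np.std(scores, ddof=1):.4f}")  # 0.7876 ± 0.0028
```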

The [training log](training.log) and TensorBoard logs (not available for the hmBERT Base model) are also uploaded to the model hub.

More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
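
As a rough illustration, here is a minimal fine-tuning sketch under stated assumptions: it uses Flair's `NER_HIPE_2022` corpus loader and the standard `ModelTrainer.fine_tune` API, with hyper-parameters taken from the best configuration above. It is not the exact hmBench training script.

```python
# Minimal fine-tuning sketch (assumptions: Flair's NER_HIPE_2022 loader covers
# the German HIPE-2020 subset, and the settings mirror the repository names:
# subtoken pooling "first", last layer only, no CRF). Not the exact hmBench script.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Load the German HIPE-2020 corpus as released for HIPE-2022.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")

# hmBERT 64k as backbone, fine-tuned end to end.
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Plain linear tagging head, no CRF and no RNN on top.
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=corpus.make_label_dictionary(label_type="ner"),
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# Best configuration from the table above: bs8-e10-lr3e-05.
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/hipe2020-de",
    learning_rate=3e-05,
    mini_batch_size=8,
    max_epochs=10,
)
```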

# Acknowledgements

We thank [Luisa März](https://github.com/LuisaMaerz), [Katharina Schmid](https://github.com/schmika) and
[Erion Çano](https://github.com/erionc) for their fruitful discussions about Historic Language Models.

Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️