stefan-it committed
Commit d198314 · 1 Parent(s): bde9424

readme: add results for German and French HIPE-2020 datasets

Files changed (1)
  1. README.md +15 -12
README.md CHANGED
@@ -26,27 +26,30 @@ More details can be found in [our GitHub repository](https://github.com/stefan-i
  We test our pretrained language models on various datasets from HIPE-2020, HIPE-2022 and Europeana. The following table
  shows an overview of used datasets.
 
- | Language | Datasets
- |----------|----------------------------------------------------|
- | English | [AjMC] - [TopRes19th] |
- | German | [AjMC] - [NewsEye] |
- | French | [AjMC] - [ICDAR-Europeana] - [LeTemps] - [NewsEye] |
- | Finnish | [NewsEye] |
- | Swedish | [NewsEye] |
- | Dutch | [ICDAR-Europeana] |
+
+ | Language | Datasets |
+ |----------|------------------------------------------------------------------|
+ | English | [AjMC] - [TopRes19th] |
+ | German | [AjMC] - [NewsEye] - [HIPE-2020] |
+ | French | [AjMC] - [ICDAR-Europeana] - [LeTemps] - [NewsEye] - [HIPE-2020] |
+ | Finnish | [NewsEye] |
+ | Swedish | [NewsEye] |
+ | Dutch | [ICDAR-Europeana] |
 
  [AjMC]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-ajmc.md
  [NewsEye]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-newseye.md
  [TopRes19th]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-topres19th.md
  [ICDAR-Europeana]: https://github.com/stefan-it/historic-domain-adaptation-icdar
  [LeTemps]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-letemps.md
+ [HIPE-2020]: https://github.com/hipe-eval/HIPE-2022-data/blob/main/documentation/README-hipe2020.md
 
  Results:
 
- | Model | English AjMC | German AjMC | French AjMC | German NewsEye | French NewsEye | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | French LeTemps | English TopRes19th | Avg. |
- |---------------------------------------------------------------------------|--------------|--------------|--------------|----------------|----------------|-----------------|-----------------|--------------|--------------|----------------|--------------------|-----------|
- | hmBERT (32k) [Schweter et al.](https://ceur-ws.org/Vol-3180/paper-87.pdf) | 85.36 ± 0.94 | 89.08 ± 0.09 | 85.10 ± 0.60 | 39.65 ± 1.01 | 81.47 ± 0.36 | 77.28 ± 0.37 | 82.85 ± 0.83 | 82.11 ± 0.61 | 77.21 ± 0.16 | 65.73 ± 0.56 | 80.94 ± 0.86 | 76.98 |
- | hmTEAMS (Ours) | 86.41 ± 0.36 | 88.64 ± 0.42 | 85.41 ± 0.67 | 41.51 ± 2.82 | 83.20 ± 0.79 | 79.27 ± 1.88 | 82.78 ± 0.60 | 88.21 ± 0.39 | 78.03 ± 0.39 | 66.71 ± 0.46 | 81.36 ± 0.59 | **78.32** |
+
+ | Model | English AjMC | German AjMC | French AjMC | German NewsEye | French NewsEye | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | French LeTemps | English TopRes19th | German HIPE-2020 | French HIPE-2020 | Avg. |
+ |---------------------------------------------------------------------------|--------------|--------------|--------------|----------------|----------------|-----------------|-----------------|--------------|--------------|----------------|--------------------|------------------|------------------|-----------|
+ | hmBERT (32k) [Schweter et al.](https://ceur-ws.org/Vol-3180/paper-87.pdf) | 85.36 ± 0.94 | 89.08 ± 0.09 | 85.10 ± 0.60 | 39.65 ± 1.01 | 81.47 ± 0.36 | 77.28 ± 0.37 | 82.85 ± 0.83 | 82.11 ± 0.61 | 77.21 ± 0.16 | 65.73 ± 0.56 | 80.94 ± 0.86 | 79.18 ± 0.38 | 83.47 ± 0.80 | 77.65 |
+ | hmTEAMS (Ours) | 86.41 ± 0.36 | 88.64 ± 0.42 | 85.41 ± 0.67 | 41.51 ± 2.82 | 83.20 ± 0.79 | 79.27 ± 1.88 | 82.78 ± 0.60 | 88.21 ± 0.39 | 78.03 ± 0.39 | 66.71 ± 0.46 | 81.36 ± 0.59 | 80.15 ± 0.60 | 86.07 ± 0.49 | **79.06** |
 
  # Acknowledgements
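
As a quick sanity check on the Avg. column in the updated results table: it matches a plain (unweighted) mean over the 13 per-dataset F1 scores. A minimal sketch, with the scores copied from the table above; the two-decimal rounding convention is an assumption:

```python
# Recompute the "Avg." column of the results table as the unweighted mean
# of the 13 per-dataset F1 scores (mean values only, ± std omitted).
# Scores are copied from the updated results table above.

hmbert = [85.36, 89.08, 85.10, 39.65, 81.47, 77.28, 82.85,
          82.11, 77.21, 65.73, 80.94, 79.18, 83.47]
hmteams = [86.41, 88.64, 85.41, 41.51, 83.20, 79.27, 82.78,
           88.21, 78.03, 66.71, 81.36, 80.15, 86.07]

for name, scores in [("hmBERT (32k)", hmbert), ("hmTEAMS", hmteams)]:
    print(f"{name}: {sum(scores) / len(scores):.2f}")
# hmBERT (32k): 77.65
# hmTEAMS: 79.06
```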