readme: update results section
README.md (CHANGED)
@@ -21,12 +21,14 @@ Details about the training can be found [here](https://github.com/stefan-it/hmBy
 
 We evaluated the hmByT5 model on the ICDAR Europeana dataset:
 
-| Configuration                            | Run 1 |
-|------------------------------------------|-------|
-| `wsFalse-bs4-e10-lr0.00015-poolingfirst` |
-| `wsFalse-bs8-e10-lr0.
-| `wsFalse-bs8-e10-lr0.
-| `wsFalse-bs4-e10-lr0.00016-poolingfirst` |
+| Configuration                            | Run 1 | Avg.        |
+|------------------------------------------|-------|-------------|
+| `wsFalse-bs4-e10-lr0.00015-poolingfirst` | 87.63 | 87.63 ± 0.0 |
+| `wsFalse-bs8-e10-lr0.00016-poolingfirst` | 87.35 | 87.35 ± 0.0 |
+| `wsFalse-bs8-e10-lr0.00015-poolingfirst` | 87.26 | 87.26 ± 0.0 |
+| `wsFalse-bs4-e10-lr0.00016-poolingfirst` | 86.31 | 86.31 ± 0.0 |
+
+We only performed fine-tuning for one epoch. Unfortunately, this ByT5 Base model shows no improvement over the ByT5 Small architecture.
 
 # Acknowledgements
 
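The configuration names appear to encode fine-tuning hyperparameters (`bs` = mini-batch size, `e` = epochs, `lr` = learning rate, `poolingfirst` = first-subtoken pooling; the `ws` field is not decoded here). As a rough, untested sketch only: the following shows how such a run could look, assuming a Flair-based NER setup. The hub id `hmbyt5/byt5-base-historic-multilingual`, the output path, and the mapping of the config-name fields to these arguments are assumptions, not confirmed by this commit.

```python
# Hypothetical sketch: fine-tune a hmByT5 checkpoint on ICDAR Europeana with Flair.
# Hub id, output path, and hyperparameter mapping are assumptions.
from flair.datasets import NER_ICDAR_EUROPEANA
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# ICDAR Europeana NER corpus (Dutch split; a French split is also available).
corpus = NER_ICDAR_EUROPEANA(language="nl")
label_dict = corpus.make_label_dictionary(label_type="ner")

# "poolingfirst" in the config name suggests first-subtoken pooling.
embeddings = TransformerWordEmbeddings(
    model="hmbyt5/byt5-base-historic-multilingual",  # placeholder hub id
    subtoken_pooling="first",
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
)

# "bs4-e10-lr0.00015" would map to these trainer arguments.
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/taggers/icdar-europeana-hmbyt5",
    learning_rate=0.00015,
    mini_batch_size=4,
    max_epochs=10,
)
```

Disabling the CRF and RNN layers, as in this sketch, probes the transformer representations directly; whether the reported runs did the same is not stated in this commit.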