stefan-it committed on
Commit cd45896
1 Parent(s): 3d6156f

readme: fix link reference for ByT5 embedding implementation

Files changed (1)
  1. README.md +24 -24
README.md CHANGED
@@ -27,7 +27,7 @@ The following NEs were annotated: `PER`, `LOC` and `ORG`.
 
  # ⚠️ Inference Widget ⚠️
 
- Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][1] class.
+ Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][0] class.
 
  This class needs to be present when running the model with Flair.
 
@@ -35,7 +35,7 @@ Thus, the inference widget is not working with hmByT5 at the moment on the Model
 
  This should be fixed in future, when ByT5 fine-tuning is supported in Flair directly.
 
- [1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
+ [0]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
 
  # Results
 
@@ -53,28 +53,28 @@ And report micro F1-score on development set:
  | bs4-e10-lr0.00016 | [0.8516][11] | [0.8654][12] | [0.87][13] | [0.8642][14] | [0.8541][15] | 86.11 ± 0.7 |
  | bs4-e10-lr0.00015 | [0.8599][16] | [0.8649][17] | [0.8652][18] | [0.8482][19] | [0.854][20] | 85.84 ± 0.65 |
 
- [1]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
- [2]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
- [3]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
- [4]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
- [5]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
- [6]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
- [7]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
- [8]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
- [9]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
- [10]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
- [11]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
- [12]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
- [13]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
- [14]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
- [15]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
- [16]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
- [17]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
- [18]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
- [19]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
- [20]: https://hf.co/hmbench/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-
- The [training log](training.log) and TensorBoard logs are also uploaded to the model hub.
+ [1]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+ [2]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+ [3]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+ [4]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+ [5]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+ [6]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+ [7]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+ [8]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+ [9]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+ [10]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+ [11]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+ [12]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+ [13]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+ [14]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+ [15]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+ [16]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+ [17]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+ [18]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+ [19]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+ [20]: https://hf.co/stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+
+ The [training log](training.log) and TensorBoard logs (only for hmByT5 and hmTEAMS based models) are also uploaded to the model hub.
 
  More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
 
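
The README notes that running these checkpoints with Flair requires the custom `ByT5Embedding` class from the linked `byt5_embeddings.py`, since the class must be importable when the pickled model is loaded. A minimal sketch of what that might look like, assuming `byt5_embeddings.py` from the hmBench repository is on the Python path and using one of the reference-link repo ids above as an example:

```python
# Sketch only: assumes byt5_embeddings.py from https://github.com/stefan-it/hmBench
# is importable, so the custom ByT5Embedding class can be resolved when the
# fine-tuned Flair model is unpickled.
from byt5_embeddings import ByT5Embedding  # noqa: F401  (class name as given in the README)

from flair.data import Sentence
from flair.models import SequenceTagger

# Example repo id taken from the reference links above (bs4-e10-lr0.00015, run 1).
tagger = SequenceTagger.load(
    "stefan-it/hmbench-icdar-nl-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1"
)

sentence = Sentence("Amsterdam is de hoofdstad van Nederland .")
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)
```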
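
The last column of the results table ("86.11 ± 0.7", "85.84 ± 0.65") is the mean over the five per-seed development F1-scores; the reported deviations appear consistent with a population standard deviation, e.g.:

```python
import statistics

# Five development-set F1-scores for bs4-e10-lr0.00015 (from the table above).
scores = [0.8599, 0.8649, 0.8652, 0.8482, 0.854]

mean = statistics.mean(scores) * 100    # 85.84
std = statistics.pstdev(scores) * 100   # 0.65 (population standard deviation)

print(f"{mean:.2f} ± {std:.2f}")        # -> 85.84 ± 0.65
```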