stefan-it committed
Commit dcefea0
1 Parent(s): 8debbdc

readme: fix link reference for ByT5 embedding implementation

Files changed (1)
  1. README.md +24 -24
README.md CHANGED
@@ -27,7 +27,7 @@ The following NEs were annotated: `PER`, `LOC` and `ORG`.
 
 # ⚠️ Inference Widget ⚠️
 
- Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][1] class.
+ Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][0] class.
 
 This class needs to be present when running the model with Flair.
 
@@ -35,7 +35,7 @@ Thus, the inference widget is not working with hmByT5 at the moment on the Model
 
 This should be fixed in future, when ByT5 fine-tuning is supported in Flair directly.
 
- [1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
+ [0]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
 
 # Results
 
@@ -53,28 +53,28 @@ And report micro F1-score on development set:
 | bs4-e10-lr0.00015 | [0.7757][11] | [0.7549][12] | [0.7693][13] | [0.7597][14] | [0.7696][15] | 76.58 ± 0.75 |
 | bs4-e10-lr0.00016 | [0.7625][16] | [0.7575][17] | [0.769][18] | [0.7635][19] | [0.7647][20] | 76.34 ± 0.37 |
 
- [1]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
- [2]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
- [3]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
- [4]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
- [5]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
- [6]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
- [7]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
- [8]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
- [9]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
- [10]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
- [11]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
- [12]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
- [13]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
- [14]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
- [15]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
- [16]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
- [17]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
- [18]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
- [19]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
- [20]: https://hf.co/hmbench/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-
- The [training log](training.log) and TensorBoard logs are also uploaded to the model hub.
+ [1]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+ [2]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+ [3]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+ [4]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+ [5]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+ [6]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+ [7]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+ [8]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+ [9]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+ [10]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+ [11]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+ [12]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+ [13]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+ [14]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+ [15]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+ [16]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+ [17]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+ [18]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+ [19]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+ [20]: https://hf.co/stefan-it/hmbench-icdar-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+
+ The [training log](training.log) and TensorBoard logs (only for hmByT5 and hmTEAMS based models) are also uploaded to the model hub.
 
 More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
 
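
A minimal usage sketch of the requirement described in the diff above, assuming `byt5_embeddings.py` from https://github.com/stefan-it/hmBench is on the Python path (the pickled Flair model references the custom embedding class defined there), that Flair can resolve the Hub repo id directly, and with the repo id of reference [1] used purely for illustration:

```python
# Minimal sketch (not part of the commit): loading one of the fine-tuned
# models with Flair. The custom ByT5 embedding class must be importable,
# otherwise unpickling the model fails.
import byt5_embeddings  # noqa: F401  -- from the hmBench repo, must be on the path

from flair.data import Sentence
from flair.models import SequenceTagger

# Repo id taken from reference [1] above; any of the linked runs should behave the same.
tagger = SequenceTagger.load(
    "stefan-it/hmbench-icdar-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1"
)

sentence = Sentence("Le maire de Paris a rencontré M. Dupont .")
tagger.predict(sentence)

# Use the tagger's own label type instead of hardcoding "ner".
for span in sentence.get_spans(tagger.label_type):
    print(span)
```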
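
The final column of the two table rows appears to be the mean ± population standard deviation of the five per-seed development F1-scores; a short sketch of that aggregation, assuming exactly this convention:

```python
# Sketch: reproduce the "76.58 ± 0.75" summary from the five development
# F1-scores of the bs4-e10-lr0.00015 row shown in the diff above.
import numpy as np

dev_f1 = np.array([0.7757, 0.7549, 0.7693, 0.7597, 0.7696])

mean = dev_f1.mean() * 100  # 76.58
std = dev_f1.std() * 100    # 0.75 (population std, ddof=0)

print(f"{mean:.2f} ± {std:.2f}")  # -> 76.58 ± 0.75
```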