stefan-it committed on
Commit
797b4ef
1 Parent(s): b66a65a

readme: fix link reference for ByT5 embedding implementation

Files changed (1)
  1. README.md +25 -25
README.md CHANGED
@@ -25,7 +25,7 @@ The following NEs were annotated: `pers`, `work`, `loc`, `object`, `date` and `s
 
 # ⚠️ Inference Widget ⚠️
 
-Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][1] class.
+Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][0] class.
 
 This class needs to be present when running the model with Flair.
 
@@ -33,8 +33,8 @@ Thus, the inference widget is not working with hmByT5 at the moment on the Model
 
 This should be fixed in future, when ByT5 fine-tuning is supported in Flair directly.
 
-[1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
-
+[0]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
+
 # Results
 
 We performed a hyper-parameter search over the following parameters with 5 different seeds per configuration:
@@ -51,28 +51,28 @@ And report micro F1-score on development set:
 | bs8-e10-lr0.00015 | [0.8172][11] | [0.8242][12] | [0.8217][13] | [0.8367][14] | [0.8323][15] | 82.64 ± 0.71 |
 | bs8-e10-lr0.00016 | [0.8178][16] | [0.8205][17] | [0.8126][18] | [0.8339][19] | [0.8264][20] | 82.22 ± 0.73 |
 
-[1]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
-[2]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
-[3]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
-[4]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
-[5]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-[6]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
-[7]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
-[8]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
-[9]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
-[10]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-[11]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
-[12]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
-[13]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
-[14]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
-[15]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-[16]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
-[17]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
-[18]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
-[19]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
-[20]: https://hf.co/hmbench/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-
-The [training log](training.log) and TensorBoard logs are also uploaded to the model hub.
+[1]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+[2]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+[3]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+[4]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+[5]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+[6]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+[7]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+[8]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+[9]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+[10]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+[11]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+[12]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+[13]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+[14]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+[15]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+[16]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+[17]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+[18]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+[19]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+[20]: https://hf.co/stefan-it/hmbench-ajmc-en-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+
+The [training log](training.log) and TensorBoard logs (only for hmByT5 and hmTEAMS based models) are also uploaded to the model hub.
 
 More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
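As a sanity check on the results table, the aggregated "Avg." column can be recomputed from the per-seed development F1-scores. The reported values correspond to the population standard deviation (ddof = 0); this is an inference from the numbers, not something the README states explicitly:

```python
# Recompute the "mean ± std" column of the results table from the per-seed
# development F1-scores listed in the README. The reported std matches the
# population standard deviation (pstdev, ddof=0), not the sample std.
from statistics import mean, pstdev

scores = {
    "bs8-e10-lr0.00015": [0.8172, 0.8242, 0.8217, 0.8367, 0.8323],
    "bs8-e10-lr0.00016": [0.8178, 0.8205, 0.8126, 0.8339, 0.8264],
}

for config, f1 in scores.items():
    # Scale to percent and round to two decimals, as in the table.
    print(f"{config}: {mean(f1) * 100:.2f} ± {pstdev(f1) * 100:.2f}")
```

Running this reproduces the table's 82.64 ± 0.71 and 82.22 ± 0.73.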
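The 20 model links above follow a fixed repository naming scheme that encodes the hyper-parameter grid (batch size, learning rate) plus the seed, which is what this commit repoints from the `hmbench` org to `stefan-it`. A small sketch (the helper name is illustrative, not part of the repo) that regenerates the ids:

```python
# Regenerate the 20 model-repo ids referenced in the link table. The fixed
# parts (wsFalse, e10, poolingfirst, layers-1, crfFalse) match the
# configurations reported in the README; prepend "https://hf.co/" for the URL.
batch_sizes = [4, 8]
learning_rates = ["0.00015", "0.00016"]  # kept as strings to match the ids
seeds = [1, 2, 3, 4, 5]

def repo_id(bs: int, lr: str, seed: int) -> str:
    # Illustrative helper, not part of hmBench itself.
    return (
        f"stefan-it/hmbench-ajmc-en-hmbyt5-bs{bs}-wsFalse-e10-lr{lr}"
        f"-poolingfirst-layers-1-crfFalse-{seed}"
    )

all_repos = [
    repo_id(bs, lr, s)
    for bs in batch_sizes
    for lr in learning_rates
    for s in seeds
]
print(len(all_repos))  # 2 batch sizes x 2 learning rates x 5 seeds = 20
```

Note that the README's reference labels [1]–[20] do not enumerate the grid in a single fixed order (bs4 lists lr0.00016 before lr0.00015, bs8 the reverse), so the sketch only reproduces the set of ids, not their label numbering.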