stefan-it committed on
Commit
169caed
1 Parent(s): 33a9125

readme: fix link reference for ByT5 embedding implementation

Files changed (1)
  1. README.md +24 -24
README.md CHANGED
@@ -27,7 +27,7 @@ The following NEs were annotated: `pers`, `work`, `loc`, `object`, `date` and `s
 
 # ⚠️ Inference Widget ⚠️
 
-Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][1] class.
+Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][0] class.
 
 This class needs to be present when running the model with Flair.
 
@@ -35,7 +35,7 @@ Thus, the inference widget is not working with hmByT5 at the moment on the Model
 
 This should be fixed in future, when ByT5 fine-tuning is supported in Flair directly.
 
-[1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
+[0]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
 
 # Results
 
@@ -53,28 +53,28 @@ And report micro F1-score on development set:
 | bs8-e10-lr0.00016 | [0.801][11] | [0.8155][12] | [0.8248][13] | [0.8292][14] | [0.8462][15] | 82.33 ± 1.5 |
 | bs8-e10-lr0.00015 | [0.8098][16] | [0.8079][17] | [0.8248][18] | [0.8193][19] | [0.842][20] | 82.08 ± 1.23 |
 
-[1]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
-[2]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
-[3]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
-[4]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
-[5]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-[6]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
-[7]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
-[8]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
-[9]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
-[10]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-[11]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
-[12]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
-[13]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
-[14]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
-[15]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-[16]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
-[17]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
-[18]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
-[19]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
-[20]: https://hf.co/hmbench/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-
-The [training log](training.log) and TensorBoard logs are also uploaded to the model hub.
+[1]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+[2]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+[3]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+[4]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+[5]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+[6]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+[7]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+[8]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+[9]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+[10]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+[11]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+[12]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+[13]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+[14]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+[15]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+[16]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+[17]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+[18]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+[19]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+[20]: https://hf.co/stefan-it/hmbench-ajmc-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+
+The [training log](training.log) and TensorBoard logs (only for hmByT5 and hmTEAMS based models) are also uploaded to the model hub.
 
 More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
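
Why the rename from `[1]` to `[0]` matters: in reference-style Markdown, a link label may only resolve to one definition, and per the CommonMark spec the first matching definition in the document wins. Since the Results section already defines `[1]` as a model link, the earlier `ByT5Embedding` definition of `[1]` would shadow it. A minimal sketch of this resolution rule (the abridged sample document and the helper name are illustrative, not from the repo):

```python
import re

def collect_ref_links(markdown: str) -> dict:
    """Collect reference-style link definitions ([label]: url).

    Per CommonMark, when the same label is defined more than once,
    the FIRST definition wins -- so reusing [1] for both the
    ByT5Embedding link and a results-table link is a collision.
    """
    defs = {}
    for match in re.finditer(r"^\[([^\]]+)\]:\s*(\S+)", markdown, re.MULTILINE):
        label, url = match.group(1), match.group(2)
        defs.setdefault(label, url)  # keep only the first definition
    return defs

# Abridged, hypothetical version of the README before the fix:
doc = """
Fine-Tuning uses a [`ByT5Embedding`][1] class.

[1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py

# Results
[1]: https://hf.co/stefan-it/some-model-1
"""

refs = collect_ref_links(doc)
# The later table definition of [1] is shadowed by the earlier one,
# so every [..][1] link would point at the GitHub file.
print(refs["1"])
```

Renaming the embedding link's label to `[0]` (as this commit does) removes the collision, since no other definition uses that label.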