stefan-it committed on
Commit
1271ff3
1 Parent(s): 213ad6d

readme: fix link reference for ByT5 embedding implementation

Files changed (1): README.md (+24 −24)
README.md CHANGED

@@ -24,7 +24,7 @@ The following NEs were annotated: `loc`, `org` and `pers`.
 
 # ⚠️ Inference Widget ⚠️
 
-Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][1] class.
+Fine-Tuning ByT5 models in Flair is currently done by implementing an own [`ByT5Embedding`][0] class.
 
 This class needs to be present when running the model with Flair.
 
@@ -32,7 +32,7 @@ Thus, the inference widget is not working with hmByT5 at the moment on the Model
 
 This should be fixed in future, when ByT5 fine-tuning is supported in Flair directly.
 
-[1]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
+[0]: https://github.com/stefan-it/hmBench/blob/main/byt5_embeddings.py
 
 # Results
 
@@ -50,28 +50,28 @@ And report micro F1-score on development set:
 | bs4-e10-lr0.00016 | [0.6423][11] | [0.6595][12] | [0.6625][13] | [0.6657][14] | [0.6538][15] | 65.68 ± 0.82 |
 | bs8-e10-lr0.00015 | [0.6502][16] | [0.6541][17] | [0.6607][18] | [0.6496][19] | [0.6629][20] | 65.55 ± 0.54 |
 
-[1]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
-[2]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
-[3]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
-[4]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
-[5]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-[6]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
-[7]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
-[8]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
-[9]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
-[10]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-[11]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
-[12]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
-[13]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
-[14]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
-[15]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
-[16]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
-[17]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
-[18]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
-[19]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
-[20]: https://hf.co/hmbench/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
-
-The [training log](training.log) and TensorBoard logs are also uploaded to the model hub.
+[1]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+[2]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+[3]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+[4]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+[5]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+[6]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+[7]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+[8]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+[9]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+[10]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+[11]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1
+[12]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-2
+[13]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3
+[14]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4
+[15]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5
+[16]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-1
+[17]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2
+[18]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3
+[19]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-4
+[20]: https://hf.co/stefan-it/hmbench-letemps-fr-hmbyt5-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5
+
+The [training log](training.log) and TensorBoard logs (only for hmByT5 and hmTEAMS based models) are also uploaded to the model hub.
 
 More information about fine-tuning can be found [here](https://github.com/stefan-it/hmBench).
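The README's note that the custom `ByT5Embedding` class "needs to be present when running the model with Flair" comes down to Python serialization: Flair checkpoints are saved via `torch.save`, which pickles the model object, and unpickling can only succeed if the defining class is importable at load time. A minimal stdlib sketch of that constraint (the `ByT5Embedding` here is a simplified placeholder, not the real hmBench implementation, and the model name string is made up):

```python
import pickle

# Simplified stand-in for the custom embedding class. Assumption: the real
# ByT5Embedding in hmBench subclasses Flair's TokenEmbeddings; this placeholder
# only illustrates the serialization constraint, not the embedding logic.
class ByT5Embedding:
    def __init__(self, model_name: str):
        self.model_name = model_name

# Pickling stores a reference to the class by module path and name ...
blob = pickle.dumps(ByT5Embedding("some-byt5-checkpoint"))

# ... so loading only works if that class can be imported again in the
# running process. If the class definition were missing, pickle.loads
# would raise an AttributeError instead of restoring the object.
restored = pickle.loads(blob)
print(restored.model_name)
```

This is why copying `byt5_embeddings.py` next to your inference script (so the class is importable) is enough to make `SequenceTagger.load(...)` work, and why the hosted inference widget, which does not know about that extra file, cannot load these models.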